Modern Intel Gallium3D Driver Still Being Toyed With
Phoronix: Modern Intel Gallium3D Driver Still Being Toyed With
While it's not the default Linux graphics driver for Sandy Bridge or Ivy Bridge hardware, the "ilo" independently-developed Gallium3D driver for modern Intel graphics hardware continues to be developed...
How nice to have a driver named "happiness" or "joy". Though that does put up some high expectations :P
Where are the comments about missing features and poor performance, and the insistence that only the classic driver be used?
Oh, never mind, I get it. Bias only works one way.
And how come we have never seen an Intel APU vs AMD APU comparison? Somehow I doubt that will ever happen on this site. Oh sure, there have been some that included benchmarks of both, but not a single one of them included graphics benches, which are arguably the most important part of an APU. Why the hell else would you buy an APU?
Either way, I don't care if you take a negative bias towards everything or a positive bias towards everything. Just pick a damn side and be fair to that side.
Review of the 5800K: no Intel benches, but includes graphics.
Review of the 3870K: Intel benches, but no graphics.
If that isn't completely obvious. I mean, come on. This is ridiculous.
Last edited by duby229; 05-18-2013 at 11:50 AM.
Originally Posted by duby229
There are benchmarks that show that the Intel HD 4000 is as fast as a Radeon 6500-600 APU with 500 GFLOPS and Catalyst. Anyway, we have no reason to test a product (Catalyst) that cheats in any benchmark by reducing precision. I just don't understand how, going from this: http://www.xbitlabs.com/articles/gra...k_5.html#sect3 where Unigine shows the truth (GTX 580 = 70% more FLOPS than the HD 6970; the same goes for the HD 7000 series, 3.8 TFLOPS vs the 6.4 TFLOPS of a GTX 680), they are now supposedly equal.
Those are high-end discrete cards, not APUs. Needless to say, Intel doesn't make high-end discrete cards.
What's the point of this driver? Can someone explain?
I don't get Intel's total refusal to support Gallium3D.
Originally Posted by phoronix
I understand that if there is a fine working classic driver for older hardware, the GPU vendor would not want to spend money on a Gallium port (AMD did that as well). But considering that i915g already exists and reportedly is better than the classic driver, spending resources on the classic driver to achieve partial feature parity with the Gallium driver makes no economic sense from my POV. Shouldn't Intel's priority be to make their hardware customers as satisfied as possible, so they want to buy Intel products again?
Anybody got a clue why Intel insists on not touching Gallium – not even for new drivers?
The work required to bump i915 classic to OpenGL 2.1 was minimal - probably an hour or so - and it makes the driver a lot more useful. At this point, we're not really spending any resources on that generation of hardware at all. In fact, these two patches were the only significant work done on the classic driver since March 2012 (yes, last year). So it's not really a refusal of Gallium in this case; we just think our resources are better spent on newer hardware.
Originally Posted by Awesomeness
For our new hardware, you're right - we are currently refusing to use Gallium. That's a different situation, though.
And what's the reasoning behind this? I googled for a reason but couldn't find any. AMD, Google, Red Hat, VMware, … all back Gallium these days, and only Intel seems to be missing.
Originally Posted by Kayden
Intel doesn't hate Gallium or anything; AFAIK it's just that their current driver is in a good state, very stable and optimized, and they don't view switching to Gallium as being worth it for them.
Originally Posted by Awesomeness