
Thread: The Open-Source Linux Graphics Card Showdown

  1. #1
    Join Date
    Jan 2007
    Posts
    15,138

    Default The Open-Source Linux Graphics Card Showdown

    Phoronix: The Open-Source Linux Graphics Card Showdown

    Earlier this week I provided Intel Core i7 3770K Linux benchmarks for the Ivy Bridge launch day, followed by initial Ivy Bridge Linux HD 4000 graphics benchmarks compared to the Intel HD 2000/3000 Sandy Bridge graphics under Linux and to AMD Fusion on Catalyst and Gallium3D. This article offers more benchmarks of the HD 4000 Ivy Bridge graphics under Linux with Intel's open-source driver, but this time as a much larger comparison: a full showdown of the Core i7 3770K graphics against several discrete NVIDIA GeForce and AMD Radeon graphics cards when they're using their respective open-source Gallium3D drivers. What graphics hardware is best if you want to use an open-source GPU driver? Find out now.

    http://www.phoronix.com/vr.php?view=17299

  2. #2
    Join Date
    Oct 2008
    Posts
    3,176

    Default Overall a pretty poor showing by the radeon drivers

    Hate to say it, but there's no way Ivy Bridge should be putting up comparable numbers.

    I wish there were 20 more developers working on optimizing it.

    I also wish there were a few more tests here with heavier shader usage - WINE, Unigine, etc. Ivy Bridge is limited by the shader power it has, so simple Quake 3-class games don't tell us much.

  3. #3
    Join Date
    Mar 2010
    Posts
    18

    Default

    Nice test, but I think it's near impossible to tell which curve corresponds to which graphics card in those graphs. The colors are way too similar.

    I didn't read all the text, so maybe you mentioned it, but why only low- and mid-range cards? It would've been interesting to see an ATI HD 7870 or even a 79x0 (or something similar from the previous generation), since these are gaming benchmarks.

  4. #4
    Join Date
    Jan 2012
    Location
    Italy
    Posts
    52

    Default

    Quote Originally Posted by johanar View Post
    Nice test, but I think it's near impossible to tell which curve corresponds to which graphics card in those graphs. The colors are way too similar.

    I didn't read all the text, so maybe you mentioned it, but why only low- and mid-range cards? It would've been interesting to see an ATI HD 7870 or even a 79x0 (or something similar from the previous generation), since these are gaming benchmarks.
    radeonsi is not ready, so the 78X0 and 79X0 don't have a free open-source driver to test.
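
    For anyone wanting to double-check which driver a card actually ends up on, here is a minimal sketch (assuming Python 3 and that the glxinfo utility from the mesa-utils package is installed) that prints the active OpenGL vendor and renderer strings; without radeonsi, a Southern Islands card will typically fall back to a software renderer such as llvmpipe:

        # Minimal sketch: report which OpenGL driver/renderer is active.
        # Assumes Python 3 and that 'glxinfo' (mesa-utils) is installed.
        import subprocess

        out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
        for line in out.splitlines():
            line = line.strip()
            if line.startswith(("OpenGL vendor", "OpenGL renderer", "OpenGL version")):
                print(line)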

  5. #5
    Join Date
    Nov 2009
    Posts
    379

    Default

    Thanks for the test; just yesterday I was searching for exactly these numbers: Intel vs. Radeon vs. Nouveau.

    It's nice to see the radeon driver win in some of the tests (even though the power consumption is no good).
    For me this looks like the situation described in this comic:
    http://xkcd.com/644/
    At least I've already got the hardware...

  6. #6

    Default

    Finally some Doom 3 results and line graphs that don't require a ruler and a calculator to read! So why leave out the HD6550D?

    Will we ever see any closed source games being tested? How about Prey? IIRC it does have a demo mode.

    When you finally get an A10-5800K, you'll need to test it against the i7-3770K, the A8-3870K, and the i5-2500K, or else it's not really a fair test of the new generation against the generation it's replacing.

    Also, invest in some good RAM; overclocking tests on these CPU-graphics systems would be very welcome.

  7. #7
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,187

    Default

    Thanks, these are some good tests with the tweaked settings.

    I too will echo the request for a re-test of the Llano with its high power mode.
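
    For what it's worth, the radeon KMS driver exposes static power profiles through sysfs, which is one way to pin the GPU at its high clocks before benchmarking. A minimal sketch follows (assuming Python 3, a single GPU at card0 driven by radeon, root privileges, and that this profile-based interface is present on the kernel in use):

        # Minimal sketch: force the radeon KMS power profile to "high".
        # Assumes card0 is the radeon-driven GPU, the profile-based power
        # interface exists on this kernel, and the script runs as root.
        from pathlib import Path

        dev = Path("/sys/class/drm/card0/device")
        (dev / "power_method").write_text("profile\n")   # select static profiles
        (dev / "power_profile").write_text("high\n")     # run at the high clocks
        print((dev / "power_profile").read_text().strip())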

  8. #8
    Join Date
    Apr 2012
    Posts
    1

    Default

    Some AMD fans don't care about actual test results. If Michael tests with default settings, they cry, "Turn on all the unstable beta features!" If Michael tweaks xorg.conf, AMD fans cry, "Test better AMD cards!" If Michael puts in better cards, fans cry, "There is a buggy git branch that may boost performance. Test with it! It won't be enabled by default for the next 5 years, but who cares?" It seems like all they need is an excuse to see AMD winning.

    Maybe Michael should make up results where radeon wins by 10x? ;D

    Anyway, good job, Intel guys! You've made the best Linux graphics driver! It's open-source, fast, and takes full advantage of the hardware!

    My next system will be based on Ivy Bridge. I don't play heavy 3D games; all I need is a system with decent 2D performance and good single-threaded capabilities that will be useful for the next 10 years.

  9. #9
    Join Date
    Nov 2009
    Posts
    379

    Default

    Quote Originally Posted by rhier View Post
    Some AMD fans don't care about actual test results. If Michael tests with default settings, they cry, "Turn on all the unstable beta features!" If Michael tweaks xorg.conf, AMD fans cry, "Test better AMD cards!" If Michael puts in better cards, fans cry, "There is a buggy git branch that may boost performance. Test with it! It won't be enabled by default for the next 5 years, but who cares?" It seems like all they need is an excuse to see AMD winning.

    Maybe Michael should make up results where radeon wins by 10x? ;D

    Anyway, good job, Intel guys! You've made the best Linux graphics driver! It's open-source, fast, and takes full advantage of the hardware!

    My next system will be based on Ivy Bridge. I don't play heavy 3D games; all I need is a system with decent 2D performance and good single-threaded capabilities that will be useful for the next 10 years.
    But if you already have all the hardware capabilities supported and it is still only a fraction of the binary-driver performance of the other hardware, what is there to gain in the future? In ~4 years' time, when my current AMD card is obsolete, it will be fully supported by the open-source driver and might have the performance of the current binary drivers.

    I don't really game myself, but I can confirm that the radeon driver is good for multiple displays, watching video, and having applications switch between fullscreen and windowed mode while staying on a single monitor instead of spreading across all of them or messing up the whole desktop.

    And I can still play some 3D games with sufficient performance, and there is steady, ongoing improvement.

  10. #10

    Default

    Quote Originally Posted by rhier View Post
    Some AMD fans don't care about actual test results. If Michael tests with default settings, they cry, "Turn on all the unstable beta features!" If Michael tweaks xorg.conf, AMD fans cry, "Test better AMD cards!" If Michael puts in better cards, fans cry, "There is a buggy git branch that may boost performance. Test with it! It won't be enabled by default for the next 5 years, but who cares?" It seems like all they need is an excuse to see AMD winning.
    Actually, since it was a general OSS GPU driver test, including higher-end cards from AMD and NVIDIA makes sense: are they CPU-limited? How much difference does PCIe 2.0 make? And so on.

    On the other hand, the HD6550D is the GPU side of the desktop Llano A8-series APU, so it's in the same class as the other APUs and CPUs with attached GPUs. I specifically picked the fastest GPU-equipped models: the Sandy Bridge i5-2500K/Intel HD Graphics 3000 and Llano A8-3870/Radeon HD6550D of the previous generation, and the new Ivy Bridge i7-3770K/Intel HD Graphics 4000 and Trinity A10-5800K/Radeon HD7660D of the current generation.

    Do you not think that testing the best of the previous generation against the best of the current generation directly would make for much more interesting and relevant results?

    The vast majority of users will only ever use the GPU built into the system and never buy a dedicated GPU card.
    Last edited by Kivada; 04-27-2012 at 05:38 AM.
