
Thread: ATI R300 Mesa, Gallium3D Compared To Catalyst


  1. #1
    Join Date
    Jan 2007
    Posts
    13,418

    Default ATI R300 Mesa, Gallium3D Compared To Catalyst

    Phoronix: ATI R300 Mesa, Gallium3D Compared To Catalyst

    Last quarter we compared Catalyst and Mesa driver performance using an ATI Radeon HD 4830 graphics card, compared the Gallium3D and classic Mesa drivers for ATI Radeon X1000 series hardware, and ultimately found that even with ATI R500 class graphics cards the open-source driver is still playing catch-up to AMD's proprietary Catalyst Linux driver. In this article we have similar tests to show the performance disparity with ATI's much older R300 class hardware. Even with Radeon hardware that has had open-source support for much longer, the open-source drivers are not nearly as mature as an outdated Catalyst driver in the same configuration.

    http://www.phoronix.com/vr.php?view=15116

  2. #2
    Join Date
    Mar 2009
    Posts
    17

    Default

    Besides not doing any better than the traditional Mesa stack, as I mentioned earlier, Gallium3D is still really unstable, causing lots of freezes (almost daily) with my Mobility Radeon X700 (RV410). Since I rolled back to Mesa 7.7, no freezes at all for the last 15 days (uptime). To me, it seems quite obvious which one to use right now...
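
    For anyone unsure which stack they are actually running, glxinfo will tell you; on r300g the renderer string should mention Gallium, while the classic driver reports a plain Mesa DRI renderer (the exact wording is from memory, so treat the grep patterns as approximate):

        # Which driver and Mesa version is the X server handing out?
        glxinfo | grep -E "OpenGL (vendor|renderer|version) string"
        # And a quick sanity check that direct rendering is enabled at all
        glxinfo | grep "direct rendering"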

  3. #3
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    4,729

    Default

    Weird how only OpenArena regressed from playable to unplayable. Is this the ~50% drop we also see in glxgears, from the extra copy DRI2 adds?

  4. #4
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,282

    Default

    My recollection is that one of the attractions of the r300g driver was that it could support a number of apps which did not run on the classic r300 Mesa driver. I understand that's hard to show in benchmarks (unless r300c flatlines), but it's worth mentioning.

  5. #5
    Join Date
    Oct 2009
    Location
    West Jordan, Utah, USA
    Posts
    50

    Default

    It's been said before, but man, we really need to get some non-FPS, and some non-video-game, 3D benchmarks.

    Not that the results from a few different game engines aren't interesting, but the story we get from this data is pretty narrow.

  6. #6
    Join Date
    Mar 2008
    Posts
    569

    Default

    So, to make it short: if you want 3D, you have to want FGLRX. And if you are willing to live with FGLRX, you might as well want NVIDIA.

    Conversely, if you don't need 3D, you want Mesa (and eventually, one day, Gallium). And if you want Mesa, you surely don't want NVIDIA's blob.

    Or did I say something wrong?

  7. #7
    Join Date
    Mar 2008
    Location
    Milan, Italy
    Posts
    100

    Default

    Quote Originally Posted by bulletxt View Post
    So, to make it short: if you want 3D, you have to want FGLRX. And if you are willing to live with FGLRX, you might as well want NVIDIA.

    Conversely, if you don't need 3D, you want Mesa (and eventually, one day, Gallium). And if you want Mesa, you surely don't want NVIDIA's blob.

    Or did I say something wrong?
    I was thinking exactly the same thing.
    It's sad, though, that 3D performance on the open-source side is so abysmal... I could understand it with NVIDIA, a company that dislikes open source... but AMD has been thoroughly involved in open-source drivers for years, yet results are still a long way off (in the 3D field, I mean).

    I know that FGLRX carries tons of IP they cannot expose, and I know it shares a lot of code with the Windows drivers... but come on, a question comes to mind: "is the open-source ATI driver wrong from the ground up?"
    I'm not trying to be polemical, just asking for opinions...

    I know it's code under development, and I know the effort is going into features and stable 2D first. But I'm asking the coders involved: is there room for performance improvement once 2D stability is reached and Gallium3D has matured? I mean, right now the blobs are 3 to 5 times faster! Can we expect the open-source drivers to be, let's say, 60% as fast as fglrx in the next 18 months? Or is that wishful thinking?

  8. #8
    Join Date
    Jan 2009
    Posts
    598

    Default

    Quote Originally Posted by TeoLinuX View Post
    is there room for performance improvement once 2D stability is reached and Gallium3D has matured?
    There is, namely:
    - color tiling (implemented, but it seems to be disabled in this article, WHY??? See the xorg.conf snippet below this list)
    - Hyper-Z
    - compiler optimizations
    - DRI2 Swap
    - reducing CPU overhead everywhere
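
    For what it's worth, color tiling is normally just a DDX option in xorg.conf; something like the snippet below should request it (the option name is taken from the radeon man page as I remember it, so double-check it against your xf86-video-ati version):

        Section "Device"
            Identifier "ATI Radeon"
            Driver     "radeon"
            # Ask the DDX for color tiling; it only takes effect with a new enough DDX and kernel
            Option     "ColorTiling" "on"
        EndSection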

    Michael should really be running xf86-video-ati and libdrm from git plus the latest kernel-rcX, otherwise the performance will continue to suck in his articles.

    There are performance gains and there will be more as time goes by, but the majority of them will be disabled with old kernels and DDX drivers.

    In other words, another crappy article.

  9. #9
    Join Date
    Aug 2009
    Posts
    2,264

    Default

    I don't understand all these complaints. It's not like "Hey, the driver for card X sucks compared to driver Y"... it's more like "The architecture for the drivers to fit into isn't even done yet."

    The beauty of Gallium is that a shitload of stuff that normally had to be done separately for each card is now card- and driver-agnostic.

    I'd say use whatever works for you now, because the foundation the drivers build on isn't finished yet and is the primary concern today. Keep your cool, and massive respect to AMD for their part in laying this new architecture, and of course to everyone else as well.

  10. #10
    Join Date
    Nov 2009
    Posts
    328

    Default

    Does using the latest kernel, say 2.6.35.7, plus xorg-edgers (the Ubuntu PPA) really leave you totally out of sync with the latest xorg development features / performance??
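
    (For reference, by "xorg-edgers" I mean roughly the following, plus a mainline kernel on top of it; the PPA path is written from memory, so double-check it before adding it:)

        # Pull the bleeding-edge X.Org / Mesa / DDX packages from the xorg-edgers PPA
        sudo add-apt-repository ppa:xorg-edgers/ppa
        sudo apt-get update && sudo apt-get dist-upgrade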
