
Thread: Radeon 3D Performance: Gallium3D vs. Classic Mesa

  1. #1
    Join Date
    Jan 2007
    Posts
    14,809

    Default Radeon 3D Performance: Gallium3D vs. Classic Mesa

    Phoronix: Radeon 3D Performance: Gallium3D vs. Classic Mesa

    Gallium3D, the graphics driver architecture started by Tungsten Graphics to overhaul the hardware driver support in Mesa, has been around for a few years but it is finally getting close to appearing on more desktop systems. Now that the Nouveau DRM code is in the mainline Linux kernel and its main 3D driver is Gallium3D-based, we will hopefully be seeing that adopted by more distributions soon -- it's already being flipped on with Fedora 13. On the ATI side the "r300g" Gallium3D driver that provides Gallium3D support for the R300-R500 (up through the Radeon X1000 series) is also being battered into surprisingly good shape. To see where the Radeon Gallium3D support is at for these older ATI graphics cards we have run a set of tests comparing the OpenGL performance under the latest Mesa 7.9-devel code with the Gallium3D driver to running the classic Mesa DRI driver.

    http://www.phoronix.com/vr.php?view=14690

  2. #2
    Join Date
    Sep 2009
    Posts
    33

    Default

Probably better to be at a constant fps than chasing the highest peak, as this would be best for power saving on laptops and for smoothness. What's the point of rendering frames that miss the screen's refresh rate? 60fps should be the cap in most gfx driver settings.

    Anyone think otherwise?

  3. #3
    Join Date
    Jan 2009
    Posts
    191

    Default

    so sad my old laptop still will not benefit from gallium at all with its discrete R500 (ATI Technologies Inc Mobility Radeon X2300) :'(

    Code:
    OpenGL vendor string: DRI R300 Project
    OpenGL renderer string: Mesa DRI R300 (RV515 718A) 20090101  TCL DRI2
    OpenGL version string: 1.5 Mesa 7.9-devel
    it's strange how it is an R500 part and uses the R300 classic code, but not the R300 Gallium code.
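    Which driver is loaded can be read straight off that renderer string: Gallium builds typically report "Gallium" in the glxinfo renderer string, while the classic driver reports "Mesa DRI R300" as above. A rough sketch of the check (the sample string is copied from the post; the labels are mine):

    ```python
    # Rough driver check based on the glxinfo renderer string.
    # Gallium builds typically include "Gallium" in the string; the
    # classic driver reports "Mesa DRI R300" (sample from the post).
    renderer = "Mesa DRI R300 (RV515 718A) 20090101  TCL DRI2"

    driver = "gallium" if "Gallium" in renderer else "classic"
    print(driver)
    ```

    In a live session you would feed it the real string, e.g. the output of `glxinfo | grep "renderer string"`.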

  4. #4
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,458

    Default

    dfx, I don't understand your post. The r300g driver should work with 3xx through 5xx parts.

    The output you posted implies that you either haven't built the Gallium3D driver or at least aren't running it.

  5. #5
    Join Date
    Sep 2008
    Posts
    201

    Default

    Michael, thanks for talking to the developers about the tests and for supplying some analysis as to what may have been causing the observed results. It makes for a much more interesting benchmark!

  6. #6
    Join Date
    Nov 2007
    Posts
    1,024

    Default

    Quote Originally Posted by 0e8h View Post
    Probably better to be at a constant fps than chasing the highest peak, as this would be best for power saving on laptops and for smoothness. What's the point of rendering frames that miss the screen's refresh rate? 60fps should be the cap in most gfx driver settings.

    Anyone think otherwise?
    Well, there was that one guy I got into an argument with some weeks back on this very topic... special people are a fact of life, I guess.

    When it comes to benchmarking, you usually have the vsync off to see how fast the engine/driver/level/whatever can actually run. When you're actually playing a game, though, you almost certainly want it on for a variety of reasons including the ones you listed.

    The problem from a benchmark perspective with having vsync on is that you won't be able to see the threshold by which you are passing the monitor refresh rate (be it 60hz, 75hz, or whatever). There's an important distinction between just barely managing 60hz and being able to manage 100hz. If the game is just barely hitting 60hz then a more complex scene or some other load on the system will cause it to drop below 60hz, which is problematic for a number of reasons. If the game is capable of running at 100hz, however, then we know that some unexpected spikes in scene complexity are far less likely to drop the frame rate below the desired level.

    Remember that with vsync, barely missing 60hz does not mean that you run at 59hz, it means that you run at 30hz. Each frame takes just long enough to miss the first vsync and ends up waiting on the second, halving the rate of frame updates. While it's possible a game could turn off vsync dynamically when the frame rate drops low enough, the result may end up even more jarring in that case, since the game would just periodically have tearing and an uneven frame rate.
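    The halving falls out of the refresh interval: with vsync on, a finished frame is displayed only at the next refresh boundary, so the effective rate is always a whole divisor of the refresh rate (60, 30, 20, ...). A minimal sketch of that arithmetic (the function name and the 60hz assumption are mine):

    ```python
    import math

    REFRESH_HZ = 60
    VSYNC_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh

    def effective_fps(render_time):
        """Frame rate with vsync: each frame waits for the next refresh
        boundary after rendering completes, so the rate is 60/1, 60/2, ..."""
        intervals = max(1, math.ceil(render_time / VSYNC_INTERVAL))
        return REFRESH_HZ / intervals

    print(effective_fps(0.015))  # renders in 15 ms -> 60.0
    print(effective_fps(0.017))  # just misses the 16.7 ms window -> 30.0
    ```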

    There are other problems with missing the 60hz mark too, separate from the monitor refresh rate itself. Many game physics simulations rely on fixed intervals for simulation updates (especially any games where complex environmental physics actually matter to gameplay, which is more and more of them these days). Many games use a 1/60th second interval to match the lowest common denominator of monitor refresh rates, so dropping below 60hz would require two simulation updates per frame, which just makes things even slower.
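    The fixed-interval simulation described above is usually implemented with an accumulator: wall-clock frame time is consumed in fixed 1/60 s steps, so a frame that takes longer than one step forces extra simulation updates. A sketch under those assumptions (all names here are mine, not from any particular engine):

    ```python
    SIM_DT = 1.0 / 60.0  # fixed simulation step, matching a 60hz display

    def simulation_steps(accumulator, frame_time):
        """Consume frame_time in fixed SIM_DT steps; anything smaller
        than SIM_DT is left over and carries into the next frame."""
        accumulator += frame_time
        steps = 0
        while accumulator >= SIM_DT:
            accumulator -= SIM_DT
            steps += 1
        return steps, accumulator

    print(simulation_steps(0.0, 0.015)[0])  # fast 15 ms frame -> 0 steps yet
    print(simulation_steps(0.0, 0.045)[0])  # slow 45 ms frame -> 2 steps
    ```

    This is why a frame slower than 1/60 s hurts twice: the renderer stalls on vsync, and the simulation has to run multiple updates to catch up.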

    I'm not sure if the seemingly fixed frame rate on the Gallium driver is intentional or not (the numbers do look like vsync behavior on a 60hz display), but if it is, there should be a way to disable it for benchmarking purposes.

  7. #7
    Join Date
    Dec 2009
    Posts
    338

    Thumbs up Hands down!

    Quote Originally Posted by krazy View Post
    Michael, thanks for talking to the developers about the tests and for supplying some analysis as to what may have been causing the observed results. It makes for a much more interesting benchmark!
    +1
    Excellent article! Hats off to Michael, this time you really made interesting tests and interpreted them perfectly. Congratulations.

    On top of that, the results are pretty good. With this I think it's possible for r300g to become the default driver in Mesa 7.9.
    I hope it will be "finished" in the next month or so, so development effort can go into r600g!

  8. #8
    Join Date
    Oct 2008
    Posts
    3,133

    Default

    Quote Originally Posted by krazy View Post
    Michael, thanks for talking to the developers about the tests and for supplying some analysis as to what may have been causing the observed results. It makes for a much more interesting benchmark!
    Agreed.

    It's very interesting that those flat lines occur right at the 30fps and 60fps thresholds. It screams out to me that there's some kind of refresh-rate related issue going on, but I suppose it could just be random.

    Have you checked whether the OpenGL 2.1 support it advertises is working well? Do these tests exercise it, or are they strictly using 1.5?

  9. #9
    Join Date
    Feb 2010
    Location
    Canton, China
    Posts
    25

    Default

    I think you should edit your xorg.conf. If you want to try the r300g driver, add Driver "r300g" to Section "Device" and remove Driver "radeon" or Driver "radeonhd" from that section.

  10. #10
    Join Date
    Feb 2010
    Location
    Canton, China
    Posts
    25

    Default

    Quote Originally Posted by bridgman View Post
    dfx, I don't understand your post. The 300g driver should work with 3xx through 5xx parts.

    The output you posted implies that you either haven't built the Gallium3D driver or at least aren't running it.
    I think you should edit your xorg.conf. If you want to try the r300g driver, add Driver "r300g" to Section "Device" and remove Driver "radeon" or Driver "radeonhd" from that section.
