
Thread: Radeon Gallium3D MSAA Performance (R300g)

  1. #1
    Join Date
    Jan 2007
    Posts
    14,912

    Default Radeon Gallium3D MSAA Performance (R300g)

    Phoronix: Radeon Gallium3D MSAA Performance (R300g)

    This week support for MSAA was finally added to the R300g driver, so the Radeon X1000 series and earlier graphics cards can finally take advantage of anti-aliasing with this open-source Gallium3D driver. In this article are some benchmarks of the MSAA performance with a Radeon X1800XT, but even with this GPU, one of the higher-end parts covered by R300g, the anti-aliasing performance isn't really usable.

    http://www.phoronix.com/vr.php?view=18360

  2. #2
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,463

    Default

    Michael, something seems funny with the numbers, which are uncharacteristically low even with MSAA off. At first I thought it might have been the difference between the X1800 and X1900 (I see more X1900s, which have ~3x the pixel shader power), but when I looked at older articles even the X1800 seemed to be much faster:

    Previous article from Feb 2011: http://www.phoronix.com/scan.php?pag..._compare&num=1

    All numbers are MSAA off:

    OpenArena v0.8.5 1920x1080 - this article 50.52, previous 118.00

    World of Padman v1.2 1920x1080 - this article 17.83, previous 118.73

    Urban Terror v4.1 1920x1080 - this article 21.20, previous 107.17

    VDrift 2010-06-30 1920x1080 - this article 6.76, previous 27.67

    There are all kinds of possible explanations -- regression in one of the components, new GL extensions causing apps to draw fancier stuff more slowly, compositor difference, some other hardware/software difference I missed -- but the performance drop seems relatively consistent across all the apps which seems odd.

    Thanks,
    John
    Last edited by bridgman; 01-10-2013 at 02:23 PM.

  3. #3
    Join Date
    Jan 2009
    Posts
    624

    Default

    I hope Mesa wasn't compiled with --enable-debug, and that the Ubuntu PPA with fresh Mesa wasn't used either, because that PPA is using the --enable-debug configure flag (and maybe some other PPAs as well, maybe even Ubuntu itself!!). The behavior of --enable-debug has recently been changed in Mesa: the flag now disables all gcc optimizations, which makes pretty much everything bloody slow.

  4. #4
    Join Date
    Oct 2011
    Location
    Rural Alberta, Canada
    Posts
    1,030

    Default

    As a question, why such a huge screen resolution for a card that is now approaching seven years old? I still doubt the assertion that most people today have screens that large attached to their standard PCs, and back then it was even more doubtful. I know you want to test the hardware to its fullest possible extent, but seeing that did kind of bother me.

  5. #5
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,463

    Default

    I wondered about that as well, but I guess (a) a lot of people probably buy new displays just to get the screen space, and the high res comes for free, and (b) the cards actually ran pretty fast at 1920x1080 last year, with many apps over 100 FPS. Agreed that a lower res might be needed with MSAA cranked up.

  6. #6
    Join Date
    Oct 2008
    Posts
    3,152

    Default

    Quote Originally Posted by Hamish Wilson View Post
    As a question, why such a huge screen resolution for a card that is now approaching seven years old? I still doubt the assertion that most people today have screens that large attached to their standard PCs, and back then it was even more doubtful. I know you want to test the hardware to its fullest possible extent, but seeing that did kind of bother me.
    Especially since the MSAA tests use tons of memory bandwidth compared to non-MSAA tests. The high resolution just amplifies that effect, meaning MSAA might be a lot less trouble at 1280x720, a resolution that's probably more common, at least on those older GPUs.
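    To put a rough number on the bandwidth point above, here's some back-of-the-envelope arithmetic. This is illustrative only: it assumes 32-bit color samples and ignores depth buffers and any sample compression the hardware applies, so treat the figures as a sketch rather than measured traffic.

    ```python
    # Back-of-the-envelope arithmetic (illustrative only): bytes of color-sample
    # storage touched per frame with MSAA, assuming 32-bit color and ignoring
    # depth buffers and any bandwidth compression the hardware applies.
    def color_bytes(width, height, samples, bytes_per_sample=4):
        return width * height * samples * bytes_per_sample

    hi = color_bytes(1920, 1080, 4)  # 4x MSAA at 1920x1080
    lo = color_bytes(1280, 720, 4)   # 4x MSAA at 1280x720

    print(hi / 2**20)  # 31.640625 MiB of color samples per frame
    print(lo / 2**20)  # 14.0625 MiB
    print(hi / lo)     # 2.25 -- 1080p touches 2.25x the sample data of 720p
    ```

    So even before depth traffic, 1080p with 4x MSAA moves more than twice the color-sample data of 720p per frame, which lines up with the "amplifies that effect" point.
    
    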

  7. #7
    Join Date
    Jun 2011
    Posts
    316

    Default

    Quote Originally Posted by smitty3268 View Post
    Especially since the MSAA tests use tons of memory bandwidth compared to non-MSAA tests. The high resolution just amplifies that effect, meaning MSAA might be a lot less trouble at 1280x720, a resolution that's probably more common, at least on those older GPUs.
    I agree. I don't like these benchmarks much as they don't seem very logical.

    Unless you buy more modern and high end cards, you either get high resolution *OR* you get a low resolution + MSAA. That's pretty much expected.

    Dell and HP are still selling a lot of 15" laptops with 1366x768 screens and AMD graphics chips in the sub-$700USD market. I don't know why people buy those low-res laptops, but they sell well... So AA is still as important today as it was years ago, to compensate for low resolutions.

    What I would have preferred to see is a performance comparison of low resolutions with MSAA vs. high resolutions without MSAA! That would have made sense, and if there were some screenshots comparing low resolution + MSAA against high resolution in both image quality and performance, it would be even better. But alas, maybe we ask too much.

    Clearly, if running at a higher resolution both looked better and performed better, we could see that MSAA still has a long way to go. The benchmarks in the article, though, really don't say much, as most gamers already know that turning on AA at a high resolution is a good way to make a nice slide-show on old or low-end cards.
    Last edited by Sidicas; 01-10-2013 at 10:32 PM.

  8. #8

    Default

    Quote Originally Posted by marek View Post
    I hope Mesa wasn't compiled with --enable-debug, and that the Ubuntu PPA with fresh Mesa wasn't used either, because that PPA is using the --enable-debug configure flag (and maybe some other PPAs as well, maybe even Ubuntu itself!!). The behavior of --enable-debug has recently been changed in Mesa: the flag now disables all gcc optimizations, which makes pretty much everything bloody slow.
    Note that in my PPA I added a patch to properly build with optimizations (disabling the recently changed behavior, which is still being discussed on mesa-dev). I wasn't able to notice any measurable performance difference when enabling --enable-debug (tested some months ago), so I prefer to leave it enabled, since it can be useful for debugging Mesa as well as game problems. Also, I'd rather have a game assert and crash than possibly lock up the system later on.

    The tests I did with Mesa from my PPA are here, and they are completely different from what was obtained in this article:
    • no MSAA: 74.6 fps
    • MSAA 2x: 61.1 fps
    • MSAA 4x: 41.6 fps
    • MSAA 6x: 29.7 fps
    Last edited by oibaf; 01-11-2013 at 06:53 AM.

  9. #9

    Default

    I suspect the slowness of the Phoronix tests here is due to this.

  10. #10
    Join Date
    Jan 2009
    Posts
    624

    Default

    To me, there is a big difference in performance with --enable-debug, so big that I can't even use the flag for development. I supply my own gcc flags through the CFLAGS environment variable if I want some level of debugging with all gcc optimizations enabled.
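    A minimal sketch of the workflow described above, for anyone following along: pass explicit gcc flags through CFLAGS/CXXFLAGS rather than relying on --enable-debug, so the build keeps optimizations while still producing debug symbols. The exact flag choices here are illustrative assumptions for an autoconf-era Mesa tree, not the poster's actual configuration.

    ```shell
    # Illustrative sketch only: supply your own gcc flags via CFLAGS/CXXFLAGS
    # instead of --enable-debug, keeping -O2 optimizations while still getting
    # debug symbols (-g). Flag values and the autoconf build are assumptions.
    CFLAGS="-g -O2" CXXFLAGS="-g -O2" ./configure
    make -j"$(nproc)"
    ```
    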
