
Thread: Radeon R600 Gallium3D MSAA Performance Update

  1. #11
    Join Date
    Jan 2010
    Posts
    368

    Default

    Quote Originally Posted by FourDMusic View Post
    What is the technical reason behind 2x/4x and 6x/8x being almost identical in performance for a few scenarios?
    MSAA adds an extra step to the rendering pipeline: the resolve, which combines the subsamples into a single value for each pixel. That step costs roughly the same regardless of sample count, so it shows up as a constant performance hit. Beyond that, GPUs use various schemes to compress the depth, stencil and color buffers, so increasing the sample count does not increase memory bandwidth much in typical cases (after all, only pixels on geometry edges can have differing subsamples, and GPUs take advantage of that). Therefore, performance is often quite similar across the different MSAA levels.
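    To make the resolve step concrete, here is a minimal sketch in plain C of the box filter most hardware uses: N subsamples are averaged down to one value per pixel. Real GPUs do this in fixed-function hardware (or via glBlitFramebuffer in GL); all names here are illustrative, and the edge/interior arrays just model the compression point above — interior pixels store identical subsamples, only edge pixels differ.

    ```c
    /* Sketch of an MSAA box-filter resolve: average N subsamples into
     * one value per pixel. Illustrative only, not driver code. */
    #include <assert.h>
    #include <stddef.h>

    /* Average `samples` subsample values into one resolved value. */
    static float resolve_pixel(const float *subsamples, size_t samples)
    {
        float sum = 0.0f;
        for (size_t i = 0; i < samples; i++)
            sum += subsamples[i];
        return sum / (float)samples;
    }

    int main(void)
    {
        /* Interior pixel: every subsample hit the same triangle, so all
         * four values are identical and compress trivially. */
        float interior[4] = { 0.8f, 0.8f, 0.8f, 0.8f };
        /* Edge pixel: two subsamples hit the triangle (0.8), two hit the
         * background (0.0) -- only these pixels carry extra data. */
        float edge[4] = { 0.8f, 0.8f, 0.0f, 0.0f };

        assert(resolve_pixel(interior, 4) == 0.8f);
        assert(resolve_pixel(edge, 4) == 0.4f);
        return 0;
    }
    ```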

  2. #12
    Join Date
    Sep 2013
    Posts
    3

    Question Disable VSync in 13.04?

    Hey guys,

    How have you been disabling vsync in Ubuntu 13.04 with the R600g driver? I have a R770 (4870), and I have set up a custom xorg.conf and disabled sync to vblank on compiz, and I've disabled it in-game, but the FPS will not exceed 60 FPS.

    Any ideas?

  3. #13
    Join Date
    Nov 2010
    Posts
    42

    Default

    Quote Originally Posted by linuxguy View Post
    Hey guys,

    How have you been disabling vsync in Ubuntu 13.04 with the R600g driver? I have a R770 (4870), and I have set up a custom xorg.conf and disabled sync to vblank on compiz, and I've disabled it in-game, but the FPS will not exceed 60 FPS.

    Any ideas?
    If you already disabled SwapbuffersWait in xorg.conf, you may also have to disable vblank with driconf, or manually in the ~/.drirc file, with the line:

    Code:
    <application name="Default">
        ...
        <option name="vblank_mode" value="0" />
        ...
    </application>
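    For reference, a complete ~/.drirc usually wraps that element in the driconf/device structure the parser expects; a sketch (the screen attribute and the "r600" driver name are assumptions, adjust for your setup):

    Code:
    <driconf>
        <device screen="0" driver="r600">
            <application name="Default">
                <option name="vblank_mode" value="0" />
            </application>
        </device>
    </driconf>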

  4. #14
    Join Date
    Aug 2012
    Location
    Pennsylvania, United States
    Posts
    1,934

    Default

    Quote Originally Posted by Azpegath View Post
    I've been playing Left 4 Dead 2 on my Radeon 4850 (core i7 2.8GHz, 12 Gb RAM) the last week and the performance is waaaaay below the one on Windows. But it's actually playable if I lower the resolution to 1680x1050 and set all the detail settings to Low.
    Could we possibly get some benchmarks on L4D2? Either on different AMD cards, or a comparison with Windows, or Catalyst/FOSS.
    You're running a fast-paced game at over 1080p on an older (4850) card? What's your monitor's native resolution, if you have to "lower" it to 1680x1050?

  5. #15
    Join Date
    Jun 2009
    Posts
    1,191

    Default

    Quote Originally Posted by Ericg View Post
    You're running a fast-paced game at over 1080p on an older (4850) card? What's your monitor's native resolution, if you have to "lower" it to 1680x1050?
    Well, maybe he is using the default Ubuntu 13.04 stack or something like that, and of course that will be slow.

  6. #16
    Join Date
    Aug 2012
    Location
    Pennsylvania, United States
    Posts
    1,934

    Default

    Quote Originally Posted by jrch2k8 View Post
    Well, maybe he is using the default Ubuntu 13.04 stack or something like that, and of course that will be slow.
    Yes, using Ubuntu rather than Fedora/Arch/Gentoo will negatively impact performance, but he also said that performance for him was "way lower" on Linux than Windows. Honestly, if his monitor is higher than 1080p and he's pushing it with a 4850 at settings higher than Low (which he implied he was), I would expect the performance to be bad whether on Windows or Linux.

  7. #17
    Join Date
    Jun 2009
    Posts
    1,191

    Default

    Quote Originally Posted by Ericg View Post
    Yes, using Ubuntu rather than Fedora/Arch/Gentoo will negatively impact performance, but he also said that performance for him was "way lower" on Linux than Windows. Honestly, if his monitor is higher than 1080p and he's pushing it with a 4850 at settings higher than Low (which he implied he was), I would expect the performance to be bad whether on Windows or Linux.
    Well, I meant that the Ubuntu 13.04 default stack [Mesa 9.1 and Linux 3.8] is very slow for this kind of game. Of course a 4850 will be less than recommended at 1080p too, but a recent stack will help a lot to raise those FPS, especially since DPM and the SB shader backend are default in Mesa git / Linux 3.11.

  8. #18
    Join Date
    Jul 2010
    Posts
    449

    Default

    Quote Originally Posted by Ericg View Post
    Yes, using Ubuntu rather than Fedora/Arch/Gentoo will negatively impact performance, but he also said that performance for him was "way lower" on Linux than Windows. Honestly, if his monitor is higher than 1080p and he's pushing it with a 4850 at settings higher than Low (which he implied he was), I would expect the performance to be bad whether on Windows or Linux.
    Left 4 Dead 2 is fast-paced, but it's also quite old now and not very graphically intensive by current standards. On Windows (7 and 8) I was able to run it at 1080p with all the graphical settings cranked up to max (though possibly not antialiasing; it's been a few months and I can't remember) and got very good framerates.

  9. #19
    Join Date
    Dec 2010
    Location
    MA, USA
    Posts
    1,485

    Default

    Quote Originally Posted by archibald View Post
    Left 4 Dead 2 is fast-paced, but it's also quite old now and not very graphically intensive by current standards. On Windows (7 and 8) I was able to run it at 1080p with all the graphical settings cranked up to max (though possibly not antialiasing; it's been a few months and I can't remember) and got very good framerates.
    If you're using the Catalyst drivers, try turning off Catalyst A.I., or turn off the tear-free setting. I found those have a pretty huge impact on some games. Portal was almost unplayable on my HD5750 until I turned off Catalyst A.I.

  10. #20
    Join Date
    Sep 2013
    Posts
    3

    Thumbs up

    Quote Originally Posted by Bitiquinho View Post
    If you already disabled SwapbuffersWait in xorg.conf, you may also have to disable vblank with driconf, or manually in the ~/.drirc file, with the line:

    Code:
    <application name="Default">
        ...
        <option name="vblank_mode" value="0" />
        ...
    </application>
    I did in fact create a file in /usr/share/X11/xorg.conf.d/ named 20-radeon.conf, set SwapbuffersWait to false, and then added vblank_mode 0 in /etc/drirc, and now I'm getting full FPS in the game engines! Thanks a ton!
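    For anyone finding this later, a 20-radeon.conf like the one described above presumably looks something along these lines (a sketch; the Identifier string is arbitrary, and SwapbuffersWait is an option of the radeon DDX driver):

    Code:
    Section "Device"
        Identifier "Radeon"
        Driver "radeon"
        Option "SwapbuffersWait" "off"
    EndSection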
