NVIDIA FXAA Anti-Aliasing Performance
Phoronix: NVIDIA FXAA Anti-Aliasing Performance
A few months back the NVIDIA Linux driver introduced support for FXAA, a new anti-aliasing method. In this article are benchmarks of FXAA plus more information on this post-process shader-based anti-aliasing method.
Comparison against MSAA would be nice.
FXAA is not higher quality than the MSAA/CSAA variants. It does a good job of evening out sharp edges, but at the cost of adding blur to the picture. Fine detail, such as texture detail, is also lost to some degree.
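For anyone wondering why FXAA softens texture detail as well as edges: it is a post-process filter that only sees the final image, so it cannot tell a real geometry edge from high-contrast texture detail. Here is a rough sketch of the core idea (luminance-contrast edge detection followed by a blend) — this is a simplified illustration, not NVIDIA's actual shader, and the threshold value is an assumption:

```python
def fxaa_like(image):
    """Blend pixels whose local luma contrast exceeds a threshold.

    `image` is a 2D list of grayscale "luma" values in [0, 1].
    Real FXAA works on luma too, but with directional edge searches
    and tuned thresholds; this is just the basic contrast-blend idea.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    threshold = 0.25  # assumed contrast threshold
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = image[y][x]
            n, s = image[y - 1][x], image[y + 1][x]
            west, e = image[y][x - 1], image[y][x + 1]
            contrast = max(c, n, s, west, e) - min(c, n, s, west, e)
            if contrast > threshold:
                # Looks like an edge: average with the neighbours.
                # This smooths jaggies, but blurs texture detail too.
                out[y][x] = (c + n + s + west + e) / 5.0
    return out

# A hard vertical edge (0.0 vs 1.0) gets softened on both sides:
img = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
smoothed = fxaa_like(img)
```

Because the filter fires on any high local contrast, detailed textures trip it just like polygon edges do — hence the blur people complain about.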
Michael: It would be great if you could do a quality comparison of the different AA modes (screenshots in PNG, of course), though high-quality graphics are still somewhat limited in Linux games. A proper AA setting becomes increasingly important when large amounts of transparent textures are used (typically for grass, foliage, fences, etc.).
It annoys me how much effort developers put into methods of hiding a lack of detail instead of developing new, effective rendering algorithms. The raw power of current high-end GPUs is immense if you can utilize it. My GTX 580 is capable of displaying 8M textured polygons at 70 fps in 1680x1050 with 32xCSAA and 16xAF applied.
Sadly, game developers target consoles first and then fix things up to run on windoze, so until the next generation of consoles is out, forget about smarter ways of rendering or better quality.
Originally Posted by efikkan
Anyone else getting their framerate halved with Mesa's MLAA?
Wow, that's useless. On Windows, FXAA costs me 2 or 3 FPS, not 20. This looks seriously botched on Linux.
This is how it should look:
Notice how FXAA has an almost non-existent impact.
Last edited by RealNC; 09-18-2012 at 10:27 AM.