Reference software renderer!
Originally Posted by DanL
The question is not what "looks better"; the question is: what is the reference?
Originally Posted by DanL
Hmm, to me it seems that fglrx doesn't render reflected bloom. It makes things sharper, but that's probably not what was supposed to happen...
As for using a software renderer as a reference... if my experience has taught me anything, it's that software renderers take far more shortcuts than accelerated ones, to compensate for being much slower. Unreal's software renderer couldn't do basic things like transparency and pixel smoothing...
I think they are talking about using the same OpenGL renderer, but running through a software OpenGL implementation in the driver (swrast or softpipe).
I once tried to run MS Train Simulator in Wine while my graphics driver wasn't working. The screen was full of artifacts; the grass was black, for example. With the Nvidia binary driver the same game worked fine. swrast is clearly not a good reference. Perhaps llvmpipe is better?
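For what it's worth, Mesa lets you pick the software rasterizer per process. Here is a minimal launcher sketch (my own, assuming Mesa is installed and `glxinfo` is available from mesa-utils; `LIBGL_ALWAYS_SOFTWARE` and `GALLIUM_DRIVER` are standard Mesa environment variables):

```python
# Sketch: force Mesa's software rasterizer for one process, e.g. to produce a
# "reference" rendering. Swap "llvmpipe" for "softpipe" to compare the two.
import os
import subprocess

env = os.environ.copy()
env["LIBGL_ALWAYS_SOFTWARE"] = "1"   # never load the hardware driver
env["GALLIUM_DRIVER"] = "llvmpipe"   # llvmpipe is far faster than softpipe

# Confirm which renderer Mesa picks (skipped quietly if glxinfo isn't installed):
try:
    out = subprocess.run(["glxinfo"], env=env, capture_output=True,
                         text=True).stdout
    print([line for line in out.splitlines() if "OpenGL renderer" in line])
except FileNotFoundError:
    pass
```

The same environment variables work when launching a game directly from a shell, so a Nexuiz run under llvmpipe can be compared screenshot-for-screenshot against the hardware drivers.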
Indeed, the difference being that Nvidia was the one with awful driver "optimizations", like downgrading shader precision to 16 bits to improve its atrocious 3DMark and Half-Life 2 performance. ATI was using 24 bits throughout, with better image quality and better performance to boot.
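To see why that precision drop is visible, here's a toy sketch (my own illustration, not anyone's actual shader pipeline): fp16 keeps a 10-bit mantissa, while the R300-era fp24 format kept 16 bits, so we can round a value's mantissa to each width and compare the error.

```python
# Toy sketch: quantize a float's mantissa to a given bit width, mimicking
# reduced shader precision. Not a real GPU format conversion, just the
# mantissa-rounding part of one.
import math

def quantize(x: float, mantissa_bits: int) -> float:
    """Round x to a float carrying only `mantissa_bits` bits of mantissa."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)                  # x == m * 2**e, with 0.5 <= |m| < 1
    scale = 2 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

texcoord = 0.123456789                    # some intermediate shader value
err_fp16 = abs(texcoord - quantize(texcoord, 10))
err_fp24 = abs(texcoord - quantize(texcoord, 16))
print(err_fp16 > err_fp24)                # True: the 16-bit error is far larger
```

With six extra mantissa bits, the fp24-style rounding error is a couple of orders of magnitude smaller, which is exactly the kind of difference that shows up as banding and shimmering in per-pixel lighting.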
Originally Posted by LinuxID10T
It was only with the GeForce 8 series that Nvidia caught up with (and surpassed) ATI in rendering quality, mainly because DX10 had very strict image-quality guidelines for conformance. Even so, ATI still has the edge in analogue (VGA) signal quality. Imagine my surprise when my half-a-decade-old X1650 Pro produced a cleaner picture than my brand-new 9600GT (and the 8600 and 7800 before that). I thought it was my monitor, but I put the old card back in and presto: a crystal-clear image.
Hey, I'm really sick of these driver "optimizations" cutting down quality.
Originally Posted by BlackStar
I want 100% reference software-rendering quality, with RS-SSAA and maximum-quality AF.
I don't want shit on my screen.
That's why I love the open-source drivers: they don't put shit on my screen.
I think what's *actually* going on here is that both r300g and r600g are rendering the bloom in the wrong place in the "Enemy" scene. It's been shunted down quite a lot, and possibly left a bit, compared to where it's supposed to be. That explains the odd pink glow on the floor, the dim wall pattern overlaid on the enemy, and the white glow to the side of the weapon muzzle: none of these are present in the Catalyst rendering, but the areas which *should* be bright are noticeably brighter.
Originally Posted by GreatEmerald
IMHO, this isn't a difference in image quality, it's just a bug in the open-source drivers. Or, possibly but less likely, a bug in Nexuiz which does not manifest when using Catalyst due to differences in driver code.
ATI and Nvidia both have a long history of cheating via closed-source driver tricks.
Originally Posted by mangobrain
Back when Quake 3 came out, ATI cheated its way to a 400% speed-up by turning the quality down to ZERO.
Nothing new here, nothing special. But I don't believe the open-source drivers "cheat",
and I don't believe in magic bugs in open-source software. Open-source software tends to be bug-free compared to the hopelessly bug-ridden closed-source software.
There are scientific studies about bugs in open vs. closed software, and they all say the same thing: open software has fewer bugs.
There is not one single study concluding that closed software is more bug-free than open software.
What, you think I'm trolling, or somehow implying that closed-source software is better? Oh, teh lulz
Originally Posted by Qaridarium
I'm just pointing out that, in this specific scene, with these specific drivers, there appears to be a rendering issue. I'm not saying the closed-source drivers are perfect, or ever have been; all graphics card vendors have had major driver issues, and will continue to do so, purely due to the complexity of the drivers themselves, the complexity of the hardware, and the complexity of applications running on top of the drivers. However, the open-source drivers aren't perfect yet. I hope that one day they will be, but at the moment, they're not.
Also, hinting that Nexuiz might possibly have a bug doesn't mean I'm saying it's crap. I'm a programmer by profession; I deal with a lot of bugs, in code both I and others have written, day after day.
To prove that I'm not in any way biased against the drivers, here's a snippet of glxinfo output from my box, which I ran just now, especially for you:
OpenGL renderer string: Gallium 0.4 on AMD CAYMAN
OpenGL version string: 2.1 Mesa 8.0
See that? Yup, I'm using r600g as I type this.
PNG instead of JPEG!
I get upset every time I see this.
IMO, anyone who writes an article about image quality but presents the data (i.e. the images) using lossy compression (which has nothing to do with the comparison) just makes themselves look silly, because it distorts the data.
I have unfortunately seen this on other tech sites as well.
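The point can be made concrete with a toy sketch (my own, nothing to do with any particular article's pipeline): model the lossy save as coarse quantization and watch it erase a real per-pixel difference between two renderers.

```python
# Toy sketch: a JPEG-like lossy step modelled as coarse quantization. Two
# "renderer" outputs that genuinely differ by one gray level everywhere become
# indistinguishable after compression -- destroying the very signal an
# image-quality comparison needs.

def lossy(pixels, step=16):
    """Snap each 0-255 value to the nearest multiple of `step` (toy JPEG stand-in)."""
    return [min(255, round(p / step) * step) for p in pixels]

render_a = [100, 101, 102, 103]       # one grayscale scanline from driver A
render_b = [101, 102, 103, 104]       # driver B: off by exactly 1 everywhere

true_diff = sum(abs(a - b) for a, b in zip(render_a, render_b))
saved_diff = sum(abs(a - b) for a, b in zip(lossy(render_a), lossy(render_b)))

print(true_diff)    # 4 -> a real, measurable difference
print(saved_diff)   # 0 -> the lossy step erased it entirely
```

Real JPEG artifacts can also cut the other way, inventing differences (ringing, blocking) that neither driver actually produced, which is why lossless PNG is the only sane format for this kind of comparison.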