First of all, I want to say I applaud you for being one of the few groups who care to benchmark with OSS. Concerning the methodology of benchmarking video drivers on Linux:
Your driver tests measure performance almost exclusively, and the results are consistent and stable. I'd like to suggest that benchmarking video drivers for speed doesn't necessarily show which drivers are better, even if they perform "faster". Driver developers can easily sacrifice the overall quality of a video driver (at a more basic level than the quality settings you see in "nvidia-settings" and the like) for more video performance, or, in layman's terms, more FPS. Won't benchmarking solely for performance encourage developers to do this? In many cases they have already started doing so (on Windows, for years). I understand that it is tedious, and sometimes impossible, to track the quality of closed-source drivers. Still, we should evaluate drivers in a way that encourages adherence to quality and to the OpenGL specification.
I remember, in my techie youth, going on a crusade to find out why I loved quake3 on Mac OS (9 and X) and Linux so much more than I did on Windows. It just looked so much better, no matter what settings I tried (not just image quality, but geometric depiction and other visualization characteristics). I eventually found out that this was due to variance in driver quality and in each vendor's OpenGL implementation. I hope the quality of Linux drivers doesn't diminish as the Windows ones did. Quality and flexibility are FAR more important than squeezing three more frames out of the most popular games. It is easy for a company to rationalize that trade-off for sales, but we don't have to; we are an open source community.
I hope to influence the future of Phoronix video driver testing with this information. If you, the benchmarkers, would like help or further information to be presented, please ask. Thanks.
We have written articles in the past comparing image quality between drivers and graphics cards. However, you are correct that the majority of our benchmarks compare frame-rates. This is done so that the results are quantitative: simply looking at the screen and reporting how we feel the image quality looks is subjective, and end-users can't compare their own results against our impression of an image. That said, we can try to make image quality comparisons more of a priority.
We value the comments and thoughts of our readers; if you have any other suggestions or further thoughts on this matter, feel free to share them.