Quote Originally Posted by artivision View Post
I'm really glad that someone understands. Games and benchmarks speak to the driver, not directly to the hardware. If the driver wants to cheat, it will cheat; there is no technology available to measure the quality of the picture. In fact, when you have 2x GPUs you only get +50% FPS; that is because the driver goes into a quality-and-precision mode. Same with double the shaders.
The title of the article mentioned testing the drivers. Not the cards. So what's the point?

Furthermore, FPS is only a single metric (one of many) defining 'performance'. So a 50% increase in FPS will not increase performance by 50%.
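To illustrate why average FPS alone doesn't define performance, here is a minimal sketch (the frame times are made up for illustration): two hypothetical runs with identical average FPS but very different frame pacing.

```python
def avg_fps(frame_times_ms):
    """Average FPS over a run: frame count divided by total time."""
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

def worst_frame_fps(frame_times_ms):
    """Instantaneous FPS of the slowest frame (a stutter indicator)."""
    return 1000.0 / max(frame_times_ms)

smooth  = [20.0] * 10            # steady 20 ms per frame
stutter = [10.0] * 9 + [110.0]   # fast frames plus one big hitch

print(avg_fps(smooth))           # 50.0
print(avg_fps(stutter))          # 50.0 -- identical average FPS
print(worst_frame_fps(smooth))   # 50.0
print(worst_frame_fps(stutter))  # ~9.1 -- a visible stutter
```

Both runs report 50 average FPS, yet one of them hitches badly; that is why a single FPS number (or a +50% on it) says little on its own.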

Quote Originally Posted by artivision View Post
My opinion is this:

NV-Kepler= 3.2_Tflops@64bit_(Intel comparison) = 6.4@32bit_(AMD comparison) = 9.6@Fmac=trioperant_(AMD HD2000-6000, G80-300, PS3, XBOX360 comparison).

AMD-HD7000= 3.8_Tflops@32bit = 5.7@Fmac=trioperant.

Intel-4000= 170_Gflops@64bit = 340@32bit =510@Fmac=trioperant.
I have no idea what this is about.

Quote Originally Posted by artivision View Post
Also there is not an exact way to compare open source drivers with the closed ones, because the closed ones cheat.
The fact that drivers are closed source does not make them cheat by definition.

Quote Originally Posted by artivision View Post
If you ask me Intels_open and Intels_closed are equals. Also they share the same OpenGL code.
First you assert that closed source drivers cheat, and now you say that in the case of Intel they don't, because they share some rendering code.

I don't think that the DRM part of the Intel driver is shared with Windows.

Quote Originally Posted by artivision View Post
How the hell did some of you figure out that they are different? Make your brain think!
I just did, and you speak nonsense.