Sorry for the nostalgia trip everyone.
Last edited by Hamish Wilson; 08-02-2012 at 10:23 AM.
Idiotic thing No1: "OpenGL isn't multi-threaded." It actually supports both multi-threading and super-threading: if you have, for example, a multi-level software rasterizer built on LLVM, you can run the graphics work on many different instruction sets, such as CPU cores and GPU cores, at the same time with just two back-ends; see the PlayStation 3 (6 SPEs / 1.8 Tflops used for graphics). Idiotic thing No2: "OpenGL is slower than Direct3D." There is only a small difference between them: OpenGL has extensions (ARB) to the protocol built into GPUs. Even if you don't use assembly and use GLSL instead, you will still be faster. Important thing No3: OpenGL is "open". So you can find it inside GPU drivers on all operating systems, and you can develop libraries for it: Imagination has OpenRL, while NVIDIA is developing a voxel ray-tracing solution for OpenGL (id Tech 6).
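The multi-threading point above can be sketched in plain C++ (a hypothetical illustration, not Valve's or LLVM's actual code): a software rasterizer can split a framebuffer's scanlines across CPU cores, and a real multi-back-end design would dispatch the same job description to GPU cores as well.

```cpp
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

// Hypothetical sketch: rasterize a solid rectangle in parallel,
// one band of scanlines per thread. Each thread owns its rows,
// so no synchronization is needed inside the loop.
void fill_rect_parallel(std::vector<uint32_t>& fb, int width, int height,
                        uint32_t color, unsigned num_threads) {
    std::vector<std::thread> workers;
    int band = (height + static_cast<int>(num_threads) - 1)
             / static_cast<int>(num_threads);
    for (unsigned t = 0; t < num_threads; ++t) {
        int y0 = static_cast<int>(t) * band;
        int y1 = std::min(height, y0 + band);
        workers.emplace_back([&fb, width, color, y0, y1] {
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    fb[y * width + x] = color;
        });
    }
    for (auto& w : workers) w.join();
}
```

This only shows the CPU side; the claim in the post is that with two back-ends the same work description could also be fed to GPU cores, as on the PS3's SPEs.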
Second, the Valve team isn't exactly a newbie in DX; they have basically used it from 2004 until now in their Source engine, so I don't think it's really that unoptimized, while they are newbies in the OpenGL arena. And while they do use OpenGL in their PS3 port, I wouldn't really say they could take optimizations from there and use them on the PC. They are different architectures that require quite different approaches.
Since any sane developer abstracts away the API, we can expect that internally it's one engine with two different renderer implementations (both DX and OpenGL). Technically they should yield the same performance, since only the back ends differ, but knowing that Source is an object-oriented C++ engine, we can expect it to perform worse in OpenGL, because OpenGL is a state machine that doesn't really mix well with an OOP approach (check out the performance differences between the DX and OpenGL renderers in OGRE and you'll see that OpenGL is a bit slower).
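A minimal sketch of what such an abstraction might look like (the names here are hypothetical, not Source's actual interfaces): the engine talks to one renderer interface, and only the back-ends differ. The GL back-end also shows the usual workaround for the state-machine/OOP mismatch, caching state so redundant state changes are skipped.

```cpp
#include <memory>
#include <string>

// Hypothetical renderer abstraction (not Source's real API):
// the engine only ever sees IRenderer.
struct IRenderer {
    virtual ~IRenderer() = default;
    virtual std::string name() const = 0;
    virtual void bindTexture(int id) = 0;
};

struct D3DRenderer : IRenderer {
    std::string name() const override { return "Direct3D"; }
    void bindTexture(int /*id*/) override { /* device->SetTexture(...) */ }
};

// The GL back-end wraps the global state machine and skips
// redundant binds -- a common way to fit a state machine
// behind an object-oriented interface.
struct GLRenderer : IRenderer {
    int boundTexture = -1;   // cached GL state
    int realBinds = 0;       // how many binds actually reached "GL"
    std::string name() const override { return "OpenGL"; }
    void bindTexture(int id) override {
        if (id == boundTexture) return;  // redundant state change, skip
        boundTexture = id;
        ++realBinds;                     // would call glBindTexture here
    }
};
```

The engine would then hold a `std::unique_ptr<IRenderer>` chosen at startup, which is roughly what "two renderer implementations behind one abstraction" means in practice.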
What I am trying to prove is that the OpenGL version should be slower than its DX counterpart, and since it isn't, it is probably the Linux kernel behaving better. I suspect that either the Linux video drivers are better optimized (unlikely), or that the Linux kernel, with its superior memory management and perhaps fewer layers than the Windows kernel, makes the difference.
We don't know under which conditions the benchmark was done on Linux, because a default Ubuntu 12.04 install with the NVIDIA blob performs much, much worse than Windows 7.
Also, a curious note: the reason why OpenGL is not widely used is that the Xbox alone sells more games than PC, Mac, Linux and similar platforms.
Last edited by narciso; 08-02-2012 at 12:24 PM.
The "consoles holding DX back" and "Xbox killing OGL" claims are both false, but repeat something enough...
I blame a combination of four factors for OGL becoming more or less a dead spec for PC gaming:
1: OGL 3.0 failing to deliver
2: Id Tech 4's steep HW requirements
3: Unreal 3 [seriously, how many major titles use Unreal these days?]
4: Poor driver support from all vendors except nVidia
In recent years even Autodesk has made the move from OpenGL towards Direct3D, and the main reason they cite is driver compatibility (http://archicad-talk.graphisoft.com/...lution_788.pdf)
The release of Rage also demonstrated the poor state of OpenGL drivers on Windows once again. The game just would not work properly on Radeons or Intel drivers. It worked on nVidia, but with bugs.
I had to try no less than 4 special 'Rage-optimized' beta drivers from AMD before my HD5770 could run Rage acceptably.
The reason for this is a bit chicken-and-egg, I suppose. Because the API is near-dead anyway, little OpenGL software is released, so OpenGL driver issues are neither found nor fixed.
Last edited by Scali; 08-02-2012 at 01:21 PM.
For those of you wondering if the extra performance could be due to the game looking shittier on Linux, the Valve Linux team says:
August 2, 2012 at 10:55 am
The image quality equals that seen on Windows with Direct3D.