Why would nVidia maintain a separate codebase for OS X, when most of it is the same as for the other OSes? (And if anyone thinks that Apple or Microsoft write their own drivers: n00bs)
So back to my question: why would Linux be any different from OS X, given that they both use the same OpenGL codebase?
Aside from that, Apple has some custom extensions (then again, so do X and Windows, in the form of GLX and WGL), and Apple wants to keep the OpenGL version the same across all vendors to avoid compatibility issues. So their OpenGL versioning is pretty much lowest-common-denominator. Even though nVidia and AMD have OpenGL 4.2 support in their codebases, they only expose version 3.2 on OS X (but they do expose their extensions).
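To make that concrete, here's a rough sketch of how an app can check what the OS X driver actually exposes. It assumes a 3.2 core profile context is already current, and print_gl_caps is just an illustrative name:

Code:
/* Sketch (OS X, 3.2 core profile context assumed to be current already):
   check which GL version and which extensions the vendor driver exposes. */
#include <stdio.h>
#include <OpenGL/gl3.h>   /* core profile header on OS X */

static void print_gl_caps(void)
{
    GLint major = 0, minor = 0, num_ext = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    glGetIntegerv(GL_NUM_EXTENSIONS, &num_ext);

    /* Prints 3.2 even on hardware whose Windows/Linux driver does 4.2,
       because Apple caps the exposed version. */
    printf("OpenGL %d.%d, renderer: %s\n",
           major, minor, (const char *)glGetString(GL_RENDERER));

    /* The vendor extensions are still exposed; in a core profile you
       enumerate them one by one with glGetStringi(). */
    for (GLint i = 0; i < num_ext; i++)
        printf("  %s\n", (const char *)glGetStringi(GL_EXTENSIONS, i));
}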
But again, that's still plenty for a 10-year-old DX9 game.
Originally Posted by Apple:
The common OpenGL framework layer is the software interface to the graphics hardware. This layer contains Apple's implementation of the OpenGL specification.

That's where the actual hardware-accelerated vendor driver plugs in, and that's what you'd normally be using under OS X. Apple does provide a software implementation as a fallback, but it is not suitable for playing games like L4D.
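In practice the plug-in model is invisible to the application: you ask CGL for an accelerated pixel format and whichever vendor renderer is installed gets loaded behind your back. A minimal sketch (OS X only, 10.7+; kCGLPFANoRecovery stops it from silently dropping to the software renderer):

Code:
/* Sketch: create an offscreen 3.2 core context through CGL and check
   which renderer actually got loaded. Build with: cc -framework OpenGL ... */
#include <stdio.h>
#include <OpenGL/OpenGL.h>   /* CGL */
#include <OpenGL/gl3.h>

int main(void)
{
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        kCGLPFAAccelerated,    /* require a hardware renderer */
        kCGLPFANoRecovery,     /* don't fall back to Apple's software renderer */
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attribs, &pix, &npix) != kCGLNoError || npix == 0) {
        fprintf(stderr, "no accelerated renderer available\n");
        return 1;
    }

    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);
    CGLSetCurrentContext(ctx);

    /* With the vendor plug-in loaded this reports the GPU;
       "Apple Software Renderer" would mean you hit the fallback. */
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(ctx);
    CGLDestroyPixelFormat(pix);
    return 0;
}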
They even drew you a picture of that layered architecture in their documentation, showing the vendor's hardware driver plugging in underneath the common OpenGL framework.
That should easily be portable to Linux.
But the biggest problem is that most Linux drivers don't make use of Gallium. Certainly not the only two drivers that really matter: AMD's and nVidia's binary drivers.
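For what it's worth, you can check which driver you actually ended up with on Linux by looking at the same strings: the Gallium-based Mesa drivers typically identify themselves in GL_RENDERER, while the binary drivers report NVIDIA/ATI as the vendor. A rough sketch (build with something like cc check.c -lGL -lX11):

Code:
/* Sketch: create a minimal GLX context and print the driver strings.
   NVIDIA's binary driver reports "NVIDIA Corporation", AMD's reports ATI/AMD,
   while Mesa/Gallium drivers usually show "Gallium" in GL_RENDERER. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/gl.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open X display\n"); return 1; }

    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { fprintf(stderr, "no suitable GLX visual\n"); return 1; }

    /* The window never gets mapped; it only exists so the context
       has a drawable with a matching visual. */
    XSetWindowAttributes swa;
    swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                   vi->visual, AllocNone);
    swa.border_pixel = 0;
    Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, 16, 16,
                               0, vi->depth, InputOutput, vi->visual,
                               CWColormap | CWBorderPixel, &swa);

    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    glXMakeCurrent(dpy, win, ctx);

    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("direct rendering: %s\n", glXIsDirect(dpy, ctx) ? "yes" : "no");

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    XDestroyWindow(dpy, win);
    XCloseDisplay(dpy);
    return 0;
}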