I think it'll be much quicker to catch up to more recent OpenGL specs, probably because OpenGL 3 introduced the most features and some features of higher versions are already completed. It's not as if the Mesa developers limited themselves to implementing only the 3.0 feature set.
OpenGL isn't a horrible solution (Fortran and COBOL are); there are, however, people in a horrible mood.
People bitching about GL are generally those who are in a bad mood and need something to bitch about - the internet is the ideal place for it.
You didn't discover America by saying OpenGL needs to be upgraded - it has been upgraded through 3.0, 3.3 and 4.2, and all emotionally stable people like it a lot. The deprecation mechanism is also reasonable, because it doesn't involve reckless decisions. Bitching about threads is stupid, because under the hood it's still serialization, and people with brains just add a bit more code, as one does with all the other non-threaded solutions, of which there are plenty.
DX11 paradise? Hardly. I still have BIG issues with DX11-based games (Crysis 2 in DX11 mode on a GTX 560 Ti often fails, while in DX9 mode it worked fine).
Emotionally unstable people like to bitch about how DX11 solves almost all problems and how programmers just need to hit random keys on the keyboard to create cool stuff. Meanwhile I hardly notice the difference between DX9/10 and DX11 games - really - not to mention bugs in drivers, and that's while many emotionally unstable people like referring to DX11 as a bug-free solution.
Go take a hike.
GL 4.2 lets you accomplish every effect DX11 can. And it's not just me - Carmack also said the difference isn't big enough for him to move to DX.
I'm (still) learning GL and I like it. It's clean, it's fun. Handling textures is a little weird and it's a bit verbose, but Java is weirder and has a lot more deprecated stuff (I've done Java for 10 years; trust me, it has far more obsolete stuff than GL), and Java is certainly more verbose, yet it's still a success in critical markets. So GL is good enough and has enough market share.
Your big problem with GL sits about half a meter in front of the monitor and needs to get out and face the sunlight a bit.
Bitching about threads in GL is like bitching that the C/C++ backends of GTK aren't thread-safe either - that isn't a problem; I use pthreads with GTK - so what, did my brain blow up because I had to spend a few more brain cycles? Since when has using threads on your own become so hard that you have to bitch about it in the forums? I don't think you're that feeble-minded, I just think you need to take a hike.
You managed to mix emotions, a graphics API and programming languages that have nothing to do with the problem at hand, all in a single sentence. Congrats.
Your post shows tons of ignorance regarding the development of real-world applications and games. Please, instead of hiking so much, invest more time in reading about threading, OpenGL and Direct3D. Please also read up on why "serious" titles use an already-made engine instead of talking to OGL/D3D directly. Just because you can, doesn't mean it's the best choice. Just because OGL can render triangles or tessellate meshes, doesn't mean it does it in the best way.
As for your D3D11 games looking similar to D3D9 games... well, I can play Pong with the latest gfx cards too...
"What the hell" is proof of emotionally unstable behavior - as to "serious" people - I already said what Carmack thinks.
As for the tons of ignorance - show me effects that can't be done with GL 4.2 but can be done with DX11, or will you ignore this central point and continue bitching about threads?
As for threading - I'll repeat - since when is it so difficult that you have to bitch about it?
And your Pong example as a response to my Crysis 2 example (which, with the latest updates to the Nvidia driver and to the game itself, still sucks and even freezes despite being a "serious" title) is plain goofy, which proves that your emotions (and maybe your ignorance) are talking for you. Again: show me effects that DX11 can do that GL 4.2 can't.
Last edited by cl333r; 12-22-2011 at 06:24 AM.
@cl333r, please stop bullshitting and insulting people. I like elanthis' informative posts, and neither he nor I is arguing that you can't do with OGL the stuff you can do with D3D. It is just an outdated API. Yes, you can write your own thread management in C. Yes, you can write your own thread management in C++ - and yet they still evolved the C++ standard with C++11 to include native threading.
You sound as ignorant as Q, already.
PS: And Carmack said D3D was worse in the Quake times, but that it is now better than OGL, and that he would use it if it weren't for OGL's legacy of knowledge, development tools, etc.
I'm not ignorant, I'm just pointing out that something that isn't critical, doesn't make your program faster and doesn't allow you to do richer effects is indirectly advertised as if it did.
Going to DX11 is stupid because you have to stay in sync with Microsoft; if you don't, then you have to get a lot of people to agree on a single new standard (a fork of DX) - again, a crappy road to follow.
The best choice is obviously deprecating old stuff in GL and introducing new features - which is what GL is doing. It already deprecated the fixed-pipeline functions and introduced a lot of features, and rest assured more will follow.
Also, going to DX11 and dropping GL would take a lot of effort and over 10 years to accomplish (transitioning existing software/games/etc.), and it could fail in the end for various reasons, like not getting enough market/mind share or support from devs/corps/driver devs. In the meantime there might be a shift in hardware development that could require rewriting the API yet again. Trust me, given all the unknown variables in this case, improving GL in a reasonable fashion is the best solution.
And I'm not bullying anyone or anything, since I'm not calling anyone crazy names and I'm not using censored words either.
Last edited by cl333r; 12-22-2011 at 07:06 AM.
elanthis knows what he is talking about. His post sums up pretty much everything that is wrong with OpenGL. My experience is very similar.
1. OpenGL is a horrible API. Inconsistencies all over the place: glGenTextures vs glCreateShader vs glNewList; glBindTexture vs glUseProgram.
2. Its specifications are monstrous, inconsistent and downright buggy. There are features where no two vendors agree on the implementation, because the specs are self-contradicting. It is downright impossible to implement! Even trivial 7-year-old stuff like uniform arrays is implemented differently on ATI, Intel and Nvidia. Try this if you don't believe me: create a uniform array with length=1 and try to fill it on all three vendors. Go ahead!
For extra fun, try the same with a length=1 varying array.
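For anyone wanting to reproduce the experiment, the shader side is tiny - a hypothetical minimal fragment shader, nothing vendor-specific in it:

```glsl
// The experiment described above: a uniform array of length 1.
// The app fills it via glGetUniformLocation + glUniform1fv; whether
// the location must be queried as "weights[0]" or plain "weights"
// is precisely the kind of detail on which drivers have diverged.
#version 120

uniform float weights[1];

void main()
{
    gl_FragColor = vec4(weights[0]);
}
```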
3. OpenGL is designed by an inept committee with diverging visions for the future. GL3.0 was to be released alongside DX10, and the initial API drops showed a tremendous improvement over GL2.1. Then Khronos disappeared for a year (complete radio silence) and finally showed up the following summer with a different GL3.0 that was identical to GL2.1 plus extensions. Literally, they just folded GL2.1-level extensions into core and called that GL3.0. Oh yeah, and they added a deprecation model that's respected only by Apple (with a 3-year delay).
During this year of silence, the opengl.org forums reflected the community's reaction: at first, people were asking for information. Later they started becoming angry. Finally, they left - and most never returned. Opengl.org is now but a ghost of its former self.
4. OpenGL carries a 20-year-old legacy that has reached the point where it is impossible to add new features that follow GPU development. Compute shaders required a whole new API (OpenCL) that's completely independent from OpenGL.
5. D3D11 offers features that cannot be implemented in OpenGL 4.x, period. Proper threading, for instance. Refer to elanthis' post for more information.
Even something as simple as asynchronous resource creation becomes impossible in OpenGL. Some drivers use global locks (create a resource in thread #2, stall rendering in thread #1). Other drivers don't offer threading at all (create a resource in thread #2, crash). If you are lucky, this might work - if you aren't, you might use the resource before it is fully created (undefined results, have fun).
D3D11 is better than GL4.x in absolutely all regards, which is why there are no games written with GL3.x or 4.x in mind. They either use GL2.1 (e.g. Rage3d) or have moved to OpenGL ES 2.0, which is a cleaned-up version of GL2.1.
Last edited by BlackStar; 12-22-2011 at 07:19 AM.