Page 3 of 4 (Results 21 to 30 of 36)

Thread: Intel Hits "Almost There" GL 3.0 Support In Mesa

  1. #21
    Join Date
    Feb 2009
    Posts
    161

    Default

    "What the hell" comes from me not expecting such a misleading post after elanthis' very accurate post.
    But OK, I'll try to be a little more serious, if that's what makes you happy and proves my point to you.

    Please re-read what Carmack said. He did not mean D3D isn't good enough to justify a transition; he meant it would be too much hassle since he already has an established workflow around OGL. Please note that he also said he's loving the work he's doing on the x360.

    Application performance, while not theoretically tied directly to being OpenGL or Direct3D, still varies for the simple reason that Direct3D reflects the actual inner workings of the hardware more accurately, and driver developers pay much more attention to Direct3D for obvious reasons.

    Blaming D3D11 for Crysis 2 crashing/freezing is like blaming an operating system for closing an application that tries to dereference a null pointer.

    I don't know how you got from my Pong analogy to me being emotional, but I admire your creativity.

    Let me dumb it down for you: A game can use D3D11 and look like shit, that doesn't mean D3D11 is to blame.

    OpenGL has been playing catch-up with D3D for some years now. Khronos knows it, developers know it, 99% of the people in the technology world know it. Could it be that we're all wrong?

    Nobody ever said OpenGL couldn't show the same thing as D3D11. But there's more to a graphics API than meets the eye.

  2. #22
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by Drago View Post
    You sound as ignorant as Q, already.
    I'm the reverence of ignorance?

  3. #23
    Join Date
    Oct 2009
    Posts
    353

    Default

    Quote Originally Posted by mdias View Post
    "What the hell" comes from me not expecting such a misleading post after elanthis' very accurate post.
    But OK, I'll try to be a little more serious, if that's what makes you happy and proves my point to you.

    Please re-read what Carmack said. He did not mean D3D isn't good enough to justify a transition; he meant it would be too much hassle since he already has an established workflow around OGL. Please note that he also said he's loving the work he's doing on the x360.

    Application performance, while not theoretically tied directly to being OpenGL or Direct3D, still varies for the simple reason that Direct3D reflects the actual inner workings of the hardware more accurately, and driver developers pay much more attention to Direct3D for obvious reasons.

    Blaming D3D11 for Crysis 2 crashing/freezing is like blaming an operating system for closing an application that tries to dereference a null pointer.

    I don't know how you got from my Pong analogy to me being emotional, but I admire your creativity.

    Let me dumb it down for you: A game can use D3D11 and look like shit, that doesn't mean D3D11 is to blame.

    OpenGL has been playing catch-up with D3D for some years now. Khronos knows it, developers know it, 99% of the people in the technology world know it. Could it be that we're all wrong?

    Nobody ever said OpenGL couldn't show the same thing as D3D11. But there's more to a graphics API than meets the eye.
    bla bla.. I'm giving you an example of an AAA DX11 game with visual glitches that hangs sometimes - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
    Also, you didn't give any example of something that can be done with DX but can't be accomplished with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
    And you still want me to take you seriously after that?
    Last edited by cl333r; 12-22-2011 at 07:28 AM.

  4. #24
    Join Date
    Feb 2009
    Posts
    161

    Default

    Quote Originally Posted by cl333r View Post
    bla bla.. I'm giving you an example of an AAA DX11 game with visual glitches that hangs sometimes - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
    Also, you didn't give any example of something that can be done with DX but can't be accomplished with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
    And you still want me to take you seriously after that?
    Oh please, did you even read my post?

    You can't easily manage more than one GL context, you can't easily multithread, you don't have type safety or objects for resources, you have redundant API functions, you have a BLOATED API, and your documentation sucks compared to D3D's.

    Seriously, what kind of experience do you have with D3D?
    Why do you seem so emotionally attached to OGL?
    Can you list the advantages of OpenGL other than being cross-platform? It's supposedly all about openness, and yet so many of the things people complain about just stay the same.

  5. #25
    Join Date
    Jun 2011
    Posts
    109

    Default

    Quote Originally Posted by cl333r View Post
    bla bla..
    I thought you were complaining that people were using emotional responses...
    Quote Originally Posted by cl333r View Post
    I'm giving you an example of an AAA DX11 game with visual glitches that hangs sometimes - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
    Nothing in your example proved that DX11 was to blame, and nowhere did the post you replied to claim that it can't be a bug in DX11 - it just suggested the problem is much more likely to be elsewhere. Not all DX11 games have the bugs you describe.
    Quote Originally Posted by cl333r View Post
    Also, you didn't give any example of something that can be done with DX but can't be accomplished with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
    I'm pretty sure the "you can't do proper threading with OpenGL" example has been mentioned more than once.
    Quote Originally Posted by cl333r View Post
    And you still want me to take you seriously after that?
    I don't see how anyone could take you seriously after your last batch of comments even if they tried.

  6. #26
    Join Date
    Jun 2011
    Posts
    316

    Default

    Quote Originally Posted by eugeni_dodonov View Post
    I am actually unaware of any applications or games which require even GL 3.0, not to speak of GL 4.0, on Linux. So it is a bit of a chicken-vs-egg problem - I don't know if such applications do not exist because nobody writing games for Linux needs GL 3.0+-specific extensions, or if such developers are not writing games for Linux because general drivers lack those extensions.

    Or maybe GL 2.0+ is just enough for pretty much everyone these days.
    Every major company in the gaming industry has been writing games for DirectX 10/11 for years already. That corresponds to OpenGL 3.x and 4.x respectively.

    id Software wrote an OpenGL game called "RAGE", which requires OpenGL 3.3 as a minimum. id Software really didn't have any hope of releasing the game for Linux any time soon because of the lack of graphics driver support (and graphics driver performance problems). It was released for Mac OS X, Windows, and the PS3 game console. It's a GREAT game.

    It's not a chicken-or-egg problem... It's a question of whether or not Linux and its support libs and drivers have what it takes to be a serious gaming OS. If not, then these games just don't get released for Linux (Unreal Tournament 3, RAGE, etc.) even though they were written from the ground up in OpenGL. It's as simple as that. Game companies don't want to release a game on a platform where they think it's going to run like poo, because that can drag down the game's reputation and hurt sales on other platforms. Or worse, nobody will buy it after they went through the effort of porting it to Linux.

  7. #27
    Join Date
    Oct 2009
    Posts
    353

    Default

    Quote Originally Posted by mdias View Post
    Oh please, did you even read my post?

    You can't easily manage more than one GL context, you can't easily multithread, you don't have type safety or objects for resources, you have redundant API functions, you have a BLOATED API, and your documentation sucks compared to D3D's.

    Seriously, what kind of experience do you have with D3D?
    Why do you seem so emotionally attached to OGL?
    Can you list the advantages of OpenGL other than being cross-platform? It's supposedly all about openness, and yet so many of the things people complain about just stay the same.
    "Redundant API functions" and "bloated API" are pretty much the same thing, just like the term "legacy stuff": most of it has been deprecated, and what's still there will likely be fixed gradually. No revolutions, only positive evolution, like we witnessed with GL's transition from 2.1 to 4.2.

    I'm not emotionally attached to GL. I'm saying that cloning/using DX11 is likely to fail for the reasons I listed somewhere above, and that creating something new is too early - see the explanation below.

    GL needs to be fixed, but it's nowhere near as pressing as saying "screw it, we're gonna create a new standard or use DX11" - that's silly.

    However, I'm in favor of rewriting GL completely by the time the next-gen hw shows up (not "next-gen" as in marketing, but as in technology, some real breakthrough), which might happen within 5 to 15 years. So I'm not saying let's keep upgrading it forever; I'm just saying GL is good enough for the time being, and abandoning it over some non-critical issues is reckless.

    To me, a critical point is when we draw stuff not with triangles with textures slapped on them, but with real points/atoms/whatever, which I think might happen in 5 to 15 years. That would be a good enough reason to rewrite it, and DX11 might need a rewrite too. This way the industry isn't forced through an extra rewrite of the API.
    Last edited by cl333r; 12-22-2011 at 07:57 AM.

  8. #28
    Join Date
    Oct 2009
    Posts
    353

    Default

    Quote Originally Posted by gigaplex View Post
    I thought you were complaining that people were using emotional responses...

    Nowhere in your example proved that DX11 was to blame, and nowhere in the post you replied to claims that it can't be a bug in DX11, just suggesting that it's much more likely to be elsewhere. Not all DX11 games have the bugs you describe.

    I'm pretty sure the "you can't do proper threading with OpenGL" example has been mentioned more than once.

    I don't see how anyone could take you seriously after your last batch of comments even if they tried.
    Sure, not all DX11 games have bugs, but some do, and DX11 drivers do have bugs - don't you know that? Or did you think DX11 implementations are bug-free?
    "Proper threading" and "easy threading" are the same issue under different names: you can use threads in GL if you spend extra brain cycles, so it's an issue, but not a critical one. I hope you got it now.

  9. #29
    Join Date
    Jun 2011
    Posts
    316

    Default

    Quote Originally Posted by cl333r View Post
    Also, you didn't give any example of something that can be done with DX but can't be accomplished with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
    And you still want me to take you seriously after that?
    You're totally missing the point...
    If something takes 100 hours to do in OpenGL and 10 hours in DirectX, game companies aren't going to use OpenGL. It's that simple. Whether or not OpenGL can do things that DirectX can't is not the core problem with OpenGL. Driver support is the core problem with gaming on Linux, but that has nothing to do with OpenGL itself.



    Quote Originally Posted by cl333r View Post
    To me, a critical point is when we draw stuff not with triangles with textures slapped on them, but with real points/atoms/whatever, which I think might happen in 5 to 15 years. That would be a good enough reason to rewrite it, and DX11 might need a rewrite too. This way the industry isn't forced through an extra rewrite of the API.
    You mean where you supply a single very high-resolution 3D model and the graphics hardware automatically breaks it into pixels/surfaces based on distance, without the game creator having to manually swap between high/med/low versions of the mesh at fixed distances? Yeah, DirectX 11 and OpenGL 4.x already have that, and the hardware that does it (Fermi) has been out for 18 months. See the Unigine Heaven benchmark.

  10. #30
    Join Date
    Jun 2011
    Posts
    109

    Default

    Quote Originally Posted by cl333r View Post
    Sure, not all DX11 games have bugs, but some do, and DX11 drivers do have bugs - don't you know that? Or did you think DX11 implementations are bug-free?
    If you read my post again, you'd notice that I didn't say it can't be a bug in the drivers (and those belong to the hardware vendors, not to DX11 itself). I'm not sure how we got to blaming the API itself for application and driver bugs - if we're just tallying existing bugs and using them as proof of a flawed API, then the Mesa/Gallium3D code is really giving OpenGL a bad reputation.
    Quote Originally Posted by cl333r View Post
    "Proper threading" and "easy threading" are the same issue under different names: you can use threads in GL if you spend extra brain cycles, so it's an issue, but not a critical one. I hope you got it now.
    Yes, they are the same issue under different names; I didn't claim they were separate issues. It was one example, of which you claimed zero were given. And no, "using extra brain cycles" doesn't give OpenGL the same threading capacity. You'd be able to eke out some threading gains after significantly more development time, but OpenGL internally carries a pile of locks and global variables that cannot be worked around just by thinking harder. You can get the same pixel-by-pixel results from both (which is what users care about), but from a developer's perspective productivity matters, especially in commercial environments.
    Quote Originally Posted by Sidicas View Post
    You mean where you supply a single very high-resolution 3D model and the graphics hardware automatically breaks it into pixels/surfaces based on distance, without the game creator having to manually swap between high/med/low versions of the mesh at fixed distances? Yeah, DirectX 11 and OpenGL 4.x already have that, and the hardware that does it (Fermi) has been out for 18 months. See the Unigine Heaven benchmark.
    No, I think cl333r meant a completely different approach along the lines of voxels. What you described was tessellation, which is an enhancement that still uses triangles under the hood.
