
Thread: Gallium3D Gets New Geometry Shader Support

  1. #21
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,144

    Default

    Quote Originally Posted by yotambien View Post
    I guess it makes sense. But wouldn't a more straightforward explanation be that in the past there were no OpenGL drivers shipped by default with Windows XP? I think you only got them once you downloaded the drivers from the manufacturer's site, and probably many people who would potentially use Google Earth would not know anything about this. Nowadays Vista and W7 have OpenGL drivers included by default, so this would not be an issue on these systems.
    Could be. Then again, OpenGL seems to be the default setting, so maybe it's just a plain old driver issue as usual (it is OpenGL we are speaking about, after all).

    Vista/Win7 emulate OpenGL 1.4 via D3D by default. That's better than XP (OpenGL 1.1 without hardware acceleration), but still far from good. To get real OpenGL support, you still need to install an ICD driver from the IHV's website (Windows Update won't install OpenGL ICDs).
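
    A quick way to tell which of those you actually got (a minimal C sketch, assuming a current GL context; have_hardware_gl is a hypothetical helper name):

    Code:
    #include <stdio.h>
    #include <string.h>
    #include <windows.h>  /* must precede GL/gl.h on Windows */
    #include <GL/gl.h>

    /* Call with an OpenGL context already created and made current. */
    static int have_hardware_gl(void)
    {
        const char *version  = (const char *)glGetString(GL_VERSION);
        const char *renderer = (const char *)glGetString(GL_RENDERER);

        printf("GL_VERSION:  %s\n", version  ? version  : "(null)");
        printf("GL_RENDERER: %s\n", renderer ? renderer : "(null)");

        /* Microsoft's software fallback reports itself as "GDI Generic";
           anything else should be a vendor ICD. */
        return renderer != NULL && strstr(renderer, "GDI Generic") == NULL;
    }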

  2. #22
    Join Date
    Sep 2006
    Posts
    714

    Default

    Quote Originally Posted by dl.zerocool View Post
    Well, this is partially true. Even if 50% of the market is using Intel IGPs (which I don't really care about), if they have bad OpenGL support that's their problem to solve, not OpenGL's. Besides, if people are stupid enough to use their ultra-cheap solution, why should we care about them?
    50%? Bah.

    Try 80%.

    Intel's IGPs are the most popular video chipsets by a long shot. They dwarf Nvidia's market.


    Quote Originally Posted by yotambien View Post
    I guess it makes sense. But wouldn't a more straightforward explanation be that in the past there were no OpenGL drivers shipped by default with Windows XP? I think you only got them once you downloaded the drivers from the manufacturer's site, and probably many people who would potentially use Google Earth would not know anything about this. Nowadays Vista and W7 have OpenGL drivers included by default, so this would not be an issue on these systems.

    The reason DirectX gained popularity in the first place for games is because each vendor shipped a different OpenGL stack.

    This meant that developers had to troubleshoot and support no fewer than three different OpenGL implementations, each with its own quirks, limitations, and extensions.

    With DirectX they only had to support _one_ and that was Microsoft's. It does not really matter which API is better; they do just about the same thing. What matters is that they can always depend on DirectX working the same and being available.
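
    In practice that fragmentation meant probing each implementation at runtime before relying on anything beyond the baseline. A minimal GL 2.x-era sketch (has_extension is a hypothetical helper; GL_EXTENSIONS is the classic space-separated string):

    Code:
    #include <string.h>
    #include <GL/gl.h>

    /* Whole-token search: a bare strstr() would also match prefixes of
       longer extension names. */
    static int has_extension(const char *name)
    {
        const char *all = (const char *)glGetString(GL_EXTENSIONS);
        const char *p   = all;
        size_t len      = strlen(name);

        while (p != NULL && (p = strstr(p, name)) != NULL) {
            if ((p == all || p[-1] == ' ') &&
                (p[len] == ' ' || p[len] == '\0'))
                return 1;
            p += len;
        }
        return 0;
    }

    /* e.g. if (has_extension("GL_ARB_texture_non_power_of_two")) ... */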

  3. #23
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,144

    Default

    Quote Originally Posted by drag View Post
    The reason DirectX gained popularity in the first place for games is because each vendor shipped a different OpenGL stack.

    This meant that developers had to troubleshoot and support no fewer than three different OpenGL implementations, each with its own quirks, limitations, and extensions.

    With DirectX they only had to support _one_ and that was Microsoft's. It does not really matter which API is better; they do just about the same thing. What matters is that they can always depend on DirectX working the same and being available.
    Very true.

    The failure of the ARB to produce OpenGL 2.0 in time also played a significant role (if the ARB had followed 3Dlabs' vision for GL2.0 back then, the graphics programming world might be a completely different place nowadays). Not to mention the numerous screw-ups from the ARB/Khronos between GL2.1 and GL3.2: the 1+ year delay and the consequent letdown that was 3.0, the inability to promote essential extensions to core (like EXT_texture_filter_anisotropic), the inability to cater to developer needs (not being able to create a common binary shader format after 6+ years is simply ridiculous).
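
    (For what it's worth, the anisotropy extension is a two-line affair to use, which makes its absence from core all the stranger. A sketch, assuming GL_EXT_texture_filter_anisotropic was detected in the extension string first; enable_max_anisotropy is a hypothetical helper:)

    Code:
    #include <GL/gl.h>

    /* Tokens from EXT_texture_filter_anisotropic, in case the local
       headers don't define them. */
    #ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
    #define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
    #define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
    #endif

    static void enable_max_anisotropy(GLuint texture)
    {
        GLfloat max_aniso = 1.0f;

        /* Query the hardware limit, then apply it to this texture. */
        glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                        max_aniso);
    }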

    So now we are in the position where GL3.2 (barely) reaches feature parity with DX10, a three-year-old standard, when DX11 is already available. Even worse, Intel's support is somewhere between GL1.4 and 2.0; Apple is at 2.1; Linux is somewhere around ~1.3/~1.4 by default. This means you cannot even use the improved 3.x API without jumping through vendor-specific hoops.

    No wonder developers have flocked to D3D and XNA. As things stand, OpenGL doesn't really stand a chance of becoming mainstream again.

    /rant

  4. #24
    Join Date
    Aug 2009
    Posts
    2,264

    Default

    Quote Originally Posted by BlackStar View Post
    As things stand, OpenGL doesn't really stand a chance of becoming mainstream again. /rant
    Mac OS X is becoming increasingly popular (Linux too, though not as significantly as Mac OS X), and OpenGL is the only choice there. With Microsoft losing serious marketshare to Apple... I don't know. The days of Windows are numbered; DirectX will not survive, but how long that will take is anyone's guess...

  5. #25

    Default

    Ho! Ha! Nostradamus!

    To me, what you write sounds a bit pessimistic and unrealistic where Microsoft is concerned (note that I'm not defending them here, just reminding you of the reality).

  6. #26
    Join Date
    Jun 2006
    Posts
    3,046

    Default

    Quote Originally Posted by BlackStar View Post
    Well, it is, if we want to be honest with ourselves. OpenGL 3.2 manages to close the gap somewhat, but it's still far behind D3D10 as far as API design, ease of use and stability are concerned. (Yes, it offers more functionality in general, but (a) it's still missing binary shaders, (b) 50% of the market is using Intel IGPs, which are synonymous with "bad OpenGL support", and (c) our OSS drivers don't even support GL2.1, much less 3.2.)
    As for ease of use...heh... If you think coding for D3D is easy, I've got this bridge in Brooklyn to sell ya... Cheap price, even...

    The main reasons why D3D was adopted are as follows:

    1) Microsoft implemented what the studios were asking for, which may or may not be a good thing.

    2) The ARB was slow to implement functionality needed by game devs and more modern 3D applications. They were mired in the past with their CAD/scientific rendering heritage, which is why the 3.0 release was so disappointing. The CAD vendors don't want to overhaul their codebases and, by and large, haven't moved away from immediate-mode rendering.

    3) It's on Windows, and with 1 and 2 it became more of a moot point: target the predominant platform with its "native" rendering API first. If you can target OpenGL after the fact to pick up MacOS and other platforms, that's a win. Otherwise, you'll do fine all the same if you don't.

  7. #27
    Join Date
    Jun 2006
    Posts
    3,046

    Default

    Quote Originally Posted by BlackStar View Post
    Very true.

    The failure of the ARB to produce OpenGL 2.0 in time also played a significant role (if the ARB had followed 3Dlabs' vision for GL2.0 back then, the graphics programming world might be a completely different place nowadays). Not to mention the numerous screw-ups from the ARB/Khronos between GL2.1 and GL3.2: the 1+ year delay and the consequent letdown that was 3.0, the inability to promote essential extensions to core (like EXT_texture_filter_anisotropic), the inability to cater to developer needs (not being able to create a common binary shader format after 6+ years is simply ridiculous).

    So now we are in the position where GL3.2 (barely) reaches feature parity with DX10, a three-year-old standard, when DX11 is already available. Even worse, Intel's support is somewhere between GL1.4 and 2.0; Apple is at 2.1; Linux is somewhere around ~1.3/~1.4 by default. This means you cannot even use the improved 3.x API without jumping through vendor-specific hoops.

    No wonder developers have flocked to D3D and XNA. As things stand, OpenGL doesn't really stand a chance of becoming mainstream again.

    /rant
    I'd have to say that'd be my gripes on the matter...in the large. You don't need the extra stuff for much of what you're doing coding-wise, but you've got to be much more careful with your coding if you don't have those API edges to get the same or similar results.

    And it's not wholly what you're saying. Microsoft didn't make it easy to choose OpenGL at the time they came out with DX8 and beyond. At that point, it was, while painful to use, much more credible due to feature set and the "easy" choice on Windows.

  8. #28
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,144

    Default

    Quote Originally Posted by V!NCENT View Post
    Mac OS X is becoming increasingly popular (Linux too, but not as significant as Mac OS X)... OpenGL is the only choice here. With Microsoft losing crazy marketshare to Apple... I don't know. The days of Windows are numbered... DirectX will not survive, but how long that will last is as good as anyone's guess...
    Yes, but Mac OS X still doesn't support OpenGL 3.x even a year after its release, and Linux doesn't reliably support OpenGL 2.x six years after its release. With DirectX having the Xbox and pretty much every single non-indie game on its side, I just can't see DirectX being overturned by OpenGL any time soon.

    Were IHVs to get their acts together and provide OpenGL drivers you could count on (GL2.1 with GLSL 1.3 would be a very good baseline) and were Mac OS X and Linux to grab a 20-25% marketshare (so it would make sense for gamedevs to overcome the inertia of the win32/DX combo) I could see things starting to change. The synergy between desktop OpenGL, OpenGL|ES, WebGL and OpenCL would be too great to ignore then.

    However, something tells me we'd sooner see Larrabee sound the death knell of D3D/OGL than see that kind of thing come to pass.

  9. #29
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,144

    Default

    Quote Originally Posted by Svartalf View Post
    As for ease of use...heh... If you think coding for D3D is easy, I've got this bridge in Brooklyn to sell ya... Cheap price, even...
    Easy? Hell no, graphics programming ain't easy in any sense of the word.

    However, this doesn't change the fact that OpenGL is more painful than D3D by at least an order of magnitude:

    (a) bind-to-edit makes it very difficult to create performant, generic middleware in OpenGL (think e.g. XNA). You have to either restore state before and after every middleware call (killing performance), or trash state with every middleware call and reset it afterwards (killing performance), or tightly couple the middleware library to its consumer (losing generality); see the sketch after point (b).

    (b) not only do you have to work against the API (issue (a)), but you also have to work against the drivers. Nvidia is generally fine. AMD is generally fine, too (they have many bugs, but most are well documented with known workarounds). Intel is bad, really bad. S3's Chrome and the rest are simply non-existent (better to install software Mesa and run with that).
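
    Here is what the first option under (a) looks like in practice (a minimal sketch; middleware_set_filtering is a hypothetical middleware entry point):

    Code:
    #include <GL/gl.h>

    /* Bind-to-edit: glTexParameter* acts on whatever texture is bound,
       not on a handle you pass in, so generic middleware must save and
       restore the caller's binding around every edit. */
    void middleware_set_filtering(GLuint texture)
    {
        GLint previous = 0;

        /* The glGet* round-trip is the hidden cost. */
        glGetIntegerv(GL_TEXTURE_BINDING_2D, &previous);

        glBindTexture(GL_TEXTURE_2D, texture);            /* bind to edit */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        glBindTexture(GL_TEXTURE_2D, (GLuint)previous);   /* restore */
    }

    (D3D10 sidesteps this pattern: state lives in objects you hand to the device, so middleware never has to guess what its caller had bound.)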

    Anecdote: I was asked to make a relatively simple 10Kloc GL3.x application run on GL2.x + Intel. I finished the 3.x -> 2.x transition in two days, and the result ran fine on older Nvidia and AMD cards. I spent the remaining two weeks working around Intel driver bugs, bogus shader errors, strange driver limitations (a DX10-capable card with a 2K texture size limit? why?), blue screens and missing features (no FBO blit?) before admitting defeat and simply dropping Intel from the supported list.

    (I would post a screenshot of the final result, but I don't have access to the dev machine right now. Intel's drivers somehow managed to render the correct geometry, but filled it with multicolored TV snow. Were the shaders reading from invalid/uninitialized samplers? That's what it looked like, yet the same code ran fine on Nvidia/AMD.)

    Personally, I wouldn't mind lower GL versions if the drivers were stable; I can work with that. What I can't work with are random driver bugs on perfectly valid code: if the driver reports glCompressedTexImage2D as supported, I expect that call to fill the texture with data or fail with an OpenGL error. Getting back random data or a blue screen is simply unacceptable.
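
    To make that contract concrete (a sketch, assuming headers that expose the GL 1.3 entry points; upload_dxt5 and the DXT5 format choice are illustrative, not from the post above):

    Code:
    #include <stdio.h>
    #include <GL/gl.h>

    /* Token from EXT_texture_compression_s3tc, in case the local headers
       don't define it. */
    #ifndef GL_COMPRESSED_RGBA_S3TC_DXT5_EXT
    #define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3
    #endif

    /* A conforming driver takes exactly one of two paths here: the data
       lands in the bound texture, or glGetError() reports why it didn't. */
    int upload_dxt5(GLsizei w, GLsizei h, GLsizei size, const void *data)
    {
        GLenum err;

        while (glGetError() != GL_NO_ERROR)
            ; /* drain any stale errors first */

        glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                               GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                               w, h, 0, size, data);

        err = glGetError();
        if (err != GL_NO_ERROR) {
            fprintf(stderr, "compressed upload failed: 0x%04x\n", err);
            return 0;
        }
        return 1; /* random data or a bluescreen is a third, invalid path */
    }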

    Quote Originally Posted by Svartalf View Post
    And it's not wholly what you're saying. Microsoft didn't make it easy to choose OpenGL at the time they came out with DX8 and beyond. At that point, it was, while painful to use, much more credible due to feature set and the "easy" choice on Windows.
    I'm not sure I can parse this correctly. When you say "it was ...", are you referring to OpenGL or D3D? (Myself, I first touched DX with version 9. I was an OpenGL-only guy before that, but DX9, for all its drawbacks, was much more stable at the time. DX9 offered FBOs and the FBOs worked. OpenGL offered FBOs, but the Moon, Sun, Jupiter and Carmack had to align before they'd work. ATi's OpenGL drivers didn't help, either - they are much better now.)
    Last edited by BlackStar; 12-26-2009 at 11:31 AM. Reason: sp

  10. #30
    Join Date
    Sep 2006
    Posts
    714

    Default

    Hopefully the Linux graphics situation will improve with Gallium.

    With Gallium you have a generic state tracker that behaves the same regardless of what hardware you're using. As long as the driver supports Gallium, the application sees a compatible implementation.

    Now, this may not hold for performance, but ideally we should only end up seeing two major OpenGL implementations:

    1. Proprietary Nvidia

    2. Open Source Nvidia, ATI, and Intel.

    (and 3. you're stupid for buying VIA graphics or anything else and expecting it to work well).

    This is not as good as the situation with DirectX and Windows, but it is much, much better than what we have to deal with now.
