
Thread: Gallium3D Gets New Geometry Shader Support

  1. #31
    Join Date
    Feb 2009
    Posts
    370

    Default

    Yeah, Gallium does look like the bee's knees. Can't wait to see it start to take over.

  2. #32
    Join Date
    Jan 2009
    Posts
    624

    Default

    Quote Originally Posted by BlackStar View Post
    You still need to install ICD drivers from the IHV's homepage (Windows Update won't install OpenGL ICDs).
    This is incorrect. I got a full OpenGL Catalyst driver from Windows Update on a mobile Radeon.


    Quote Originally Posted by drag View Post
    Hopefully the Linux graphics situation will improve with Gallium.
    No, it won't, and the original topic is a perfect example. It was decided by certain developers that all Gallium drivers must support geometry shaders. Of all the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software. I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.

    /rant

  3. #33
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146

    Default

    Quote Originally Posted by Eosie View Post
    This is incorrect. I got a full OpenGL Catalyst driver from Windows Update on a mobile Radeon.
    This has never happened on any of my Nvidia, ATI, or Intel systems. Additionally, I haven't been able to find any credible source that supports this. On the other hand, problems from missing ICDs are very common (the telltale sign is easy to detect from code, as sketched after the quote below):

    - a recent example on opengl.org

    - a discussion on XBMC

    etc etc

    Quote Originally Posted by XBMC discussion
    Yeah, Microsoft does not, and never has, distributed drivers with OpenGL ICDs.
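
    Incidentally, here is a minimal C sketch of that telltale sign. Without a vendor ICD, Windows falls back to Microsoft's software implementation, which reports itself as "GDI Generic" and only supports OpenGL 1.1. This assumes a GL context is already current; the helper name is made up for illustration:

        #include <string.h>
        #include <GL/gl.h>

        /* Returns 1 when the current context runs on Microsoft's software
         * fallback, i.e. no vendor ICD is installed. Assumes a GL context
         * has already been made current. */
        int using_software_fallback(void)
        {
            const char *renderer = (const char *) glGetString(GL_RENDERER);
            return renderer != NULL && strstr(renderer, "GDI Generic") != NULL;
        }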

    Quote Originally Posted by Eosie View Post
    No, it won't, and the original topic is a perfect example. It was decided by certain developers that all Gallium drivers must support geometry shaders. Of all the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software. I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.

    /rant
    And why do you think this is a problem? Instead of no geometry shaders, older cards will get geometry shaders emulated in software. This is an improvement - it will allow older cards to run software that they otherwise couldn't.

  4. #34
    Join Date
    Jan 2009
    Posts
    624

    Default

    Quote Originally Posted by BlackStar View Post
    older cards will get geometry shaders emulated in software
    Are you sure? I haven't said anything like that. There is a difference between "must" and "will". Thank god I didn't say drivers "must" support OpenGL 3.2. (and I do contribute code to Gallium, which is why I am concerned about it)

    Quote Originally Posted by BlackStar View Post
    And why do you think this is a problem?
    Already answered:
    I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.
    ~ Marek

  5. #35
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146

    Default

    Quote Originally Posted by Eosie View Post
    Are you sure? I haven't said anything like that. There is a difference between "must" and "will". Thank god I didn't say drivers "must" support OpenGL 3.2. (and I do contribute code to Gallium, which is why I am concerned about it)
    You most certainly did say "must" in your post:
    It was decided by certain developers that all Gallium drivers must support geometry shaders. Of all the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software
    And I most certainly didn't say anything about OpenGL 3.2 in my reply.

    I simply cannot see how a software fallback for geometry shaders could be a bad thing. As far as I can tell, this code can be shared between all drivers and the effort, non-trivial as it might be, will certainly help the OpenGL stack move forward as a whole (more so than, say, implementing geometry shaders for R600+).

    Do you have a link for the developer discussion on this topic?
    Last edited by BlackStar; 12-29-2009 at 07:21 AM. Reason: More context in the quote

  6. #36
    Join Date
    Jan 2009
    Posts
    624

    Default

    Quote Originally Posted by BlackStar View Post
    Do you have a link for the developer discussion on this topic?
    http://old.nabble.com/geometry-shadi...p26920366.html

    ~ Marek

  7. #37
    Join Date
    Sep 2008
    Location
    Netherlands
    Posts
    510

    Default

    Quote Originally Posted by BlackStar View Post
    You most certainly did say "must" in your post:
    I simply cannot see how a software fallback for geometry shaders could be a bad thing. As far as I can tell, this code can be shared between all drivers and the effort, non-trivial as it might be, will certainly help the OpenGL stack move forward as a whole (more so than, say, implementing geometry shaders for R600+).
    If it gets implemented it won't be a bad thing. I think Eosie was just concerned that nobody would care enough, leaving you with a broken driver.

    At any rate, can't every card which supports OpenCL also support any new kind of shader that Microsoft can come up with? I'm not completely sure, but isn't a modern graphics card just a ridiculously parallel pipelined processor without dedicated parts, making OpenGL and OpenCL just abstraction layers?

    For cards that don't support OpenCL, I don't think implementing geometry shaders would be very useful. They wouldn't be fast enough to run them at an acceptable framerate anyway. The same goes for any shader on cards that don't support GLSL: it would just kill performance. That may be why nobody cares about implementing it.

  8. #38
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146

    Default

    Thanks!

    Quote Originally Posted by Remco View Post
    If it gets implemented it won't be a bad thing. I think Eosie was just concerned that nobody would care enough, leaving you with a broken driver.
    Well, I would hope that a driver that doesn't support geometry shaders at all wouldn't advertise EXT_geometry_shader4 or ARB_geometry_shader4, meaning that nothing gets broken (correctly written programs must check for driver support before trying to use an extension).
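
    For completeness, a minimal sketch of such a check, using the classic GL_EXTENSIONS query that 2009-era code would use. It assumes a current GL context; the helper name is made up:

        #include <string.h>
        #include <GL/gl.h>

        /* Returns 1 if `name` appears as a whole token in the driver's
         * extension string. Assumes a current GL context. */
        int has_extension(const char *name)
        {
            const char *ext = (const char *) glGetString(GL_EXTENSIONS);
            const char *pos = ext;
            size_t len = strlen(name);

            while (pos != NULL && (pos = strstr(pos, name)) != NULL) {
                /* guard against matching a prefix of a longer extension name */
                if ((pos == ext || pos[-1] == ' ') &&
                    (pos[len] == ' ' || pos[len] == '\0'))
                    return 1;
                pos += len;
            }
            return 0;
        }

        /* Usage: take the geometry shader path only if it is advertised,
         * e.g. if (has_extension("GL_EXT_geometry_shader4")) { ... } */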

    Quote Originally Posted by Remco View Post
    At any rate, can't every card which supports OpenCL also support any new kind of shader that Microsoft can come up with? I'm not completely sure, but isn't a modern graphics card just a ridiculously parallel pipelined processor without dedicated parts, making OpenGL and OpenCL just abstraction layers?
    Not really. DX11 hardware requires a blend of new programmable and fixed-function functionality for its tessellation shaders that (as far as I can tell) cannot be emulated on older hardware. Additionally, there are new features that are simply impossible on DX10 and older cards: double-precision math, 16K textures, new compression formats (BC6/BC7), and a few more.

    Quote Originally Posted by Remco View Post
    For cards that don't support OpenCL, I don't think implementing geometry shaders would be very useful. They wouldn't be fast enough to run them at an acceptable framerate anyway. The same goes for any shader on cards that don't support GLSL: it would just kill performance. That may be why nobody cares about implementing it.
    The good thing about software emulation is that (a) you have a reference implementation to compare results with and (b) it allows people without DX10+ hardware to test and contribute code for newer features (e.g. help with the GL3.2 tracker, even without the hardware to back it).

    Note that many IGPs don't run vertex shaders in hardware, but they still manage to maintain acceptable performance for simple tasks. It's not too much of a stretch that geometry shaders might also perform adequately, given that the dedicated geometry shader hardware on many GPUs isn't terribly fast either.

    In any case, something is better than nothing. For one, I'd rather have Unigine Tropics run at 1 fps than not run at all. One small step at a time!

  9. #39
    Join Date
    Jan 2009
    Posts
    624

    Default

    Well, even the latest graphics hardware has dedicated fixed-function parts; some of them are:
    - rasterizer (comes before the pixel shader)
    - blender and output merger (comes after the pixel shader)
    - tessellator (between the hull and domain shaders)
    - texture units

    The first three are not accessible from OpenCL. Also, in my experience, hardware interfaces appear to be designed tightly around the major 3D and compute APIs. You can't schedule the shader cores directly, nor can you implement any other kind of shader the hardware wasn't designed for.

    ~ Marek
    Last edited by marek; 12-29-2009 at 10:37 AM.

  10. #40
    Join Date
    Oct 2008
    Posts
    3,149

    Default

    1. The whole reason they were talking about adding geometry shader support to all drivers was that the software support for it was already done. If the hardware doesn't support it, or no one has written the hardware support into the driver yet, it can automatically fall back to using a vertex shader plus shared routines in the draw module (see the sketch after this list).

    2. AFAIK, the original decision to add support to all the drivers was reversed, because some of the other developers didn't want to advertise support for a feature that would be so slow on their cards since it would have to use a software fallback.
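
    To illustrate point 1, a rough sketch of that dispatch. The names here are invented for illustration; the real Gallium driver and draw-module interfaces differ in detail:

        #include <stdio.h>

        struct gs_job { int dummy; };   /* stand-in for one GS invocation's state */

        /* Hypothetical capability query; a real driver reports this through
         * its screen interface. Hard-wired to 0 here to show the fallback. */
        static int gs_supported_in_hw(void)
        {
            return 0;   /* pretend this is one of the 8 drivers without GS hardware */
        }

        static void hw_run_gs(struct gs_job *job)
        {
            (void) job;
            puts("GS on dedicated hardware units");
        }

        /* Stands in for routing the job through the draw module's shared
         * software vertex-processing routines. */
        static void draw_run_gs(struct gs_job *job)
        {
            (void) job;
            puts("GS emulated with shared software routines");
        }

        void run_geometry_shader(struct gs_job *job)
        {
            if (gs_supported_in_hw())
                hw_run_gs(job);
            else
                draw_run_gs(job);       /* the automatic fallback from point 1 */
        }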
