
Thread: The Status Of Gallium3D Drivers, State Trackers

  1. #1
    Join Date
    Jan 2007
    Posts
    15,638

    Default The Status Of Gallium3D Drivers, State Trackers

    Phoronix: The Status Of Gallium3D Drivers, State Trackers

    With the official documentation for the Gallium3D driver architecture being a bit dated, Corbin Simpson (a student X.Org developer who has largely been working on the Gallium3D driver for ATI R300-class hardware) set out to improve the situation. The X.Org wiki now hosts a Gallium3D status page that shows the current status of Gallium3D state trackers and pipes...

    http://www.phoronix.com/vr.php?view=NzQzOA

  2. #2
    Join Date
    May 2008
    Posts
    598

    Default

    Wasn't VMware going to release a Gallium state tracker for ATI GPUs this summer?

  3. #3
    Join Date
    May 2008
    Posts
    598

    Default

    Mesa support for Cell:

    http://mesa3d.org/cell.html

  4. #4
    Join Date
    Aug 2007
    Posts
    153

    Default

    A few things.

    - There is a winsys for ATI/AMD GPUs; it's called radeon and it lives in src/gallium/winsys/radeon. glisse wrote it, I refined it. (A rough sketch of what a winsys provides follows this post.)

    - This matrix is still not filled out. r300g is probably the worst-off driver right now; softpipe is the best. I just didn't know the status of the other drivers. Jakob has already filled in i915, and I'm sure somebody can fill in the nouveau drivers.

    - A few of us have already started talking about how to deal with status updates and code changes, so the matrix will probably shift a lot in the next few weeks. This shouldn't be construed as a big amount of new development; it's just an effort to make sense of what has already been written.

    ~ C.
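
    For readers wondering what a winsys actually does: roughly, it's the glue between a hardware pipe driver and the OS/kernel side, covering buffer allocation, mapping, and command submission. A minimal sketch of that contract, assuming hypothetical simplified names (the real radeon winsys headers differ):

        /*
         * Illustrative sketch of what a Gallium winsys provides, with
         * hypothetical simplified names; this is NOT the actual
         * src/gallium/winsys/radeon interface. The winsys hides the
         * OS/kernel specifics (buffer management, command submission)
         * from the hardware pipe driver sitting on top of it.
         */

        struct example_winsys_buffer;  /* opaque GPU-visible buffer handle */

        struct example_winsys {
            /* Allocate a buffer the GPU can read and write. */
            struct example_winsys_buffer *(*buffer_create)(
                struct example_winsys *ws, unsigned size, unsigned alignment);

            /* Map the buffer into CPU address space so the driver can fill it. */
            void *(*buffer_map)(struct example_winsys *ws,
                                struct example_winsys_buffer *buf);

            void (*buffer_unmap)(struct example_winsys *ws,
                                 struct example_winsys_buffer *buf);

            /* Hand a finished command stream to the kernel (e.g. via DRM). */
            void (*submit_cmds)(struct example_winsys *ws,
                                const void *cmds, unsigned num_dwords);
        };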

  5. #5
    Join Date
    Feb 2007
    Posts
    87

    Default

    Is there a howto/tutorial/initial guide for writing drivers and state trackers for Gallium3D? Something that can guide new developers who aren't familiar with the architecture.

    I assume softpipe is a good start, but other than looking at the source code, are there any graphs/diagrams explaining the relationship between the components? I've only seen the high-level diagrams from the Akademy 2008 presentation on Gallium3D:
    http://akademy2008.kde.org/conferenc...kademy2008.pdf
    which is a good start, but not enough, I think.

    Such a tutorial should include instructions on how to set up Gallium3D (for instance, how do you use softpipe?), how to prepare the build environment, and the diagrams I mentioned above (with relevant text, of course). A quick way to check which renderer you actually ended up with is sketched below.

    What do you think? Too early for this? (APIs et al. still in flux)
    Last edited by ioannis; 08-08-2009 at 07:00 AM.
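
    Not a full tutorial, but one practical starting point: a minimal sketch (assuming a standard X11/GLX development setup; the file name and build line are mine) that creates a GL context and prints which renderer Mesa picked, so you can tell whether you're actually running on softpipe:

        /*
         * Minimal GLX smoke test: create a context and print the renderer.
         * If a Gallium softpipe build is installed, GL_RENDERER should
         * mention softpipe/Gallium.
         *
         * Build:  cc check_renderer.c -o check_renderer -lGL -lX11
         */
        #include <stdio.h>
        #include <X11/Xlib.h>
        #include <GL/gl.h>
        #include <GL/glx.h>

        int main(void)
        {
            Display *dpy = XOpenDisplay(NULL);
            if (!dpy) {
                fprintf(stderr, "cannot open display\n");
                return 1;
            }

            int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
            XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
            if (!vi) {
                fprintf(stderr, "no suitable GLX visual\n");
                return 1;
            }

            /* An unmapped window is enough to make a context current. */
            XSetWindowAttributes swa;
            swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                           vi->visual, AllocNone);
            swa.border_pixel = 0;
            Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen),
                                       0, 0, 16, 16, 0, vi->depth, InputOutput,
                                       vi->visual, CWColormap | CWBorderPixel,
                                       &swa);

            GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
            glXMakeCurrent(dpy, win, ctx);

            printf("GL_VENDOR:   %s\n", (const char *) glGetString(GL_VENDOR));
            printf("GL_RENDERER: %s\n", (const char *) glGetString(GL_RENDERER));
            printf("GL_VERSION:  %s\n", (const char *) glGetString(GL_VERSION));

            glXMakeCurrent(dpy, None, NULL);
            glXDestroyContext(dpy, ctx);
            XCloseDisplay(dpy);
            return 0;
        }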

  6. #6
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,806

    Default

    I'd be very interested to hear what slide 23 is actually about. IMO you could read it as saying that you could have Direct3D over Gallium3D on Linux (just as easily as OpenGL over Gallium3D on Windows); did Tungsten Graphics actually mean that in 2008?
    Never mind, this has been commented on elsewhere. Apparently the slide is misleading.
    Last edited by nanonyme; 08-08-2009 at 08:22 AM.

  7. #7
    Join Date
    Aug 2008
    Location
    California, USA
    Posts
    196

    Default

    Would this not theoretically mean we can get rid of WineD3D? Does this have the potential to yield higher performance than Microsoft's implementation?

  8. #8
    Join Date
    Dec 2007
    Location
    Germany
    Posts
    365

    Default

    Quote Originally Posted by wswartzendruber View Post
    Would this not theoretically mean we can get rid of WineD3D? Does this have the potential to yield higher performance than Microsoft's implementation?
    Native driver support will always be faster than any compatibility layer, so it would have the potential to yield higher performance.
    There are just two problems here:
    First, we have OpenGL 1.x, 2.x, and 3.x, all of which take a huge amount of time to implement, plus OpenCL and video decoding (and I've probably missed some other state trackers). Adding a D3D state tracker on top of that would make things even more difficult, and we just don't have enough developers (and the few developers we have don't have enough time) to do it. (The state-tracker pattern is sketched after this post.)
    Also keep in mind that the binary drivers are unlikely to ever implement native D3D support for Linux; the compatibility layer running on a binary driver's OpenGL implementation will still always be faster than any OSS D3D support (fine-tuning and such).
    The second problem is that Wine probably wouldn't use the native support anyway, for a number of reasons. For one, it would be hard to implement D3D support in a way that covers everything from D3D7 to D3D10. Also, the old code would still need to be kept around for compatibility, e.g. for non-Gallium3D OSS drivers and the binary blobs.

    The whole "implement D3D using Gallium3D" idea would basically just lead to more work for little (if any) performance gain; time would be better spent improving the current drivers and the compatibility layer.

  9. #9
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,806

    Default

    Quote Originally Posted by wswartzendruber View Post
    Would this not theoretically mean we can get rid of WineD3D? Does this have the potential to yield higher performance than Microsoft's implementation?
    Theoretically, at best the same. The slide deck claims the Windows driver stack is architecturally pretty much the same as what you would get if you had Gallium3D and implemented the D3D API on top of it (probably using Wine, since there are problems with D3D requiring parts of WinAPI, and graphics developers pretty much don't want a WinAPI implementation inside the graphics drivers).
    We can face the reality that we'll always want something like Wine on Linux, but exactly where this layer belongs is still a bit of an open question. Whether integrating more deeply into Gallium3D would bring benefits needs to be evaluated. (An HLSL->TGSI mapping would be especially interesting, since then you wouldn't have to do HLSL->GLSL->TGSI; I've been told GLSL and HLSL don't map onto each other optimally anyway, but TGSI is designed to be more generic, so it could actually work better there.)
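
    For reference, TGSI has a textual form. Here is a trivial pass-through vertex shader written from memory as an illustration (the exact dialect may vary between Mesa versions), embedded as a C string:

        /*
         * A trivial pass-through vertex shader in TGSI's textual form,
         * embedded as a C string. Written from memory as an illustration;
         * the exact dialect may differ between Mesa versions. The point is
         * that TGSI is a generic register-style IR, so a compiler could
         * target it from HLSL directly instead of detouring through GLSL.
         */
        static const char *passthrough_vs_tgsi =
            "VERT\n"
            "DCL IN[0]\n"              /* position attribute */
            "DCL IN[1]\n"              /* color attribute */
            "DCL OUT[0], POSITION\n"
            "DCL OUT[1], COLOR\n"
            "MOV OUT[0], IN[0]\n"      /* pass position through unchanged */
            "MOV OUT[1], IN[1]\n"      /* pass color through unchanged */
            "END\n";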
    Also, from the Wine wiki: "WineD3D eats a number of shader constants for emulating d3d/opengl differences. This causes issues for games that use the maximum number of shader constants (especially SM3.0 games). This causes issues on Geforce6/7 and Radeon X1*00 cards which offer 256 vertex constants of which Wine eats easily 20 and the games expect they can use all 256.."
    Last edited by nanonyme; 08-08-2009 at 10:33 AM.

  10. #10
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,806

    Default

    Quote Originally Posted by NeoBrain View Post
    Native driver support will always be faster than any compatibility layer, so it would have the potential to yield higher performance.
    You don't believe getting rid of an emulation level would yield higher performance? (as in, you wouldn't have to do D3D->OpenGL conversion anymore)
    Quote Originally Posted by NeoBrain View Post
    There are just two problems here:
    First, we have OpenGL 1.x, 2.x, and 3.x, all of which take a huge amount of time to implement, plus OpenCL and video decoding (and I've probably missed some other state trackers). Adding a D3D state tracker on top of that would make things even more difficult, and we just don't have enough developers (and the few developers we have don't have enough time) to do it.
    Are you aware that state trackers are separate from each other? It's mostly a matter of whether the Wine developers want to keep using OpenGL for D3D or not.
    Quote Originally Posted by NeoBrain View Post
    Also keep in mind that the binary drivers are unlikely to ever implement native D3D support for Linux; the compatibility layer running on a binary driver's OpenGL implementation will still always be faster than any OSS D3D support (fine-tuning and such).
    The second problem is that Wine probably wouldn't use the native support anyway, for a number of reasons. For one, it would be hard to implement D3D support in a way that covers everything from D3D7 to D3D10. Also, the old code would still need to be kept around for compatibility, e.g. for non-Gallium3D OSS drivers and the binary blobs.
    Right, so we can't go for the likely better implementation (not using OpenGL) just because the closed drivers aren't going to support it (they only want to use OpenGL)? Since when did Linux userland development start depending on the whims of proprietary driver coders? This is as close to native D3D as you can get.
    Last edited by nanonyme; 08-08-2009 at 11:00 AM.
