
Thread: A New Radeon Shader Compiler For Mesa


  1. #1
    Join Date
    Jan 2007
    Posts
    15,080

    Default A New Radeon Shader Compiler For Mesa

    Phoronix: A New Radeon Shader Compiler For Mesa

    While Gallium3D is gaining a lot of momentum and has picked up a number of new state trackers (OpenVG, OpenGL ES, and OpenCL, with OpenGL 3.1 coming soon) and features (e.g. network debugging support) in recent months, there is still a lot of work left before this architecture enters the limelight...

    http://www.phoronix.com/vr.php?view=NzQxMA

  2. #2
    Join Date
    May 2008
    Posts
    598

    Default

    Just wondering what the approximate performance penalty will be from having to go through Gallium3D?

    For Linux I'd say easier development comes before speed.

    Let's say a year from now: how fast/complicated will it be to add Gallium3D support for a new GPU?

  3. #3
    Join Date
    Jun 2009
    Location
    Elsewhere
    Posts
    90

    Default

    I would like to know where a shader compiler fits within the phases of game development. Does it have something to do with gaming at all, or is it more generic? Can a shader compiler be used to optimize 3D rendering for a particular game, or is it intended to optimize 3D rendering by the video driver only?

  4. #4
    Join Date
    Jan 2009
    Posts
    625

    Default

    Quote Originally Posted by VinzC View Post
    I would like to know where a shader compiler fits within the phases of game development. Does it have something to do with gaming at all, or is it more generic? Can a shader compiler be used to optimize 3D rendering for a particular game, or is it intended to optimize 3D rendering by the video driver only?
    Each game that uses shaders ships their source code, most commonly in textual form. That means the shaders need to be compiled by the driver and optimized specifically for your graphics hardware every time the game starts. This is quite common in the PC game industry. The aforementioned shader compiler does exactly that.
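
    To make that concrete, here is a rough sketch of the application side, assuming a desktop OpenGL 2.0+ context with the GL 2.0 entry points available (e.g. via GLEW); the shader text and function name are just illustrative, not taken from the article. The game hands the GLSL source to the driver, and the driver's shader compiler turns it into native GPU code at runtime:

    #include <GL/glew.h>   /* provides the GL 2.0 shader entry points */
    #include <stdio.h>

    /* The textual GLSL a game would ship; a trivial example shader. */
    static const char *frag_src =
        "void main() {\n"
        "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
        "}\n";

    /* Hand the source to the driver; its shader compiler does the rest. */
    GLuint compile_fragment_shader(void)
    {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &frag_src, NULL);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
        }
        return shader;
    }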

  5. #5
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146

    Default

    Quote Originally Posted by Louise View Post
    Just wondering what the approximate performance penalty will be from having to go through Gallium3D?

    For Linux I'd say easier development comes before speed.

    Let's say a year from now: how fast/complicated will it be to add Gallium3D support for a new GPU?
    Gallium3D should be faster than Mesa, so it's a step forward no matter how you look at it.

    No idea about the difficulty of adding new Gallium drivers. As a potential metric, try comparing the source size of the mesa-rewrite and R300 Gallium drivers.

    Quote Originally Posted by VinzC View Post
    I would like to know where a shader compiler fits within the phases of game development. Does it have something to do with gaming at all, or is it more generic? Can a shader compiler be used to optimize 3D rendering for a particular game, or is it intended to optimize 3D rendering by the video driver only?
    If I understand this correctly, the shader compiler is internal to the driver. Mesa and Gallium both compile shaders into some form of intermediate language, which is then compiled by the driver backend into native GPU binaries.

    Each Mesa driver seems to use a different intermediate language. Gallium uses a single language for all drivers. This announcement means that R300 Mesa and R300 Gallium will use the same intermediate language to simplify the lives of developers.

    I rather doubt the OSS drivers will ship any game-specific optimizations (unless some driver developer happens to be a WoW junkie, that is).

    Please correct me if I am wrong!

  6. #6
    Join Date
    May 2008
    Posts
    598

    Default

    Quote Originally Posted by BlackStar View Post
    Gallium3D should be faster than Mesa, so it's a step forward no matter how you look at it.
    Does this mean that Gallium can completely deprecate Mesa?

  7. #7
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146

    Default

    Quote Originally Posted by Louise View Post
    Does this mean that Gallium can completely deprecate Mesa?
    Gallium *is* Mesa (or part of Mesa to be exact). As far as I know, the old Mesa OpenGL stack will be replaced by Gallium, once the Gallium drivers are ready. The old stack will probably stay around for legacy purposes, but new drivers will probably target Gallium from the get-go.

    Quote Originally Posted by Eosie View Post
    Each game that uses shaders ships their source code, most commonly in textual form. That means the shaders need to be compiled by the driver and optimized specifically for your graphics hardware every time the game starts. This is quite common in the PC game industry. The aforementioned shader compiler does exactly that.
    That's not exactly true. Most games ship precompiled shaders, simply because shader compilation takes a *lot* of time.

    On the other hand, OpenGL does not support precompiled shaders, forcing OpenGL programs to ship with shaders in source form. Most OpenGL developers have been asking for precompiled shaders for *years* (think 2003), but it seems that IHVs haven't been able to decide on a common format.
    Last edited by BlackStar; 07-25-2009 at 12:05 PM.

  8. #8
    Join Date
    Jan 2009
    Posts
    625

    Default

    Quote Originally Posted by BlackStar View Post
    That's not exactly true. Most games ship precompiled shaders, simply because shader compilation takes a *lot* of time.

    On the other hand, OpenGL does not support precompiled shaders, forcing OpenGL programs to ship with shaders in source form. Most OpenGL developers have been asking for precompiled shaders for *years* (think 2003), but it seems that IHVs haven't been able to decide on a common format.
    I know that; I should have been clearer. (Not sure about "most games": I personally checked BioShock, and its shaders ship in HLSL, easily readable.) However, the DX driver must still do the chip-specific optimizations at runtime, because the DX binary shader is only an intermediate representation. On the other hand, OpenGL ES does have binary shaders through GL_OES_get_program_binary, where the binary format may be chip-specific.
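
    For the curious, a rough sketch of how that extension is used, assuming an OpenGL ES 2.0 context that advertises GL_OES_get_program_binary (the file handling and function name are purely illustrative; on most platforms the *OES entry points are fetched with eglGetProcAddress):

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>   /* defines GL_PROGRAM_BINARY_LENGTH_OES etc. */
    #include <stdio.h>
    #include <stdlib.h>

    /* Save a linked program's driver-specific binary so a later run can
     * reload it with glProgramBinaryOES and skip GLSL compilation. */
    void save_program_binary(GLuint prog, const char *path)
    {
        GLint len = 0;
        glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH_OES, &len);

        void *blob = malloc(len);
        GLenum format = 0;
        GLsizei written = 0;
        glGetProgramBinaryOES(prog, len, &written, &format, blob);

        FILE *f = fopen(path, "wb");
        fwrite(&format, sizeof(format), 1, f);  /* format is chip/driver specific */
        fwrite(blob, 1, (size_t)written, f);
        fclose(f);
        free(blob);
    }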

  9. #9
    Join Date
    Aug 2008
    Posts
    77

    Default

    Quote Originally Posted by BlackStar View Post
    On the other hand, OpenGL does not support precompiled shaders, forcing OpenGL programs to ship with shaders in source form. Most OpenGL developers have been asking for precompiled shaders for *years* (think 2003), but it seems that IHVs haven't been able to decide on a common format.
    In my opinion, not having a common binary format is an advantage. It means that
    a) IHVs can change the opcodes of their hardware to squeeze the most out of it, and
    b) as driver developers, we can use whatever representation we want internally.

    The only advantage of precompiled shaders is faster loading time. The same benefit can easily be achieved via a caching mechanism, so that's what ISVs should be asking for, if anything.
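
    To illustrate the idea (a minimal sketch, not a real driver API; compile_to_native() and the cache location are made up for illustration): key the cache on a hash of the shader source and only run the real compiler on a miss.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical stand-in for the driver's real shader compiler. */
    unsigned char *compile_to_native(const char *src, size_t *out_len);

    /* FNV-1a hash of the shader source text, used as the cache key. */
    static uint64_t hash_source(const char *src)
    {
        uint64_t h = 14695981039346656037ULL;
        while (*src) {
            h ^= (unsigned char)*src++;
            h *= 1099511628211ULL;
        }
        return h;
    }

    /* Return the native GPU code for 'src', compiling only on a cache miss. */
    unsigned char *get_native_shader(const char *src, size_t *out_len)
    {
        char path[256];
        snprintf(path, sizeof(path), "shader-cache/%016llx.bin",
                 (unsigned long long)hash_source(src));

        FILE *f = fopen(path, "rb");
        if (f) {                                   /* cache hit: skip compiling */
            fseek(f, 0, SEEK_END);
            *out_len = (size_t)ftell(f);
            rewind(f);
            unsigned char *bin = malloc(*out_len);
            fread(bin, 1, *out_len, f);
            fclose(f);
            return bin;
        }

        unsigned char *bin = compile_to_native(src, out_len);   /* slow path */
        f = fopen(path, "wb");
        if (f) { fwrite(bin, 1, *out_len, f); fclose(f); }
        return bin;
    }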

  10. #10
    Join Date
    Jun 2009
    Location
    Elsewhere
    Posts
    90

    Default

    Thanks for the info, BlackStar.
