
Thread: A New Radeon Shader Compiler For Mesa

  1. #1
    Join Date
    Jan 2007
    Posts
    14,330

    Default A New Radeon Shader Compiler For Mesa

    Phoronix: A New Radeon Shader Compiler For Mesa

    While Gallium3D is gaining a lot of momentum and has picked up a number of new state trackers (OpenVG, OpenGL ES, and OpenCL and OpenGL 3.1 is coming soon) and features (i.e. network debugging support) in recent months, there is still a lot of work left before this architecture will enter the limelight...

    http://www.phoronix.com/vr.php?view=NzQxMA

  2. #2
    Join Date
    May 2008
    Posts
    598

    Default

    Just wondering what the approximate performance penalty will be from having to go through Gallium3D?

    For Linux, I'd say easier development comes before speed.

    Let's say a year from now: how fast/complicated will it be to add Gallium3D support for a new GPU?

  3. #3
    Join Date
    Jun 2009
    Location
    Elsewhere
    Posts
    89

    Default

    I would like to know where a shader compiler fits into the development of a game. Does it have anything to do with games at all, or is it more generic? Can a shader compiler be used to optimize 3D rendering for a particular game, or is it only meant to optimize the 3D rendering done by the video driver?

  4. #4
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,126

    Default

    Quote Originally Posted by Louise View Post
    Just wondering what the approximate performance penalty will be from having to go through Gallium3D?

    For Linux, I'd say easier development comes before speed.

    Let's say a year from now: how fast/complicated will it be to add Gallium3D support for a new GPU?
    Gallium3D should be faster than Mesa, so it's a step forward no matter how you look at it.

    No idea about the difficulty of adding new Gallium drivers. As a potential metric, try comparing the source size of the mesa-rewrite and R300 Gallium drivers.

    I would like to know where a shader compiler fits into the development of a game. Does it have anything to do with games at all, or is it more generic? Can a shader compiler be used to optimize 3D rendering for a particular game, or is it only meant to optimize the 3D rendering done by the video driver?
    If I understand this correctly, the shader compiler is internal to the driver. Mesa and Gallium both compile shaders into some form of intermediate language, which is then compiled by the driver backend into native GPU binaries.

    Each Mesa driver seems to use a different intermediate language. Gallium uses a single language for all drivers. This announcement means that R300 Mesa and R300 Gallium will use the same intermediate language to simplify the lives of developers.
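    To make that concrete, here is a minimal C sketch of the two-stage idea; the ir_instruction/emit_native names are made up for illustration and are not real Mesa symbols:

    Code:
    /* Hypothetical two-stage compile: the front-end produces a generic IR,
     * and the hardware backend lowers each IR instruction into native GPU
     * code.  None of these names are actual Mesa declarations. */
    enum ir_opcode { IR_MOV, IR_ADD, IR_MUL, IR_TEX };

    struct ir_instruction {
        enum ir_opcode op;
        int dst;        /* destination register index   */
        int src[3];     /* up to three source registers */
    };

    /* Backend hook: encode one IR instruction as the card's own instruction
     * word(s).  A real backend also handles register allocation, swizzles,
     * constant folding and so on. */
    static void emit_native(const struct ir_instruction *inst)
    {
        switch (inst->op) {
        case IR_ADD: /* emit an ALU add for this GPU */ break;
        case IR_TEX: /* emit a texture fetch         */ break;
        default:     /* ...remaining opcodes...      */ break;
        }
    }

    static void compile_to_native(const struct ir_instruction *ir, int count)
    {
        for (int i = 0; i < count; i++)
            emit_native(&ir[i]);
    }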

    I rather doubt the OSS drivers will ship any game-specific optimizations (unless some driver developer happens to be a WoW junkie, that is).

    Please correct me if I am wrong!

  5. #5
    Join Date
    May 2008
    Posts
    598

    Default

    Quote Originally Posted by BlackStar View Post
    Gallium3D should be faster than Mesa, so it's a step forward no matter how you look at it.
    Does this mean that Gallium can completely deprecate Mesa?

  6. #6
    Join Date
    Jun 2009
    Location
    Elsewhere
    Posts
    89

    Default

    Thanks for the info, BlackStar.

  7. #7
    Join Date
    Jan 2009
    Posts
    608

    Default

    Quote Originally Posted by VinzC View Post
    I would like to know where a shader compiler fits into the development of a game. Does it have anything to do with games at all, or is it more generic? Can a shader compiler be used to optimize 3D rendering for a particular game, or is it only meant to optimize the 3D rendering done by the video driver?
    Each game that uses shaders ships their source code, most commonly in textual form. That means the shaders need to be compiled by the driver, and optimized specifically for your graphics hardware, every time the game starts. This is quite common in the PC game industry. The aforementioned shader compiler does exactly that.
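    Roughly, from the application side it looks like the sketch below. The GLSL source is a trivial placeholder, but the gl* calls are the standard OpenGL 2.0 shader API that the driver's compiler sits behind (assuming a GL 2.0+ context and Mesa headers):

    Code:
    #define GL_GLEXT_PROTOTYPES 1   /* expose GL 2.0 prototypes in Mesa's gl.h */
    #include <GL/gl.h>
    #include <stdio.h>

    static const char *fragment_src =
        "void main() {\n"
        "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
        "}\n";

    GLuint build_fragment_shader(void)
    {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &fragment_src, NULL);
        glCompileShader(shader);   /* the driver compiles and optimizes here */

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
        }
        return shader;
    }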

  8. #8
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,126

    Default

    Quote Originally Posted by Louise View Post
    Does this mean that Gallium can completely deprecate Mesa?
    Gallium *is* Mesa (or part of Mesa to be exact). As far as I know, the old Mesa OpenGL stack will be replaced by Gallium, once the Gallium drivers are ready. The old stack will probably stay around for legacy purposes, but new drivers will probably target Gallium from the get-go.

    Quote Originally Posted by Eosie View Post
    Each game that uses shaders ships their source code, most commonly in textual form. That means the shaders need to be compiled by the driver, and optimized specifically for your graphics hardware, every time the game starts. This is quite common in the PC game industry. The aforementioned shader compiler does exactly that.
    That's not exactly true. Most games ship precompiled shaders, simply because shader compilation takes a *lot* of time.

    On the other hand, OpenGL does not support precompiled shaders, forcing OpenGL programs to ship with shaders in source form. Most OpenGL developers have been asking for precompiled shaders for *years* (think 2003), but it seems that IHVs haven't been able to decide on a common format.
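    For illustration only, the kind of API being asked for looks roughly like the sketch below. The glGetProgramBinary/glProgramBinary calls come from the ARB_get_program_binary extension, which was not available to OpenGL programs when this was written, and the blob it returns is still driver-specific rather than a common format:

    Code:
    #define GL_GLEXT_PROTOTYPES 1
    #include <GL/gl.h>
    #include <stdlib.h>

    /* Ask the driver for its own compiled form of a linked program, so it can
     * be cached on disk and reloaded without re-running the GLSL compiler. */
    void *save_program_binary(GLuint program, GLsizei *out_len, GLenum *out_format)
    {
        GLint len = 0;
        glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &len);

        void *blob = malloc(len);
        glGetProgramBinary(program, len, out_len, out_format, blob);
        return blob;   /* write this to disk; reload it on the next run */
    }

    void load_program_binary(GLuint program, GLenum format,
                             const void *blob, GLsizei len)
    {
        /* Skips GLSL parsing entirely; fails if the driver or GPU changed. */
        glProgramBinary(program, format, blob, len);
    }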
    Last edited by BlackStar; 07-25-2009 at 12:05 PM.

  9. #9
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,385

    Default

    Yep. In the case of Mesa, Gallium3D acts as a new internal API for hardware drivers. The "classic" HW driver API was fine when it was designed, but it became complicated over the years as GPUs evolved and the API was extended to support both old fixed-function chips and newer shader-based chips. By replacing the older HW driver API with Gallium3D, newer drivers can be written against a simpler and cleaner API.

    For "classic" mesa drivers, the API used a common IR for shader programs which looked very similar to the instructions used in the ARB_vertex_program / ARB_fragment_program extensions. If you scroll down to line 142 in the following link you can see the list of instructions passed to a Mesa driver under the pre-Gallium HW driver model, showing which instructions are needed for the older ARB and NV extensions, and which are needed for GLSL. In case you're wondering, yes the instructions needed for GLSL are harder to support than the ones which have already been implemented

    http://cgit.freedesktop.org/mesa/mes..._instruction.h

    Under a Gallium driver, shader programs are passed to the HW driver as TGSI instructions rather than the previous IR.

    EDIT - just read Nicolai's post - looks like the plan for now is to convert TGSI instructions into the prog_instruction set in the link above, allowing the existing shader compiler code to start supporting TGSI immediately. Nice.
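    Conceptually that conversion layer is just a walk over the TGSI instructions that emits the equivalent Mesa IR. The struct layouts and opcode names below are simplified stand-ins, not the actual Mesa/TGSI declarations:

    Code:
    /* Toy sketch of a TGSI -> prog_instruction bridge: translate each Gallium
     * (TGSI) instruction into the older Mesa IR so the existing R300 shader
     * compiler can keep working unchanged. */
    enum tgsi_op { TGSI_ADD, TGSI_MUL, TGSI_TEX };
    enum mesa_op { OPCODE_ADD, OPCODE_MUL, OPCODE_TEX };

    struct tgsi_inst { enum tgsi_op op; int dst; int src[3]; };
    struct prog_inst { enum mesa_op op; int dst; int src[3]; };

    static enum mesa_op translate_opcode(enum tgsi_op op)
    {
        switch (op) {
        case TGSI_MUL: return OPCODE_MUL;
        case TGSI_TEX: return OPCODE_TEX;
        case TGSI_ADD:
        default:       return OPCODE_ADD;
        }
    }

    static void tgsi_to_mesa(const struct tgsi_inst *in,
                             struct prog_inst *out, int count)
    {
        for (int i = 0; i < count; i++) {
            out[i].op  = translate_opcode(in[i].op);
            out[i].dst = in[i].dst;
            for (int s = 0; s < 3; s++)
                out[i].src[s] = in[i].src[s];
        }
    }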

    So... bottom line is that the current 3D stack is Mesa (about a million lines of code) running over per-GPU HW drivers via the "classic" HW driver API, which in turn run over libdrm and drm. The "new" 3D stack is Mesa running over per-GPU HW drivers via the Gallium3D HW driver API, again running over libdrm and drm.

    The neat thing about Gallium3D is that the API spec is less 3D-specific so it's easier to use the same drivers for other cool things like video and general purpose compute operations.
    Last edited by bridgman; 07-25-2009 at 12:29 PM.

  10. #10
    Join Date
    May 2008
    Posts
    598

    Default

    @BlackStar and bridgman: Sounds almost too good to be true
