
AMD Lands Experimental NIR Support In RadeonSI


  • AMD Lands Experimental NIR Support In RadeonSI

    Phoronix: AMD Lands Experimental NIR Support In RadeonSI

    Adding to the exciting morning about OpenGL 4.6 and the Radeon RX Vega launch is the massive patch series adding NIR support to RadeonSI having been merged to Mesa Git...


  • #2
    Ooh... New toy to play with.

    Does anyone know how well this integrates with the existing radeonsi/gallium threading/caching infrastructure? I'd assume that radeonsi's threaded compiles and binary caching would be OK, but since TGSI isn't being used when this is enabled, we'd want/need something like a NIR->LLVM or NIR->binary cache to replace what was there before (if it's not there already)?



    • #3
      Am I understanding this correctly if I assume that all the current functionality remains untouched and this is simply meant to enable SPIR-V (etc.) functionality already present in other drivers which produces NIR?
      Or is it another IR with some sort of optimization passes?



      • #4
        Changing from TGSI to NIR is not an easy task, because the existing OpenGL and D3D Gallium state trackers won't work with it. Now if someone wants to write a D3D9/11/12 frontend for NIR, that would be great.



        • #5
          Originally posted by GruenSein View Post
          Am I understanding this correctly if I assume that all the current functionality remains untouched and this is simply meant to enable SPIR-V (etc.) functionality already present in other drivers which produces NIR?
          Or is it another IR with some sort of optimization passes?
          I think the NIR path will only be taken when compiling SPIR-V; otherwise, for OGL and D3D code, it'll take the gallium path.



          • #6
            Originally posted by duby229 View Post

            I think the NIR path will only be taken when compiling SPIR-V; otherwise, for OGL and D3D code, it'll take the gallium path.
            By default, yes. But I believe if you set the right debug environment variables, you can at least get the OpenGL paths to go: GLSL -> NIR -> LLVM -> GCN binary. I recall a previous email where nhaehnle mentioned that he used that path for bringing up some of this functionality before he started working on wiring up the SPIR-V bits.

            That won't work for the NINE (or any other) gallium state tracker, since those rely on TGSI at the moment.

            There is a bit of uncertainty for the future though, some of the mobile chips (freedreno, vc4?) do translate TGSI to NIR as well before compiling down to binaries, so there is a potential future where some state trackers *could* have their TGSI translated to NIR and then fed into the appropriate back-end... I'm not saying it's likely, or even easy, but it could happen.
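            To illustrate the "right debug environment variables" bit above: radeonsi exposes its debug options through the R600_DEBUG environment variable, and a `nir` flag matching this patch series is my assumption of how the opt-in would look, so treat this as a sketch rather than a verified recipe:

            ```shell
            # Sketch, not verified: force radeonsi's NIR compile path for one GL app.
            # "nir" as the flag name is an assumption; check the driver source/docs.
            R600_DEBUG=nir glxgears

            # Mesa's debug variables conventionally accept "help" to list the
            # flags actually supported by your build:
            R600_DEBUG=help glxinfo > /dev/null
            ```

            Since it's an environment variable, it only affects the processes you launch with it set, so you can A/B the two paths without touching system-wide config.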



            • #7
              We are the NIR. You will be assimilated. Resistance is futile.

              (sorry, the heat's getting to me...)



              • #8
                Besides the SPIR-V part, what should be expected from the user point of view of this? Any performance benefit?



                • #9
                  Originally posted by andrei_me View Post
                  Besides the SPIR-V part, what should be expected from the user point of view of this? Any performance benefit?
                  Nah, it's still in the get-it-working phase. Ultimately, if they switch the GL pipeline over to go through NIR instead of TGSI, it could lead to somewhat lower driver overhead, but I wouldn't expect anything crazy.



                  • #10
                    Hmmm https://www.youtube.com/watch?v=sCuco1s-0nY
