AMD Planning To Enable GLAMOR By Default For R600 & Newer GPUs

  • AMD Planning To Enable GLAMOR By Default For R600 & Newer GPUs

    Phoronix: AMD Planning To Enable GLAMOR By Default For R600 & Newer GPUs

    Currently the xf86-video-ati DDX driver only uses GLAMOR acceleration (2D via OpenGL) on GCN GPUs, for which no hardware-specific EXA 2D code paths are implemented. However, AMD developers are now planning to switch all R600 and newer GPUs over to GLAMOR by default...
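
    As a rough illustration of what the proposed default means in practice: with the radeon DDX the acceleration method can also be pinned explicitly in xorg.conf, so the change only affects setups that leave it unset. The snippet below is an illustrative sketch, not anything from the patch series; "glamor" and "EXA" are the values the driver's AccelMethod option accepts.

        # /etc/X11/xorg.conf.d/20-radeon.conf -- illustrative sketch
        Section "Device"
            Identifier "AMD Graphics"           # arbitrary name
            Driver     "radeon"                 # xf86-video-ati DDX
            # Pick the 2D acceleration path explicitly; omit the line to take
            # the driver default (GLAMOR on R600 and newer once this lands).
            Option     "AccelMethod" "glamor"   # or "EXA" on hardware with EXA paths
        EndSection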


  • #2
    Why exactly keep EXA? Or will it eventually be removed?

    Also I'm a little confused about something that's a bit multi-layered:
    Do all/some AMD GPUs have specific hardware for rendering 2D graphics, and if so, does GLAMOR take advantage of it or does it strictly work on the 3D engine?



    • #3
      With the EXA support not used, what advantage does xf86-video-ati have over the generic modesetting driver on GLAMOR-supporting hardware?
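
      To make the comparison concrete, the two setups being contrasted would be selected roughly like this (an illustrative sketch; the Identifier strings are arbitrary). The generic driver relies on GLAMOR for acceleration, while the radeon DDX adds driver-specific features on top, which is what the question is about:

          # xf86-video-ati DDX
          Section "Device"
              Identifier "AMD Graphics"
              Driver     "radeon"
          EndSection

          # generic modesetting driver shipped with the X server (uses GLAMOR)
          Section "Device"
              Identifier "AMD Graphics"
              Driver     "modesetting"
          EndSection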



      • #4
        Originally posted by schmidtbag
        Why exactly keep EXA? Or will it eventually be removed?

        Also I'm a little confused about something that's a bit multi-layered:
        Do all/some AMD GPUs have specific hardware for rendering 2D graphics, and if so, does GLAMOR take advantage of it or does it strictly work on the 3D engine?
        GPUs up to and including r5xx have a dedicated 2D block, but starting with r600 the 2D block is gone so 3D engine only.

        The r600/HD2900 (but not the rv6xx derivatives) had CP microcode to provide limited 2D HW emulation using the 3D engine, but AFAIK we never used that emulation capability on Linux. I don't think it was used much if at all on Windows either.


        • #5
          Originally posted by schmidtbag
          Why exactly keep EXA? Or will it eventually be removed?

          Also I'm a little confused about something that's a bit multi-layered:
          Do all/some AMD GPUs have specific hardware for rendering 2D graphics, and if so, does GLAMOR take advantage of it or does it strictly work on the 3D engine?
          GLAMOR only uses the 3D engine.

          GPUs before Southern Islands had dedicated 2D hardware, which is used by the EXA backend. It's faster than using GLAMOR, and the only option on very old cards.

          Since SI (HD 7xxx) there's no 2D hardware and only GLAMOR is used.

          edit: Bridgman's answer is almost certainly more accurate.
          Last edited by FLHerne; 21 November 2016, 11:32 AM.
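
          For anyone unsure which path a given machine actually ended up on, one rough way to check is the X server log after startup; both the EXA and GLAMOR initialisation paths print messages mentioning them, though the exact wording varies between driver versions:

              # illustrative check; the log may live elsewhere (e.g. ~/.local/share/xorg/)
              grep -iE 'glamor|exa' /var/log/Xorg.0.log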



          • #6
            Originally posted by bridgman

            GPUs up to and including r5xx have a dedicated 2D block, but starting with r600 the 2D block is gone so 3D engine only.

            The r600/HD2900 (but not the rv6xx derivatives) had CP microcode to provide limited 2D HW emulation using the 3D engine, but AFAIK we never used that emulation capability on Linux. I don't think it was used much if at all on Windows either.
            So now I'm confused; I thought GPUs up to and including NI had a 2D engine. Wrong?



            • #7
              Originally posted by duby229
              So now I'm confused; I thought GPUs up to and including NI had a 2D engine. Wrong?
              Right. Wrong.

              The 2D engine was removed between r5xx and r6xx. The shader core also moved from (vector+scalar) SIMD to VLIW SIMD.

              The big change between NI and SI was shader core moving from VLIW SIMD to scalar SIMD, so another shader compiler sea change. The shader core change also meant that we would have had to make major changes to EXA, which IIRC used hand-assembled shaders at the time, so replacing it with GLAMOR seemed like a good idea.

              I always have to be careful not to mix up the 5xx/6xx transition with the NI/SI transition... but pretty sure I didn't do it this time.
              Last edited by bridgman; 21 November 2016, 12:10 PM.


              • #8
                I admire the support for older cards. Gives me hope for my current GPU.



                • #9
                  R600 being touched? Impossible. What's next? Fedora 26 not being delayed at all?



                  • #10
                    Every time GLAMOR is discussed, I wonder if there is anything (any 2D operation, etc.) that simply can't be implemented efficiently with OpenGL.

                    I also wonder whether it would be useful to have a Gallium3D state tracker dedicated to 2D, so it could do a little more than the OpenGL state tracker can when needed (it probably wouldn't need to do everything OpenGL does). Then again, Mesa could always expose a new OpenGL extension for GLAMOR to use for speed-ups, so maybe these aren't issues at all.

                    I know fairly little about 3D or OpenGL programming, and I don't know if either of these was addressed at XDC2016, so I may be asking something that has already been answered...
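
                    To make the "everything via OpenGL" idea a bit more concrete: GLAMOR essentially turns 2D requests into textured-quad draws, with the per-pixel maths done by shaders and/or fixed-function blending. A toy fragment shader for a Porter-Duff "over" composite (premultiplied alpha) might look roughly like the sketch below. This is only an illustration of the principle, not GLAMOR's actual code; a real implementation would normally let the blender combine with the destination rather than sampling it as a texture.

                        #version 130
                        // Toy sketch: composite a source pixmap over a destination,
                        // both bound as textures, using Porter-Duff OVER with
                        // premultiplied alpha: result = S + (1 - S.a) * D
                        uniform sampler2D src;   // source pixmap
                        uniform sampler2D dst;   // copy of the destination contents
                        in vec2 texcoord;
                        out vec4 fragcolor;

                        void main()
                        {
                            vec4 s = texture(src, texcoord);
                            vec4 d = texture(dst, texcoord);
                            fragcolor = s + (1.0 - s.a) * d;
                        }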
