NVIDIA/AMD Memory Information Extensions For Mesa/Gallium3D

  • NVIDIA/AMD Memory Information Extensions For Mesa/Gallium3D

    Phoronix: NVIDIA/AMD Memory Information Extensions For Mesa/Gallium3D

    Marek Olšák's latest Mesa patch series is hooking up support for the vendor-based OpenGL memory information reporting extensions to the Mesa and Gallium3D drivers...
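
    For context, the extensions being wired up are GL_ATI_meminfo and GL_NVX_gpu_memory_info, which applications query through plain glGetIntegerv. A minimal sketch of such a query, assuming a current compatibility-profile GL context created elsewhere (the enum values below come from the published extension specs):

```c
/* Sketch: querying the vendor VRAM-info extensions these patches expose.
 * Assumes a current OpenGL context (created elsewhere, e.g. with GLFW/SDL). */
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* GL_NVX_gpu_memory_info (values reported in KB) */
#define GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX         0x9047
#define GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX   0x9048
#define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049

/* GL_ATI_meminfo (each query returns four KB values) */
#define GL_TEXTURE_FREE_MEMORY_ATI                      0x87FC

/* Naive substring check; a robust version would tokenize the string. */
static int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != NULL;
}

void print_vram_info(void)
{
    if (has_extension("GL_NVX_gpu_memory_info")) {
        GLint dedicated = 0, total = 0, current = 0;
        glGetIntegerv(GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX, &dedicated);
        glGetIntegerv(GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX, &total);
        glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &current);
        printf("NVX: %d MB dedicated, %d MB total, %d MB currently free\n",
               dedicated / 1024, total / 1024, current / 1024);
    }
    if (has_extension("GL_ATI_meminfo")) {
        /* [0] total free, [1] largest free block,
         * [2] total auxiliary free, [3] largest auxiliary block */
        GLint tex[4] = {0};
        glGetIntegerv(GL_TEXTURE_FREE_MEMORY_ATI, tex);
        printf("ATI: %d MB texture memory free (largest block %d MB)\n",
               tex[0] / 1024, tex[1] / 1024);
    }
}
```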


  • #2
    It's only done for R600 and later chips. Other drivers are unaffected, but they had to be modified so as not to expose the extensions.

    • #3
      Originally posted by marek View Post
      It's only done for R600 and later chips. Other drivers are unaffected, but they had to be modified so as not to expose the extensions.
      Whoops, fixed, thanks. I hadn't looked at the individual patches, just the overview, and saw those changes there.
      Michael Larabel
      https://www.michaellarabel.com/

      • #4
        I wonder if this is the extension that Steam (and, for instance, L4D2) uses to recognize what graphics card (and performance) I have. It has worked really poorly so far.

        • #5
          Originally posted by Azpegath View Post
          I wonder if this is the extension that Steam (and, for instance, L4D2) uses to recognize what graphics card (and performance) I have. It has worked really poorly so far.
          It doesn't work for Borderlands 2. Not sure about L4D2.

          • #6
            Originally posted by marek View Post

            It doesn't work for Borderlands 2. Not sure about L4D2.
            What about all the UE3-based games? None of them recognize how much VRAM the GPU has, so they set it too low; see https://www.reddit.com/r/linux_gamin...g_and_texture/
            Will this change anything?
            And is such a thing already working somehow with amdgpu? It might explain the performance gain of the 290 in BioShock Infinite when switching from radeon to amdgpu...
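
            One way to check what VRAM figure the driver already reports today, independent of these new GL extensions, is Mesa's GLX_MESA_query_renderer; a small sketch, assuming Mesa's GLX on an X11 system (compile with -lGL -lX11):

```c
/* Sketch: reading the driver-reported VRAM size via GLX_MESA_query_renderer.
 * Assumes Mesa's GLX on X11; compile with: cc vram.c -lGL -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

#ifndef GLX_RENDERER_VIDEO_MEMORY_MESA
#define GLX_RENDERER_VIDEO_MEMORY_MESA 0x8187
#endif

typedef Bool (*QueryRendererFn)(Display *, int, int, int, unsigned int *);

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    /* Resolved at runtime; NULL if the driver lacks the extension. */
    QueryRendererFn query = (QueryRendererFn)glXGetProcAddressARB(
        (const GLubyte *)"glXQueryRendererIntegerMESA");

    unsigned int vram_mb = 0;
    if (query && query(dpy, DefaultScreen(dpy), 0,
                       GLX_RENDERER_VIDEO_MEMORY_MESA, &vram_mb))
        printf("Driver-reported video memory: %u MB\n", vram_mb);
    else
        printf("GLX_MESA_query_renderer not available\n");

    XCloseDisplay(dpy);
    return 0;
}
```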

            • #7
              I don't think so. Applications can't tell the difference between radeon and amdgpu.

              • #8
                Is it possible that we get sub-par performance because the game thinks we only have, say, 256 MB of dedicated graphics memory while my card actually has 2 GB? I remember reading something like that regarding L4D2, where I had to change a setting to say that the game could use lots of memory. It would suck if the game actually streamed things to and from the card even though it doesn't have to.
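
                To illustrate that failure mode: a game that sizes its texture budget from the driver-reported VRAM falls back to a tiny default when no meminfo query is available. A purely hypothetical sketch (the function and the 256 MB fallback are illustrative, not from any real engine):

```c
/* Hypothetical: how a game might size its texture-streaming budget from
 * whatever VRAM figure the driver reports. Not from any actual engine. */
#include <stdio.h>

static int choose_texture_budget_mb(int reported_vram_mb)
{
    /* With no meminfo extension, a conservative fallback kicks in,
     * forcing aggressive streaming even on a 2 GB card. */
    if (reported_vram_mb <= 0)
        reported_vram_mb = 256;           /* assumed fallback value */
    return reported_vram_mb * 3 / 4;      /* leave headroom for buffers */
}

int main(void)
{
    printf("no report:  %d MB budget\n", choose_texture_budget_mb(0));    /* 192 */
    printf("2 GB card:  %d MB budget\n", choose_texture_budget_mb(2048)); /* 1536 */
    return 0;
}
```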
                Last edited by Azpegath; 03 February 2016, 08:57 AM.

                • #9
                  Originally posted by Azpegath View Post
                  I wonder if this is the extension that Steam (and, for instance, L4D2) uses to recognize what graphics card (and performance) I have. It has worked really poorly so far.
                  Does this answer your question: [embedded image]?

                  • #10
                    Originally posted by V10lator View Post
                    Does this answer your question: [embedded image]?
                    Not sure, depends on how much VRAM you really have ;-)
