Qt 5.9 Beta Snapshot Released, Boasts Shader Binary Cache

  • Qt 5.9 Beta Snapshot Released, Boasts Shader Binary Cache

    Phoronix: Qt 5.9 Beta Snapshot Released, Boasts Shader Binary Cache

    The first Qt 5.9 tool-kit beta snapshot is now available for testing as the next feature release for this widely-used, cross-platform toolkit...


  • #2
What does the spec say about the binary format? Why don't the Mesa drivers have an implementation of this? bridgman marek BNieuwenhuizen

    • #3
      Originally posted by andrei_me View Post
What does the spec say about the binary format? Why don't the Mesa drivers have an implementation of this? bridgman marek BNieuwenhuizen
I believe the spec only requires support for the API entry points; it does NOT require the driver to actually expose a non-zero number of binary formats.

There has been talk on the mesa-dev list about using the shader cache recently added to Mesa to provide the binaries for GL_ARB_get_program_binary, but there was some indecision about how to represent the various GPUs/vendors supported by Mesa given the API from that extension. I wouldn't be surprised if someone hooked the two features up at some point, but it sounds like there are a few outstanding technical questions that need to be answered first.
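To make the point about zero binary formats concrete, here is a small illustrative sketch (not Qt's or Mesa's actual code; the GL call is stubbed with plain functions) of the check an application has to do before relying on GL_ARB_get_program_binary:

```python
# Before using GL_ARB_get_program_binary, an application should ask how
# many binary formats the driver exposes -- the extension allows this to
# be zero, which is what Mesa has historically reported.

GL_NUM_PROGRAM_BINARY_FORMATS = 0x87FE  # enum value from the extension spec

def program_binary_usable(gl_get_integerv):
    """Return True only if the driver exposes at least one binary format.

    `gl_get_integerv` stands in for glGetIntegerv; a real program would
    call the GL entry point through its bindings instead of a stub.
    """
    num_formats = gl_get_integerv(GL_NUM_PROGRAM_BINARY_FORMATS)
    return num_formats > 0

# Stub "drivers" for illustration: one reporting no formats (Mesa-like),
# one reporting a single supported binary format.
mesa_like = lambda pname: 0
binary_capable = lambda pname: 1

print(program_binary_usable(mesa_like))       # False -> compile from source
print(program_binary_usable(binary_capable))  # True  -> binary path usable
```

A caching layer that skips this query will break on drivers that advertise the extension's entry points but no formats.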

      • #4
        Originally posted by andrei_me View Post
What does the spec say about the binary format? Why don't the Mesa drivers have an implementation of this?
        It doesn't matter anymore. Since Mesa has its own shader cache, any shader cache outside of Mesa is redundant.

        • #5
          Originally posted by marek View Post

          It doesn't matter anymore. Since Mesa has its own shader cache, any shader cache outside of Mesa is redundant.
Well, Qt will be able to rely on that in 5 years' time, then... Live and die by LTS Linux

          • #6
            Originally posted by marek View Post
            It doesn't matter anymore. Since Mesa has its own shader cache, any shader cache outside of Mesa is redundant.
Actually, they talk about it in the article and show that they still get better performance with their own cache.

            • #7
              Originally posted by doom_Oo7 View Post

              Actually they talk about it in the article and show that they still get better performance with their own cache.
              They can't claim that because they can't test their cache with Mesa.
              • #8
                Originally posted by marek View Post

                They can't claim that because they can't test their cache with Mesa.
                They show a minor improvement with their own vs Nvidia shader cache on Windows. Not a lot, though, and the Mesa shader cache likely makes up most of the difference from non-cached shaders already.
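The reason an application-level cache like Qt's can coexist with (or duplicate) a driver cache comes down to how the key is built: a blob from glGetProgramBinary is only valid for the exact driver that produced it. A minimal sketch of that idea, with illustrative names that are not Qt's actual implementation and an in-memory dict standing in for the on-disk cache directory:

```python
# Sketch of an application-side shader binary cache: the key must cover
# the driver identity strings as well as the shader sources, so that a
# driver update invalidates old blobs simply by missing the lookup.
import hashlib

def cache_key(vertex_src, fragment_src, vendor, renderer, version):
    h = hashlib.sha1()
    for part in (vertex_src, fragment_src, vendor, renderer, version):
        h.update(part.encode("utf-8"))
        h.update(b"\x00")  # separator so adjacent fields can't run together
    return h.hexdigest()

class ShaderBinaryCache:
    """In-memory stand-in for an on-disk cache of program binaries."""
    def __init__(self):
        self._store = {}

    def load(self, key):
        return self._store.get(key)     # None -> compile from source instead

    def store(self, key, binary_blob):
        self._store[key] = binary_blob  # blob as from glGetProgramBinary

# A driver update changes the version string, so the old binary is not
# found and the program is recompiled (and re-cached) exactly once.
key_old = cache_key("void main(){}", "void main(){}",
                    "X.Org", "AMD Radeon", "17.0")
key_new = cache_key("void main(){}", "void main(){}",
                    "X.Org", "AMD Radeon", "17.1")
print(key_old != key_new)  # True
```

With a driver-side cache like Mesa's, the same compile work is skipped one layer down, which is why the two caches largely overlap in benefit.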
