Intel Enables Tessellation Shader Support In Open-Source Linux Driver

  • Intel Enables Tessellation Shader Support In Open-Source Linux Driver

    Phoronix: Intel Enables Tessellation Shader Support In Open-Source Linux Driver

    As an exciting early Christmas present for Intel Linux users, ARB_tessellation_shader support has landed in Mesa Git, as needed by OpenGL 4.0...


  • #2
    Merry Christmas! (:
    Free Software Developer .:. Mesa and Xorg
    Opinions expressed in these forum posts are my own.

    Comment


  • #3
    Originally posted by siavashserver
    B..b...but I wished for a real eight-core desktop processor
    An 8-core desktop processor is so 2011.

    Comment


  • #4
    Originally posted by Kayden View Post
    Merry Christmas! (:
    Merry Christmas (or whatever it is you're celebrating), and thanks to all the folks who worked on this present!

    Comment


  • #5
    Originally posted by pal666 View Post
    An 8-core desktop processor is so 2011.

    I know, right? They've been selling 16-core / 32-thread processors since 2014.

    Comment


  • #6
    I get 2 FPS in Heaven with R600g and a dedicated card; I wonder how these IGPs perform.

    Comment


  • #7
    Originally posted by siavashserver
    B..b...but I wished for a real eight-core desktop processor :P Joking aside, how much do you think dynamic level of detail through tessellation will improve performance on Intel in real-world situations? (Higher vertex shader load for skinned animations and tiny triangles, vs. tessellating geometry on the fly.)
    I mean, it's an additional load on the GPU, so it's going to perform worse than without tessellation in any scenario.

    Comment


  • #8
    Originally posted by Ancurio View Post
    I mean, it's an additional load on the GPU, so it's going to perform worse than without tessellation in any scenario.
    Only benchmarking can tell. Remember that you're using less memory (and therefore less memory bandwidth), and can now fit more in the caches.

    Comment


  • #9
    Originally posted by Ancurio View Post
    I mean, it's an additional load on the GPU, so it's going to perform worse than without tessellation in any scenario.
    Not necessarily. Tessellation trades memory bandwidth for GPU compute resources. So if your application+hardware combination was bandwidth-limited and not otherwise fully utilizing the GPU, tessellation can improve performance by sending fewer vertices to the GPU while still retaining the same quality.

    Comment
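    The bandwidth trade-off described above can be sketched with quick arithmetic. This is a rough illustration, not real driver behavior; the grid size, tessellation level, and vertex format below are hypothetical choices for the example.

    ```python
    # Sketch of the bandwidth argument (hypothetical sizes): compare bytes of
    # vertex data uploaded for a pre-tessellated mesh vs. a coarse patch whose
    # interior vertices the GPU generates itself.

    def bytes_uploaded(vertex_count, floats_per_vertex=3, bytes_per_float=4):
        """Bytes of position data sent to the GPU for vertex_count vertices."""
        return vertex_count * floats_per_vertex * bytes_per_float

    # Pre-tessellated: a 65x65 vertex grid uploaded in full.
    dense = bytes_uploaded(65 * 65)

    # With tessellation: upload only the 4 corner control points of a quad
    # patch; a tessellation level of 64 lets the GPU generate the same
    # 65x65 grid on-chip.
    coarse = bytes_uploaded(4)

    print(dense, coarse)  # the dense upload is over 1000x larger
    ```

    Whether the saved bandwidth outweighs the extra tessellation-stage work depends entirely on where the workload was bottlenecked, which is why only benchmarking can settle it.
    
    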


  • #10
    Originally posted by Ancurio View Post
    I mean, it's an additional load on the GPU, so it's going to perform worse than without tessellation in any scenario.
    Except when you're using that strange thing called Catalyst: it scores higher in Heaven with extreme tessellation than with no tessellation at all. That's a driver issue, of course; technically you're absolutely right.

    Comment
