  • 1080p NVIDIA Linux Comparison From GeForce 8 To GeForce 900 Series

    Phoronix: 1080p NVIDIA Linux Comparison From GeForce 8 To GeForce 900 Series

    Earlier this week I carried out an OpenGL performance comparison of NVIDIA GPUs going back 10 years that included 27 different graphics cards from the GeForce 8 series through the latest-generation GeForce 900 Maxwell graphics cards. In this weekend article are some complementary tests from this comparison with the OpenGL benchmarks at 1920 x 1080.


  • #2
  • #2
    Not bad for my GeForce GTX 960. Although I have an AMD A8-7600 APU, my 960 4GB Mini is the best upgrade money can buy, so I would think my graphics card doesn't have to work so hard. I'm not sure whether The Witcher 3 is going to be punishing for both my CPU and GPU. Maybe in my next build I'll forgo the APU for a plain CPU plus a discrete graphics card, since I don't know if Blender and pro audio applications such as Ardour can take advantage of HSA. That said, HSA is brand new for now.

    In the meantime, I'm very happy with my Gigabyte GTX 960 4GB video card.



    • #3
      Mike, if you put that beer in front, could you ask the brewer to pay you for hidden advertising?



      • #4
        Interesting. A quick comparison shows the relative performance differences in these tests and the higher-resolution tests are a lot closer than I expected. That speaks well for the power of today's CPUs and the efficiency of NVIDIA's drivers. Thanks for the follow-up.



        • #5
          Now this comparison makes sense: no more 50x difference between the 9800 GTX and the 980 Ti; now it's 5 to 10 times, which makes perfect sense.



          • #6
            Originally posted by GraysonPeddie View Post
            Not bad for my GeForce GTX 960. Although I have an AMD A8-7600 APU, my 960 4GB Mini is the best upgrade money can buy, so I would think my graphics card doesn't have to work so hard. I'm not sure whether The Witcher 3 is going to be punishing for both my CPU and GPU. Maybe in my next build I'll forgo the APU for a plain CPU plus a discrete graphics card, since I don't know if Blender and pro audio applications such as Ardour can take advantage of HSA. That said, HSA is brand new for now.

            In the meantime, I'm very happy with my Gigabyte GTX 960 4GB video card.
            As you can see here: https://www.techpowerup.com/reviews/...tcher_3/3.html
            Witcher 3 is very taxing for a GTX 960, even with an almost top-notch CPU. It's one of the reasons I haven't played it even though I pre-ordered: I'm waiting for this year's crop of cards to get a little more than 30 FPS average on a mid-range card.



            • #7
              Originally posted by bug77 View Post
              As you can see here: https://www.techpowerup.com/reviews/...tcher_3/3.html
              Witcher 3 is very taxing for a GTX 960, even with an almost top-notch CPU. It's one of the reasons I haven't played it even though I pre-ordered: I'm waiting for this year's crop of cards to get a little more than 30 FPS average on a mid-range card.
              Quite frankly, Windows tests aren't applicable to Linux on non-native games.



              • #8
                Originally posted by tpruzina View Post

                Quite frankly, Windows tests aren't applicable to Linux on non-native games.
                Not directly, but I have yet to see such a title performing better on Linux than it does on Windows. On top of that, at least Nvidia offers basically the same performance on both platforms. So Windows tests at least set an upper bound for what you can expect on Linux.



                • #9
                  Originally posted by bug77 View Post

                  As you can see here: https://www.techpowerup.com/reviews/...tcher_3/3.html
                  Witcher 3 is very taxing for a GTX 960, even with an almost top-notch CPU. It's one of the reasons I haven't played it even though I pre-ordered: I'm waiting for this year's crop of cards to get a little more than 30 FPS average on a mid-range card.
                  Thanks.

                  I wish there were a GTX 960 4GB listed in there, but it would probably not make much of a difference. I'll have to wait for the next wave of GPUs to come out and spend $300 just so I can play The Witcher 3.



                  • #10
                    Originally posted by GraysonPeddie View Post
                    Thanks.

                    I wish there were a GTX 960 4GB listed in there, but it would probably not make much of a difference. I'll have to wait for the next wave of GPUs to come out and spend $300 just so I can play The Witcher 3.
                    Yup, VRAM is no issue here. As you can see, Witcher 3 doesn't need 2GB even at 4K.
                    I prefer to spend $200-250 on a video card, but would spend $300 if the card is right (I think the only time I did was for a GTX 260 from the first available batch).

