The GPU Compute Performance From The NVIDIA GeForce GTX 680 To TITAN RTX

  • The GPU Compute Performance From The NVIDIA GeForce GTX 680 To TITAN RTX

    Phoronix: The GPU Compute Performance From The NVIDIA GeForce GTX 680 To TITAN RTX

    A few days back we posted initial Linux benchmarks of the NVIDIA TITAN RTX graphics card, the company's newest flagship Titan card, which began shipping a few days ago. That initial performance review included a look at TensorFlow performance and other compute tests along with some Vulkan Linux gaming benchmarks. This article looks at a more diverse range of GPU compute benchmarks, testing thirteen NVIDIA graphics cards going back to the GTX 680 Kepler days.

  • #2
    Typo:

    Originally posted by phoronix:
    WattsUp Pro pwoer meter.
    Originally posted by phoronix:
    FAHBench as the Folding@Home benchmark had a negligible performance difference compared to the RTX 2080 Ti,
    You meant to put TITAN RTX somewhere in that sentence.
    Last edited by tildearrow; 25 December 2018, 02:02 PM.

  • #3
    To be frank, the RTX Titan is a little underwhelming to me; coupled with the price tag, it's just not that good of a card.

  • #4
    Originally posted by pete910:
    To be frank, the RTX Titan is a little underwhelming to me; coupled with the price tag, it's just not that good of a card.
    It's not a card for consumers. It's for prosumers who need that much VRAM, or for AI training on the tensor cores using FP16 multiplies with FP32 accumulation (a mode whose rate is halved on GeForce Turing cards). Compared to the similarly specced Quadro RTX 6000, these cards are cheap for those tasks.
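    For reference, the mode described here (FP16 multiplies with FP32 accumulation on the tensor cores) is what cuBLAS exposes through cublasGemmEx. A minimal sketch of my own, not from the article, assuming a CUDA 10-era toolkit; the matrix size and scalars are arbitrary, and the buffers are left uninitialized since this only illustrates the call:

        // Tensor-core GEMM sketch: FP16 inputs, FP32 accumulation.
        // Build with: nvcc gemm_fp16.cu -lcublas
        #include <cublas_v2.h>
        #include <cuda_fp16.h>
        #include <cuda_runtime.h>

        int main() {
            const int n = 1024;                    // square matrices for simplicity
            __half *A, *B;                         // FP16 inputs
            float  *C;                             // FP32 output/accumulator
            cudaMalloc(&A, n * n * sizeof(__half));
            cudaMalloc(&B, n * n * sizeof(__half));
            cudaMalloc(&C, n * n * sizeof(float));

            cublasHandle_t handle;
            cublasCreate(&handle);
            // Permit tensor-core math paths.
            cublasSetMathMode(handle, CUBLAS_TENSOR_OP_MATH);

            const float alpha = 1.0f, beta = 0.0f; // scalars in the FP32 compute type
            // C = alpha*A*B + beta*C: FP16 multiplies, FP32 accumulate.
            cublasGemmEx(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                         &alpha, A, CUDA_R_16F, n,
                                 B, CUDA_R_16F, n,
                         &beta,  C, CUDA_R_32F, n,
                         CUDA_R_32F, CUBLAS_GEMM_DEFAULT_TENSOR_OP);

            cublasDestroy(handle);
            cudaFree(A); cudaFree(B); cudaFree(C);
            return 0;
        }

    It is exactly this FP16-multiply/FP32-accumulate path that runs at half rate on the GeForce Turing cards but at full rate on the TITAN RTX and Quadro RTX 6000.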

  • #5
    Is NVIDIA intentionally gimping the performance of the 1080 and 1080 Ti to show the Turing cards in a good light? Or is it just a LuxMark thing? They were worse than, or barely on par with, the 1070.

    Edit: never mind, those are performance-per-Watt metrics.
    Last edited by reavertm; 26 December 2018, 02:31 PM.
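    To make the metric concrete, here is a toy calculation with made-up score and power figures (none of these numbers are from Michael's data) showing how a faster card can still land below a slower one on a per-Watt chart:

        // Hypothetical points-per-Watt comparison; all figures invented.
        #include <cstdio>

        int main() {
            const double gtx1070_score   = 100.0, gtx1070_watts   = 150.0;
            const double gtx1080ti_score = 170.0, gtx1080ti_watts = 280.0;

            // perf/W = benchmark score divided by measured power draw
            std::printf("GTX 1070    : %.2f points/Watt\n", gtx1070_score / gtx1070_watts);
            std::printf("GTX 1080 Ti : %.2f points/Watt\n", gtx1080ti_score / gtx1080ti_watts);
            // Faster in absolute terms (170 vs 100 points) yet lower per Watt
            // (0.61 vs 0.67), which is the effect noted above.
            return 0;
        }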

  • #6
    It would have been nice to include some mid-range cards such as the 750 Ti and 1050 Ti.

  • #7
    Interesting. Especially the performance difference between the 970 and the 1060, given that those two cards are marketed as having very similar performance.

  • #8
    Originally posted by hussam:
    It would have been nice to include some mid-range cards such as the 750 Ti and 1050 Ti.
    Just out of curiosity, I ran OctaneBench on my 1050 Ti and got 55.10 points. Compared to the 1060 6GB used in Michael's tests, you can imagine how the 1050 Ti's "performance" would look in the other tests.

  • #9
    Originally posted by c2p_:
    Just out of curiosity, I ran OctaneBench on my 1050 Ti and got 55.10 points. Compared to the 1060 6GB used in Michael's tests, you can imagine how the 1050 Ti's "performance" would look in the other tests.
    That's very low indeed.

  • #10
    The TITAN RTX should be twice as fast as the 2080 Ti at machine learning, but apart from that it looks about the same in other tasks.
