
NVIDIA GeForce RTX 2080 Ti Shows Very Strong Compute Performance Potential


    Phoronix: NVIDIA GeForce RTX 2080 Ti Shows Very Strong Compute Performance Potential

    Besides the new GeForce RTX 2080 series being attractive for developers wanting to make use of new technologies like RTX/ray-tracing, mesh shaders, and DLSS (Deep Learning Super Sampling), CUDA and OpenCL benchmarking so far on the GeForce RTX 2080 Ti is yielding impressive performance -- even outside of the obvious AI / deep learning potential workloads with the Turing tensor cores. Here are some benchmarks looking at the OpenCL/CUDA performance on the high-end Maxwell, Pascal, and Turing cards as well as an AMD Radeon RX Vega 64 for reference. System power consumption, performance-per-Watt, and performance-per-dollar metrics also round out this latest Ubuntu Linux GPU compute comparison.
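The performance-per-Watt and performance-per-dollar numbers mentioned in the article are straightforward to derive from raw benchmark results. Here is a minimal sketch of the calculation; the card names are real but the scores, power draws, and prices are made-up illustrative figures, not data from the article:

```python
# Hypothetical benchmark figures (illustrative only, NOT from the article):
# card -> (compute score, system power draw in watts, launch price in USD)
cards = {
    "GTX 1080 Ti": (100.0, 310, 699),
    "RTX 2080 Ti": (140.0, 330, 1199),  # assumed ~40% compute uplift
}

def perf_per_watt(score: float, watts: float) -> float:
    """Higher is better: benchmark score per watt of system power."""
    return score / watts

def perf_per_dollar(score: float, price: float) -> float:
    """Higher is better: benchmark score per dollar of launch price."""
    return score / price

for name, (score, watts, price) in cards.items():
    print(f"{name}: {perf_per_watt(score, watts):.3f} pts/W, "
          f"{perf_per_dollar(score, price):.4f} pts/$")
```

With these invented numbers the faster card wins on performance-per-Watt but loses on performance-per-dollar, which is exactly the kind of trade-off the value metrics in the article are meant to expose.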


  • #2
    Typos:

    Originally posted by phoronix View Post
    The Radeon RX Vega 64 was left out of this global ook due to
    Ook!

    Originally posted by phoronix View Post
    The $1199 cost is much more justifable if you will constantly



    • #3
      Very impressive considering the drivers are not even well optimized yet.



      • #4
        Originally posted by Zeioth View Post
        Very impressive considering the drivers are not even well optimized yet.
        Nvidia or AMD?
If Nvidia, I expect much better drivers a year from now, now that they're rewriting some of their driver components in Rust.



        • #5
          Originally posted by Zeioth View Post
          Very impressive considering the drivers are not even well optimized yet.
          Impressive indeed.
          This card would in fact be more justified for GPU computing than for gaming.

          ROCm first needs to sort out compatibility with the Linux environment, and then I hope it will be a contender once optimized.
          It is a big opportunity for AMD to gain market share in the low-to-mid-range market (where the majority of people put their money, and where the Polaris line of cards, the RX 5xx series, sits like a cherry on top of the cake... when supported), and to leap up to the datacenter, where GPU prices spike and the big companies sit.

          The market today is not about games anymore; it's a mix of games and GPU computing.



          • #6
            Originally posted by cl333r View Post

            Nvidia or AMD?
            If Nvidia, I expect much better drivers a year from now, now that they're rewriting some of their driver components in Rust.
            Oh yes! The magic elixir of Rust. It's not the actual piss-poor design of the algorithms, it's that goddamn C99 that's holding performance back.



            • #7
              Originally posted by cl333r View Post

              Nvidia or AMD?
              If Nvidia, I expect much better drivers a year from now, now that they're rewriting some of their driver components in Rust.
              Let's hope that helps cards from the last gen and a bit earlier, as I am still not that satisfied with performance in all games. Honestly, I'm going to have to say "but that's moot" to myself, because some of the reasons are port-based.
              Last edited by creative; 21 September 2018, 02:36 PM.



              • #8
                Wow, much like the Vega 64, the 2080 Ti is overpriced and overall underwhelming for gaming purposes, but it sure performs great for compute.
                It almost seems like this GPU should be part of the Titan series.



                • #9
                  Prelude to NVidia Deep Mining Super Sampling?



                  • #10
                    Finally, what I've been waiting for.

