The Radeon RX Vega Performance With AMDGPU DRM-Next 4.21 vs. NVIDIA Linux Gaming

  • The Radeon RX Vega Performance With AMDGPU DRM-Next 4.21 vs. NVIDIA Linux Gaming

    Phoronix: The Radeon RX Vega Performance With AMDGPU DRM-Next 4.21 vs. NVIDIA Linux Gaming

    Given the AMDGPU changes building up in DRM-Next to premiere in Linux 4.21, on top of the AMDGPU performance boost already delivered with Linux 4.20, here are some benchmarks of Linux 4.19 vs. 4.20 Git vs. DRM-Next (Linux 4.21 material) with the Radeon RX Vega 64 compared to the relevant NVIDIA GeForce competition. (A minimal reproduction sketch follows below.)

    Phoronix, Linux Hardware Reviews, Linux hardware benchmarks, Linux server benchmarks, Linux benchmarking, Desktop Linux, Linux performance, Open Source graphics, Linux How To, Ubuntu benchmarks, Ubuntu hardware, Phoronix Test Suite
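    As an illustration only (nothing from the article itself), here is a minimal sketch of how a kernel-vs-kernel GPU comparison like this could be scripted around the Phoronix Test Suite. The test profile, the result labels, and the assumption that the kernel under test has already been booted are all placeholders:

```python
# Minimal sketch: after booting the kernel under test (4.19, 4.20 Git, or a
# DRM-Next build), run the same Phoronix Test Suite profile and label the
# results with that kernel so the runs can be compared afterwards.
import os
import platform
import subprocess

TEST_PROFILE = "pts/xonotic"  # placeholder choice of OpenGL game test

def run_for_current_kernel(result_name: str = "vega64-kernel-comparison") -> None:
    env = dict(os.environ)
    # PTS reads these environment variables to pre-seed the result
    # name/identifier prompts, so each run gets tagged with its kernel.
    env["TEST_RESULTS_NAME"] = result_name
    env["TEST_RESULTS_IDENTIFIER"] = platform.release()  # e.g. "4.20.0-rc5"
    subprocess.run(["phoronix-test-suite", "benchmark", TEST_PROFILE],
                   env=env, check=True)

if __name__ == "__main__":
    run_for_current_kernel()
```

    Assuming the usual PTS behavior of merging runs that share a result name, repeating this after booting each kernel should collect everything into one result file under ~/.phoronix-test-suite/test-results/ for side-by-side comparison.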

  • #2
    Typo:

    Originally posted by phoronix
    GTX 1070 now bto being faster than

    • #3
      Interesting, so now the Vega 64 sits between the 1070 Ti and the 1080 (but closer to the former than the latter).
      ## VGA ##
      AMD: X1950XTX, HD3870, HD5870
      Intel: GMA45, HD3000 (Core i5 2500K)

      • #4
        I've read somewhere that Nvidia's performance is often inflated by reduced quality of colors and textures. I.e., while AMD renders everything as prescribed, Nvidia cheats and skips computationally intensive parts, which produces a worse image but makes it look like the card is performing better. Is that correct?

        • #5
          I wish AMD would make a GPU with an insane number of compute units. I realize HBM2 is expensive as hell, but when you crack open a Vega card, there is enough space to solder a Pi motherboard onto it. They have the real estate. They just need to kick it up a notch.

          And with 7nm, there will be even more empty no man's land on top of what we see today. Their Radeon division really needs a proper roshambo, throat-grabbing, shin-kicking BFG card. From what I'm hearing, that isn't to be. But for once, just once, I would like to see them beat Nvidia in the performance metrics. F the cost.

          • #6
            Originally posted by ThoreauHD
            Their Radeon division really needs a proper roshambo, throat-grabbing, shin-kicking BFG card. From what I'm hearing, that isn't to be. But for once, just once, I would like to see them beat Nvidia in the performance metrics. F the cost.
            Major performance gains are expected only in their post-Navi architecture, which they dubbed "super-SIMD". It would be interesting to see how that works out.

            See https://forum.level1techs.com/t/supe...acement/132791

            • #7
              Originally posted by shmerl
              I've read somewhere that Nvidia's performance is often inflated by reduced quality of colors and textures. I.e., while AMD renders everything as prescribed, Nvidia cheats and skips computationally intensive parts, which produces a worse image but makes it look like the card is performing better. Is that correct?
              That is a Windows thing from 15 years ago. Nvidia’s first Direct3D 9 graphics card was the GeForce FX 5800 Ultra. They made it by bolting Direct3D 9 onto their GeForce 4 design in a way that was not performant, presumably under the assumption that it would never matter. When this made their hardware perform terribly, well before the GeForce 6 series was ready, they resorted to cheating to improve the performance of Direct3D games. As far as I know, the practice died out due to backlash. I have never heard of them doing this in their Linux drivers, or in OpenGL games for that matter.

              For what it is worth, ATI has been caught cheating on numerous occasions:

              https://forums.anandtech.com/threads...eating.638328/
              https://forums.anandtech.com/threads...-2003.1087045/
              http://www.tomshardware.com/forum/75...2001se-figures
              http://www.tomshardware.com/forum/80...aught-cheating
              http://www.tomshardware.com/forum/298524-33-cheating-benchmarks-degrading-game-quality-nvidia

              Anyway, all of this nonsense is Windows-specific. It was never a factor on Linux as far as I know. If anything, the lack of cheating on Linux is yet another reason why performance has always been lower under Wine than on Windows.
              Last edited by ryao; 02 December 2018, 03:38 PM.

              • #8
                Originally posted by ryao

                That is a Windows thing from 15 years ago. Nvidia’s first Direct3D 9 graphics card was the GeForce FX 5800 Ultra. They made it by bolting Direct3D 9 onto their GeForce 4 design in a way that was not performant, presumably under the assumption that it would never matter. When this made their hardware perform terribly, well before the GeForce 6 series was ready, they resorted to cheating to improve the performance of Direct3D games. As far as I know, the practice died out due to backlash. I have never heard of them doing this in their Linux drivers, or in OpenGL games for that matter.

                For what it is worth, I recall that ATI was caught cheating after their positions reversed when the GeForce 6 series was out. If you were to ignore the cheating and pick which of them renders things properly most often, that would be Nvidia. Their drivers have always been higher quality than ATI’s drivers.
                Has anyone performed tests recently based on image quality comparison rather than raw framerate? The Nvidia blob is shared between Windows and Linux, and such cheating could be one of the reasons they staunchly refuse to open up their drivers.

                • #9
                  Those are nice numbers for Vulkan!
                  Do other AMD GPUs benefit from this as well?

                  • #10
                    Originally posted by shmerl

                    Has anyone performed tests recently based on image quality comparison rather than raw framerate? The Nvidia blob is shared between Windows and Linux, and such cheating could be one of the reasons they staunchly refuse to open up their drivers.
                    I remember the driver's GUI panel having options for "Balanced", "Best Quality", and "Best Performance".

                    Is that what you're referring to? I feel like trying to compare the two would be of little use, which raises the question: if we have to try _that_ hard to see a difference, does it even matter? I think most would rather have the performance.

                    There is a similar argument about using High vs. Ultra presets in games: many cannot tell the difference between the two, but the frame rates differ drastically, as games tend to be better optimized for High.
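                    Purely as an illustration of the kind of image-quality test shmerl asked about above (nothing like it appears in the article), here is a minimal sketch that scores two screenshots of the same frame, one captured per driver, with SSIM. The file names and the choice of scikit-image are assumptions:

```python
# Minimal sketch: compare two captures of the same frame rendered by two
# different drivers and report a structural-similarity (SSIM) score.
# The file names below are placeholders, not real captures.
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity as ssim

def load_gray(path: str) -> np.ndarray:
    # Grayscale keeps the comparison to a single channel; float64 so the
    # SSIM windowing is computed in floating point.
    return np.asarray(Image.open(path).convert("L"), dtype=np.float64)

radeon = load_gray("radeonsi_frame0500.png")  # placeholder: Mesa driver capture
nvidia = load_gray("nvidia_frame0500.png")    # placeholder: NVIDIA blob capture

score = ssim(radeon, nvidia, data_range=255.0)  # 1.0 = structurally identical
print(f"SSIM between the two captures: {score:.4f}")
```

                    A score noticeably below 1.0 on frames captured at the same point would only be a starting point; per-pixel diffs or a perceptual metric would be needed to say anything conclusive about texture or color quality.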
