Follow-Up Tests For DIRT Showdown On Linux With AMD vs. NVIDIA

  • Follow-Up Tests For DIRT Showdown On Linux With AMD vs. NVIDIA

    Phoronix: Follow-Up Tests For DIRT Showdown On Linux With AMD vs. NVIDIA

    Yesterday I published the results of a 14-way graphics card comparison of AMD vs. NVIDIA Linux performance in DiRT Showdown, the latest AAA game to be ported to Linux a few years after its Windows debut. The game was ported by Virtual Programming and uses their eON wrapper. This article has more AMD vs. NVIDIA GPU tests on Ubuntu Linux for this game with slightly more demanding settings, plus a look at CPU and GPU utilization.


  • #2
    14% CPU and 36% GPU usage on average on Catalyst... is that a bug or a feature?



    • #3
      Catalyst still has serious issues, but the GPU/CPU usage graph is really useful. Either the CPU or the GPU should be hitting near 100% usage.



      • #4
        Originally posted by jakubo View Post
        14% CPU and 36% GPU usage on average on Catalyst... is that a bug or a feature?
        If this is a feature, it is a rather strange one...
        I believe this eON system is an evolution of the Wine graphics translator, which is a rather disappointing way to port a game to Linux. But don't forget we are dealing with companies, and companies won't just remake a whole graphics engine to make a game run natively with OpenGL. It takes money and time to do that, and the roughly 5% of users worldwide who use Linux (I include the dual-boot ones) isn't a big audience.
        These tests just show how bad it is to port a game via a wrapper like this.
        Commercial software for Windows is still mostly compiled for 32-bit (remember Skype for Linux?), with many files in their newest programs dating from 2003 or earlier.
        I mean, they don't change that for software they sell, and we expect them to rewrite whole 3D engines... Anyone who thinks that will happen is definitely dreaming...
        Too bad for AMD and their blob right now, but I certainly like AMD more than NVIDIA and Intel graphics, because AMD has put a lot of effort into its four OSS drivers (r300, r600g, radeonSI, amdgpu) supporting Gallium3D, unlike Intel, and is generally trying to get rid of fglrx via this route too, so I will vote for AMD overall here.



        • #5
          Originally posted by djdoo View Post

          If this is a feature, it is a rather strange one...
          I believe this eON system is an evolution of the Wine graphics translator, which is a rather disappointing way to port a game to Linux. But don't forget we are dealing with companies, and companies won't just remake a whole graphics engine to make a game run natively with OpenGL. It takes money and time to do that, and the roughly 5% of users worldwide who use Linux (I include the dual-boot ones) isn't a big audience.
          These tests just show how bad it is to port a game via a wrapper like this.
          Commercial software for Windows is still mostly compiled for 32-bit (remember Skype for Linux?), with many files in their newest programs dating from 2003 or earlier.
          I mean, they don't change that for software they sell, and we expect them to rewrite whole 3D engines... Anyone who thinks that will happen is definitely dreaming...
          Too bad for AMD and their blob right now, but I certainly like AMD more than NVIDIA and Intel graphics, because AMD has put a lot of effort into its four OSS drivers (r300, r600g, radeonSI, amdgpu) supporting Gallium3D, unlike Intel, and is generally trying to get rid of fglrx via this route too, so I will vote for AMD overall here.
          Actually, the Linux Steam gaming market share is more like 0.9%. Even companies like Feral use compile-time libraries like IndirectX to help convert DirectX code to OpenGL.

          FGLRX is still behaving poorly, but eON games are mostly running well with the NVIDIA blob. We'll have to see what sort of performance is in store for us with the future AMDGPU kernel driver paired with the proprietary user-space blob. Furthermore, RadeonSI and R600g will become very viable once OpenGL 4.x support is patched up. However, I think some of the performance disparity has to do with AMD not putting as many dirty-hack optimizations in their driver code that are activated by game profiles. Linux Catalyst only has a few profiles, such as "hl2_linux", but NVIDIA likely has many more.
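Game profiles of this sort generally boil down to the driver matching the running executable's name against a table of per-game overrides. A toy Python sketch of the idea (only the "hl2_linux" profile name comes from the post above; the setting names and values are invented for illustration and are not actual Catalyst or NVIDIA driver internals):

```python
# Toy model of executable-name-keyed driver game profiles.
# Setting names/values are invented for illustration.

DEFAULTS = {"flip_queue_size": 2, "force_triple_buffering": False}

PROFILES = {
    # Hypothetical overrides applied when the driver sees a process
    # with this executable name.
    "hl2_linux": {"flip_queue_size": 3},
}

def settings_for(executable: str) -> dict:
    """Return driver defaults with any per-game overrides merged in."""
    merged = dict(DEFAULTS)
    merged.update(PROFILES.get(executable, {}))
    return merged

print(settings_for("hl2_linux"))     # profile override takes effect
print(settings_for("unknown_game"))  # falls back to plain defaults
```

The point is just that unprofiled games get generic defaults, which is one plausible reason a title like this behaves so differently across the two blobs.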



          • #6
            Originally posted by jakubo View Post
            14% CPU and 36% GPU usage on average on Catalyst... is that a bug or a feature?
            It always runs at around 60 FPS on average, on all the cards, at all resolutions, on any settings... so that might be a feature
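A steady ~60 FPS across every card does smell like a frame cap (vsync or an engine limit), and a cap would also explain the low utilization from post #2: the GPU idles once its work for each frame is done. A back-of-the-envelope sketch in Python (the 6 ms GPU frame time is an assumed value for illustration, not a measurement from the article):

```python
# With a ~60 FPS cap, each frame has a fixed time budget and the GPU
# sits idle after finishing its work for the frame.

FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame at 60 FPS

def utilization(busy_ms: float, budget_ms: float = FRAME_BUDGET_MS) -> float:
    """Fraction of each frame interval the unit spends doing work."""
    return busy_ms / budget_ms

# Assumed, illustrative frame time: a GPU that finishes its frame in
# ~6 ms idles for the remaining ~10.7 ms waiting on the cap.
print(f"GPU utilization: {utilization(6.0):.0%}")  # -> 36%
```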



            • #7
              Originally posted by Xaero_Vincent View Post

              Actually the Linux Steam gaming marketshare is more like 0.9%. Even companies like Feral use compile time libraries like IndirectX to help convert DirectX code to OpenGL.

              FGLRX is still behaving poorly but eON games are mostly running well with the Nvidia blob. We'll have to see what sort of performance is in store for us with future AMDGPU with proprietary user space blob. Furthermore, RadeonSI and r600 will become very viable once OpenGL 4.x support is patched up. However, I think some of the performance disparity has to do with AMD not putting as many dirty hack optimizations in their driver code that are activated by game profiles. Linux Catalyst only has a few profiles, such as "hl2_linux" but Nvidia likely has many more.
              The reality is that in high-level APIs, things are heavily connected and synchronous, while the driver has to track state. This means AMD lost this fight when they chose SIMD hardware for Shader Models 4 and 5: wide units cannot be filled efficiently, so you get lower FPS with lower power consumption on Catalyst, and even lower on both counts with Gallium. NVIDIA, with MIMD-style hardware, wins on both (see Gallium and the GT 650). Today only small mobile GPUs, with narrower SIMD units, reach MIMD-like efficiency, and only with tile-based drivers targeting a lower feature level like OpenGL ES 2/3. The situation can be resolved with low-level APIs, which are more asynchronous and efficient. Then profiles and hacks can probably become a thing of the past.
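Whatever one makes of the SIMD-vs-MIMD framing, the lane-occupancy part of the argument is easy to illustrate: work dispatched in batches smaller than the hardware's fixed group width leaves lanes idle, and the penalty grows with the width. A minimal sketch (the batch size and widths are arbitrary example numbers, not real hardware figures):

```python
import math

def lane_occupancy(work_items: int, group_width: int) -> float:
    """Fraction of lanes doing useful work when work_items are packed
    into fixed-width execution groups; a partial group still occupies
    a full group's worth of lanes."""
    groups = math.ceil(work_items / group_width)
    return work_items / (groups * group_width)

# The same small batch fills narrow units better than wide ones:
print(lane_occupancy(100, 16))  # narrower units, less waste
print(lane_occupancy(100, 64))  # wider units, more idle lanes
```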



              • #8
                Given that the GTX 970 is a $350 graphics card, the Radeon R9 Fury Linux performance is still miserable with that $550 graphics card having the same frame-rate.
                ... and by the same token the R9 285 Linux performance is pretty darned good with that $250 graphics card having the same frame-rate, right?



                • #9
                  And really, the R7 370, the oldest GCN part, is the fastest AMD card at Full HD... and here it plays nearly the same as the GTX 970. Should NVIDIA fix their drivers, or is the R7 370 worth its $150 price? That is the question. Or maybe the eON wrapper just isn't that good for comparisons.
                  Last edited by dungeon; 19 August 2015, 10:20 PM.



                  • #10
                    Originally posted by bridgman View Post

                    ... and by the same token the R9 285 Linux performance is pretty darned good with that $250 graphics card having the same frame-rate, right?
                    Or like the R9 285 for $175 on Newegg.

                    Some of these benchmark results are just plain bizarre, though. Like how the R9 290 performs better than the Fury? Wut?

