NVIDIA vs. Radeon With HITMAN On Linux: CPU Usage, Memory Usage

  • NVIDIA vs. Radeon With HITMAN On Linux: CPU Usage, Memory Usage

    Phoronix: NVIDIA vs. Radeon With HITMAN On Linux: CPU Usage, Memory Usage

    With the competitive RadeonSI vs. NVIDIA performance for HITMAN on Linux there have been some Premium reader requests for also taking a look at the CPU/RAM usage and other vitals while running this latest Feral game port on the different GPUs/drivers...


  • #2
    Wow the 1080 is getting badly CPU bottlenecked. Seriously, that's just bad.



    • #3
      Michael:
      Your frametime graphs are almost unreadable, which is unfortunate since they should be more useful than the average frame rate. What if you cut the graph to show only the real data and ignore the spikes at the beginning? What we want to see here is the variance of the frame rate; ideally we should see the same "pattern" across different GPUs, proportional to their respective performance, like this random one from the Internet: http://www.guru3d.com/articles-pages...review,25.html (a rough sketch of what I mean is at the end of this post).

      Originally posted by gamerk2 View Post
      Wow the 1080 is getting badly CPU bottlenecked. Seriously, that's just bad.
      With only 67% utilization on such a fast CPU this game is clearly bloated. Way too much bloat…
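
      To make the variance point concrete, something like this rough sketch is what I have in mind; it just drops a warm-up window before plotting per-frame times (the file name and one-value-per-line format are made up for illustration):

      Code:
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical dump: one frame time in milliseconds per line.
frame_times = np.loadtxt("hitman_gtx1080_frametimes.txt")

# Skip the first N frames (level load / shader compilation spikes).
warmup = 100
steady = frame_times[warmup:]

plt.plot(steady, linewidth=0.5)
plt.xlabel("Frame")
plt.ylabel("Frame time (ms)")
plt.title("Frame times with warm-up frames excluded")
plt.show()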



      • #4
        Originally posted by efikkan View Post
        Michael:
        Your frametime graphs are almost unreadable, which is unfortunate since these should be more useful than average frame rate. What if you cut the graph to show the real data, and ignore the spikes in the beginning? What we want to see here is the variance of the frame rate. Ideally we should see the same "pattern" across different GPUs, proportional to their respective performance, like this random one from the Internet: http://www.guru3d.com/articles-pages...review,25.html


        With only 67% utilization on such a fast CPU this game is clearly bloated. Way too much bloat…
        Patches welcome if you have an idea how to display the data more effectively.
        Last edited by Michael; 23 February 2017, 02:28 PM.



        • #5
          Originally posted by Michael View Post

          Patches welcome if you have an idea how to display the data more effectively.
          I'm clueless on how to fix it, but there are many times when spikes and dips in the data skew your graphs. Sometimes, when a graph is obviously skewed, it makes sense to manually remove the spike or dip responsible. You don't have to, of course; you can keep posting articles with unreadable graphs.



          • #6
            Originally posted by duby229 View Post

            I'm clueless on how to fix it
            One fix could be to compute the average value and remove anything above 500% of it (or some other threshold; I'm just making something up here). Or, instead of removing points, simply zoom in and set the out-of-range data points to the maximum value on the y-axis.
            Something like this is done in many scientific papers to avoid focusing on statistical fluctuations.
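
            In numpy terms, a rough sketch of both variants (the file name is a placeholder and the 500% cutoff is still just a made-up threshold):

            Code:
import numpy as np

# Placeholder file: one frame time in milliseconds per line.
frame_times = np.loadtxt("frametimes.txt")
mean = frame_times.mean()

# Variant 1: drop everything above 500% of the average.
filtered = frame_times[frame_times <= 5 * mean]

# Variant 2: keep every sample but clamp outliers to the chosen y-axis maximum,
# so spikes show up as flat tops without stretching the scale.
y_max = 5 * mean
clamped = np.minimum(frame_times, y_max)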



            • #7
              Originally posted by TFA
              Unfortunately the Radeon GPU utilization as a percentage isn't exposed in an easy, out-of-the-box parseable method, so here's just the GPU usage difference for the GTX 980 and GTX 1080.
              apt-get install radeontop, though the version in Ubuntu 16.10 is too old for the RX 480. Also, you posted the 1080p Ultra GPU usage graph twice.
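
              For what it's worth, radeontop can dump samples to a file with -d (e.g. radeontop -d radeontop.log -l 120); here is a rough sketch of pulling the overall GPU busy percentage back out of such a dump (the exact dump line format differs between radeontop versions, so the regex is a guess):

              Code:
import re

gpu_busy = []
with open("radeontop.log") as f:  # dump written by radeontop -d
    for line in f:
        # Expect something like "... gpu 42.50%, ee 0.00%, ..." per sample line.
        match = re.search(r"gpu\s+([\d.]+)%", line)
        if match:
            gpu_busy.append(float(match.group(1)))

if gpu_busy:
    avg = sum(gpu_busy) / len(gpu_busy)
    print(f"{len(gpu_busy)} samples, average GPU busy: {avg:.1f}%")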



              • #8
                I'd say a good fix would be to exclude the first and last 0.5% of data (maybe more if necessary) if it varies by more than 25% of the average. Manually cropping the graphs doesn't really help the readability of the graph if it's already been adjusted for the spikes.
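
                Roughly, something like this (placeholder file name; I'm reading "varies by more than 25%" as the edge chunk's average deviating that much from the overall average):

                Code:
import numpy as np

frame_times = np.loadtxt("frametimes.txt")  # placeholder: one frame time (ms) per line
mean = frame_times.mean()
edge = max(1, int(len(frame_times) * 0.005))  # first/last 0.5% of samples

def deviates(segment):
    # True if the segment's average is more than 25% away from the overall average.
    return abs(segment.mean() - mean) > 0.25 * mean

start = edge if deviates(frame_times[:edge]) else 0
stop = len(frame_times) - edge if deviates(frame_times[-edge:]) else len(frame_times)
trimmed = frame_times[start:stop]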



                • #9
                  Originally posted by schmidtbag View Post
                  Manually cropping the graphs doesn't really help the readability of the graph if it's already been adjusted for the spikes.
                  Yeah, manually cropping graphs is not the solution. (Just to be clear, that's not what I was proposing.)

                  As you say, removing the end points is not a bad idea, as those are more edge cases. Losing 1% of the dataset is more than acceptable IMO.



                  • #10
                    Originally posted by schmidtbag View Post
                    I'd say a good fix would be to exclude the first and last 0.5% of data (maybe more if necessary) if it varies by more than 25% of the average. Manually cropping the graphs doesn't really help the readability of the graph if it's already been adjusted for the spikes.
                    Usually you cut out data points that don't fall within 3 sigma (three standard deviations) of the mean.
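
                    In numpy that's just a couple of lines (placeholder file name):

                    Code:
import numpy as np

frame_times = np.loadtxt("frametimes.txt")  # placeholder: one frame time (ms) per line
mean, sigma = frame_times.mean(), frame_times.std()

# Keep only samples within three standard deviations of the mean.
within_3_sigma = frame_times[np.abs(frame_times - mean) <= 3 * sigma]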

