Are Open-Source GPU Drivers Sufficient For 4K Linux Gaming?


  • Are Open-Source GPU Drivers Sufficient For 4K Linux Gaming?

    Phoronix: Are Open-Source GPU Drivers Sufficient For 4K Linux Gaming?

    Last week I published the results of a 15-way AMD/NVIDIA GPU comparison for 4K Linux gaming that was centered on the proprietary AMD/NVIDIA graphics drivers. However, for Linux gamers who stick to the open-source Mesa/Gallium3D drivers, here are some benchmark results comparing open- and closed-source driver performance at 3840 x 2160.


  • #2
    I saw no mention of the Intel graphics you've been raving about a bunch lately. Is that hardware not capable of running a 4K screen?



    • #3
      Nvm I'm blind!



      • #4
        The gamer/computer magazines consider playing AAA Windows games at 4K so demanding that they recommend multiple graphics cards, except for the newest Titan-based Nvidia cards and, I think, a couple of top-level AMD cards. They also advise expecting to spend more on the GPUs driving the monitor than on the monitor itself. The cost alone of those GPUs would still force most of these users to the closed drivers, as a 20% performance penalty on a $1,000 GPU is equivalent to losing $200. On a $90 GPU in a 1080p open-driver-only gaming system the penalty is only $18, and on a desktop without games using AMD Fusion it is essentially zero dollars. There is also the fact that the open drivers don't support multiple cards rendering to one screen in Xorg.

        Still a hell of an interesting benchmark, as this puts the GPU and the driver to the most demanding possible test. The REALLY interesting thing is that an open-source game (Tesseract) was playable with an AMD card and the open driver, while every last one of the Steam/paid games was just too fat.

        We are a long, long way from 4K being practical from a power-consumption/whole-room-heat standpoint, a GPU-cost standpoint, or even a video-playback standpoint. In fact, just plain video playback would be a full load for any of my systems, winding the CPU fans up wide open, because VDPAU hardware support on my r600 cards only goes to 1080p. That's with H.264, never mind the doubled CPU load of H.265.
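
        A quick back-of-the-envelope sketch of that cost argument, using the figures from the post (the 20% penalty and the $1,000 / $90 card prices come from the post itself; the snippet is only illustrative):

        def penalty_cost(card_price, perf_penalty=0.20):
            # Treat the driver's performance penalty as a proportional
            # loss of the money spent on the card.
            return card_price * perf_penalty

        for price in (1000, 90):
            print(f"${price} card: ~${penalty_cost(price):.0f} lost to a 20% penalty")
        # $1000 card: ~$200 lost to a 20% penalty
        # $90 card: ~$18 lost to a 20% penalty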



        • #5
          I know you were directly comparing performance to the cost of the card, but honestly, at the price of those cards a 20% performance penalty would be more like losing $500 of the card's value, since you could get 20% less GPU for about $500 less. Of course that's a rough estimate off the top of my head; if someone took the time to count shaders/pipelines/etc. and compare them between cards, one with 20% fewer of those would probably cost somewhat more or less than I've estimated.
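
          For contrast, a minimal sketch of that marginal-pricing view, using purely hypothetical price points chosen only to match the rough estimate above (a $1,000 flagship at full performance versus a $500 card at roughly 80% of it):

          # Hypothetical price/performance points, for illustration only:
          # high-end pricing is nonlinear, so the last 20% of performance
          # can cost far more than 20% of the card's price.
          top_card = {"price": 1000, "relative_perf": 1.00}   # assumed flagship
          step_down = {"price": 500, "relative_perf": 0.80}   # assumed "20% less GPU"

          marginal_loss = top_card["price"] - step_down["price"]   # what the last 20% cost you
          proportional_loss = top_card["price"] * 0.20             # the simple 20%-of-price view

          print(f"Marginal view:     a 20% penalty wastes ~${marginal_loss}")
          print(f"Proportional view: a 20% penalty wastes ~${proportional_loss:.0f}")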



          • #6
            Originally posted by Luke View Post
            The gamer/computer magazines consider playing AAA Windows games at 4K so demanding that they recommend multiple graphics cards, except for the newest Titan-based Nvidia cards and, I think, a couple of top-level AMD cards. They also advise expecting to spend more on the GPUs driving the monitor than on the monitor itself.
            To be honest, nobody in their right mind would build a 4K, AAA-game-ready machine to play on Linux. Not with the current state of affairs.



            • #7
              Nice test. NV cards have OpenGL 4.1 but are broken with the OSS driver, though the closed-source one seems to be better.

              What I would be interested in is X-Plane 10 benchmarks, since it's the only thing I would upgrade my hardware for. If Linux can handle it, there is no point in using Windows.



              • #8
                I do not agree with the statement in the article that 35 fps is not playable and 60 fps is "roughly" playable. In my experience 35 fps is just fine and I can play any game without problems. And when you get 60 fps, then it's smooth as hell.



                • #9
                  Originally posted by joh??n View Post
                  I do not agree with the statement in the article that 35 fps is not playable and 60 fps is "roughly" playable. In my experience 35 fps is just fine and I can play any game without problems. And when you get 60 fps, then it's smooth as hell.
                  It depends on the resolution being used and also on your physical distance from the screen. The misconception that 30 fps is smooth comes from back in the day when we used 640x480. The lower the resolution, the smoother a low frame rate feels, and the farther you are from the screen, the smoother it looks. On the other hand, the more detail there is and the closer you are, the more of it you can see and the more distracting a low frame rate becomes.



                  • #10
                    Originally posted by joh??n View Post
                    I do not agree with the statement in the article that 35 fps is not playable and 60 fps is "roughly" playable. In my experience 35 fps is just fine and I can play any game without problems. And when you get 60 fps, then it's smooth as hell.
                    It all depends on what game you're playing, along with personal preference.

                    If I got 35 fps in a game like Xonotic or StepMania, for example, that's unplayable to me. 35 fps in something like Guild Wars 2 or Skyrim, though, is fine for me. Generally speaking, for anything competitive and/or requiring fast reaction times, I need the FPS above my monitor's refresh rate (60).
