
The NVIDIA vs. Open-Source Nouveau Linux Driver Benchmarks For Summer 2018


  • #1

    Phoronix: The NVIDIA vs. Open-Source Nouveau Linux Driver Benchmarks For Summer 2018

    It has been some months since last delivering any benchmarks of Nouveau, the open-source, community-driven driver for NVIDIA GPUs. The reason for the absence of recent Nouveau benchmarks has largely been a lack of major progress, at least on the GeForce desktop GPU side, while NVIDIA has continued to contribute on the Tegra side. For those wondering about the current performance of this driver, which started out more than a decade ago via reverse engineering, here are some benchmarks of the latest open-source Nouveau and proprietary NVIDIA Linux graphics drivers on Ubuntu.




  • #3
    What about some reverse engineering of the proprietary driver: just capture the instructions it issues for re-clocking and replicate them. Is that impossible?


  • #4
    Originally posted by ruthan View Post
    What about some reverse engineering of the proprietary driver: just capture the instructions it issues for re-clocking and replicate them. Is that impossible?
    Ilia Mirkin posted a good summary of the situation and constraints here on Phoronix.
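    For context on "capturing the instructions": the Linux kernel's mmiotrace facility is how Nouveau developers have historically logged the MMIO register writes the proprietary driver performs. A minimal sketch, assuming a kernel built with CONFIG_MMIOTRACE, a mounted debugfs, and root access; note that capturing register writes does not bypass the signed-firmware restriction on newer GPUs discussed later in the thread:

    ```shell
    # Sketch: trace the proprietary driver's MMIO register accesses with mmiotrace.
    # Assumes CONFIG_MMIOTRACE is enabled and debugfs is mounted; run as root.
    modprobe -r nvidia                                          # the target driver must not be loaded yet
    echo mmiotrace > /sys/kernel/debug/tracing/current_tracer   # arm the tracer
    cat /sys/kernel/debug/tracing/trace_pipe > mmio.log &       # stream trace events to a log file
    modprobe nvidia                                             # load the driver; its register I/O is now logged
    # ...trigger the behavior of interest (e.g. a re-clock under 3D load), then stop tracing:
    echo nop > /sys/kernel/debug/tracing/current_tracer
    ```

    The resulting log lists each register read and write with its address and value, which is how much of Nouveau's register-level knowledge was originally recovered.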


  • #5
    Great article, thanks Michael!


  • #6
    Originally posted by ruthan View Post
    What about some reverse engineering of the proprietary driver: just capture the instructions it issues for re-clocking and replicate them. Is that impossible?
    AFAIK you would either infringe copyright, or unsigned replacement firmware would not be granted access to re-clocking; I don't think there's a way around it. One could physically modify the card to replace the logic that verifies signatures? Even if you magically got that working and Nvidia found out about it, they would block that method in future cards, as they don't want people to own hardware, they just want people to use their hardware.

    TLDR; https://cdn.arstechnica.net/wp-conte...ia-640x424.jpg

    Edit: rhysk has a better answer, see Post #4
    Last edited by Jabberwocky; 14 June 2018, 01:22 PM.


  • #7
    Year of the Linux desktop: 2023

    Back to the topic: this is really impressive. It sounds more and more like Nouveau will be the new Mesa for Nvidia in a couple of years.
    Last edited by Jahimself; 14 June 2018, 03:19 PM.


              • #8
                "what a ROUT!". Nouveau doesn't even come close. That's any sane person's initial reaction.

                But when we think about this, and consider the hard working and talented people behind Nouveau, the obvious conclusion is that this is indeed a rout, for Nvidia.

                How can a company be so tragically closed, and closed minded, so dismissive of its users, that it would cripple an open source project to this extent, through this unjustifiable holding back of information.

                Nvidia might like to take a page out of Satya Nadella's Microsoft's book. Not perfect, but a zillion miles away from Nvidia at this point.

                Comment


  • #9
    Originally posted by Jabberwocky View Post

    AFAIK you would either infringe copyright, or unsigned replacement firmware would not be granted access to re-clocking; I don't think there's a way around it. One could physically modify the card to replace the logic that verifies signatures? Even if you magically got that working and Nvidia found out about it, they would block that method in future cards, as they don't want people to own hardware, they just want people to use their hardware.

    TLDR; https://cdn.arstechnica.net/wp-conte...ia-640x424.jpg

    Edit: rhysk has a better answer, see Post #4
    Oh, you can own the hardware; there's nothing they can really do to prevent you from owning it and doing whatever you want to it, since the first-sale doctrine applies to physical goods. But thanks to our legal system with regard to copyright and the UCC, you don't own the software that enables it to run. And to be fair, AFAIK there's no fully open-source AtomBIOS or Intel firmware implementation from those respective vendors either. The difference is that AMD is somewhat more open-source friendly, allowing AtomBIOS to be used more permissively, while Intel is inconsistent at best. It's not black and white; it's shades of gray.

    I personally have a GeForce card in my desktop, because at the time AMD's fglrx driver was... pretty terrible, and I was tired of dealing with it. Nvidia's proprietary driver at least works from a gamer's point of view (for me). So when my AMD 7850 died, I had to make a choice: the AMDGPU drivers weren't a thing then, and I had no confidence they'd be any better than what had come before. So I went with Nvidia, along with a lot of people, including GPGPU users. I haven't regretted that choice. Would I do the same thing now, with AMDGPU apparently proceeding at a good pace? Maybe. I honestly don't know. The reason is that game devs often only write for a single hardware platform, and more often than not it's Nvidia they're being paid to write for. It becomes the path of least trouble for gamers, and the Steam hardware stats tend to confirm an Nvidia bias.


  • #10
    Originally posted by stormcrow View Post
    game devs often only write for a single hardware platform, and more often than not it's Nvidia they're being paid to write for.
    Except that AMD is in the PS4, the Xbox One, and now the upcoming (Linux-based) Atari console. The only non-PC platform Nvidia has is the Nintendo Switch, which basically uses a tablet SoC; anything written for it will fly on a PC.

    So, for cross-platform titles and those using cross-platform engines (Unity and Unreal), I see an advantage for AMD. Their biggest problem is the relative inefficiency of their hardware. If a future generation were the "Zen" of GPUs, they could truly dominate. Of course, Nvidia will probably have stopped caring about gamers by then.
    Last edited by coder; 14 June 2018, 11:59 PM.
