
Thread: NVIDIA Says It Has No Plans To Support Wayland

  1. #91
    Join Date: Jun 2009 · Posts: 2,927

    Quote Originally Posted by makomk
    I'm not convinced. There appear to be some pretty major technical and legal barriers to NVidia supporting Wayland.
    I don't think that there are any technical barriers for nvidia drivers under Wayland. Nvidia just doesn't see a point in doing it now, and that's quite understandable.

    In particular, Wayland requires KMS support from drivers, which NVidia can't implement due to it requiring GPL-only symbols in the kernel.
    The Nvidia blob does modesetting in the kernel. They don't need to tie into the kernel's modesetting code, since they already do all the modesetting in kernel space inside their proprietary blob.

    It also assumes the use of DRI2 and GEM, meaning a major redesign of NVidia's driver would be needed and they wouldn't be able to share nearly as much code with the Windows driver as they currently do.
    You don't really need DRI2 and GEM, just the infrastructure for redirected direct rendering, which Nvidia has had for a long time.

    KMS, DRI2, GEM, etc. are open-source technologies which pretty much do the same thing that the binary blob does internally. Nvidia just can't be bothered at this time.
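
    To make the GPL-only symbol point from the quote concrete, here is a minimal sketch (not from any real driver, all names made up) of how the kernel gates an API: a symbol exported with EXPORT_SYMBOL_GPL() can only be resolved for modules that declare a GPL-compatible MODULE_LICENSE(), which is exactly what a proprietary blob cannot do.

        /* Hypothetical kernel module: example_modeset() stands in for a
         * KMS entry point. Built against kernel headers with kbuild. */
        #include <linux/module.h>
        #include <linux/init.h>

        static int example_modeset(void)
        {
                return 0; /* pretend we just programmed a display mode */
        }

        /* Gated export: only GPL-compatible modules may link to this.
         * A plain EXPORT_SYMBOL() would carry no such restriction. */
        EXPORT_SYMBOL_GPL(example_modeset);

        static int __init example_init(void) { return 0; }
        static void __exit example_exit(void) { }
        module_init(example_init);
        module_exit(example_exit);

        /* A blob declaring MODULE_LICENSE("Proprietary") taints the
         * kernel, and the loader refuses to resolve any
         * EXPORT_SYMBOL_GPL symbol for it -- hence the blob has to
         * reimplement modesetting itself instead of using KMS. */
        MODULE_LICENSE("GPL");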

  2. #92
    Join Date: May 2007 · Location: Third Rock from the Sun · Posts: 6,583

    Quote Originally Posted by pingufunkybeat
    Your post made it sound like the binary-only driver from nvidia was somehow in line with Linus' plans for Linux, while nothing could be further from the truth.
    I did nothing of the sort. I merely pointed out the fact that Linus started Linux to fully utilize his hardware. The same reasoning goes for people who use a blob: they don't want to castrate their hardware for the sake of a license.

  3. #93
    Join Date: Jun 2009 · Posts: 2,927

    Linus also started Linux in order to learn about kernel programming.

    The exact opposite reasoning from relying on black-box blobs for a few FPS and HD video.

    Don't be so selective with your arguments.

  4. #94
    Join Date: May 2007 · Location: Third Rock from the Sun · Posts: 6,583

    Quote Originally Posted by pingufunkybeat
    Linus also started Linux in order to learn about kernel programming.
    So you're telling me that you can't learn kernel programming with closed source?

  5. #95
    Join Date: Aug 2008 · Posts: 84

    Quote Originally Posted by pingufunkybeat
    Nvidia blob does modesetting in the kernel. They don't need to tie into the kernel modesetting code, since they do all the modesetting in kernel space inside their proprietary blob already.
    They do need to, because Wayland is currently hardcoded to use the kernel modesetting code and a fairly significant chunk of DRI2 directly. What's more, this part of Wayland is itself under the GPL, so it's not like they can fork a modified version that uses their own proprietary modesetting library either.

    Quote Originally Posted by pingufunkybeat
    You don't really need DRI2 and GEM, but the infrastructure needed for redirected direct rendering, which Nvidia has had for a long time.
    That's the minimum requirement for something like Wayland to be used. Wayland itself does actually require DRI2. If they wanted to, AMD might just be able to implement enough of DRI2 to allow Wayland to run on top of X.org on fglrx, but this option really isn't practical for NVidia because they don't actually use DRI. Even then, running Wayland without using X.org as a backend is another matter entirely.
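
    For reference, this is roughly the kernel interface Wayland is hard-coded against. A minimal sketch using libdrm (assuming a KMS-capable driver exposes /dev/dri/card0, which the nvidia blob does not) that enumerates connectors the way a KMS compositor does at startup:

        /* Build with: gcc kms_probe.c -o kms_probe $(pkg-config --cflags --libs libdrm) */
        #include <stdio.h>
        #include <fcntl.h>
        #include <unistd.h>
        #include <xf86drm.h>
        #include <xf86drmMode.h>

        int main(void)
        {
            int fd = open("/dev/dri/card0", O_RDWR);
            if (fd < 0) {
                perror("open /dev/dri/card0");   /* no DRM device node at all */
                return 1;
            }

            drmModeRes *res = drmModeGetResources(fd);
            if (!res) {
                fprintf(stderr, "driver has no KMS support\n");
                close(fd);
                return 1;
            }

            /* Walk the display connectors, as a compositor would before
             * picking a mode and calling drmModeSetCrtc(). */
            for (int i = 0; i < res->count_connectors; i++) {
                drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
                if (!conn)
                    continue;
                printf("connector %u: %s\n", conn->connector_id,
                       conn->connection == DRM_MODE_CONNECTED ? "connected"
                                                              : "disconnected");
                drmModeFreeConnector(conn);
            }

            drmModeFreeResources(res);
            close(fd);
            return 0;
        }

    Both failure paths are what you hit on the blob today: either there is no usable DRM node, or the driver answers the KMS queries with nothing.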

  6. #96
    Join Date: Jun 2009 · Posts: 2,927

    OK, in that sense it is a technical barrier (Wayland is currently hard-coded to use DRI2), but there is no reason why this could not be rewritten to use nvidia's equivalents providing the same functionality.

    The main question is whether there is a wish to do this, both from the Wayland developers (currently firmly in the open-driver camp and funded by Intel) and from the nvidia developers (who have no interest in Wayland at the moment).

    If Wayland becomes relevant, nvidia will release something for it, no doubt.
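
    Just to illustrate what such a rewrite could look like (purely hypothetical, none of these names exist in Wayland): the compositor core would call through a small backend interface, and either the existing KMS/DRI2 code or a vendor implementation built on the blob's own modesetting could sit behind it.

        /* Hypothetical backend abstraction; all names made up. */
        #include <stdio.h>

        struct output_backend {
            const char *name;
            int  (*open_device)(void);
            int  (*set_mode)(int width, int height);
            void (*page_flip)(void);
        };

        /* The KMS-based implementation Wayland effectively hard-codes today. */
        static int  kms_open(void)         { puts("open /dev/dri/card0"); return 0; }
        static int  kms_mode(int w, int h) { printf("KMS modeset %dx%d\n", w, h); return 0; }
        static void kms_flip(void)         { puts("drmModePageFlip"); }

        static struct output_backend kms_backend = {
            "kms", kms_open, kms_mode, kms_flip,
        };

        /* An nvidia backend would fill in the same slots on top of the
         * blob's in-kernel modesetting, leaving the compositor core alone. */

        int main(void)
        {
            struct output_backend *b = &kms_backend; /* chosen at runtime */
            b->open_device();
            b->set_mode(1920, 1200);
            b->page_flip();
            return 0;
        }

    Whether anyone writes that second backend is, as said, a question of wish rather than possibility.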

  7. #97
    Join Date: Jun 2009 · Location: Elsewhere · Posts: 89

    Quote Originally Posted by DeiF
    NVidia has said many times that they don't want to spend time making drivers/software for rapidly changing software/APIs. Prove to them that Wayland is not an experiment and is actually the present of Linux graphics, and they will support it.
    It's impossible to know now if Wayland will become the standard some day.
    Quite pragmatic and logical.

  8. #98
    Join Date: Oct 2008 · Posts: 3,092

    Quote Originally Posted by deanjo
    Interesting. Most reviews I found are more like this one: http://www.anandtech.com/show/2745/23 which concludes
    If you've got a 30" display then either card will work, it's just up to your preference and the items we talked about earlier. If you've got a 24" or smaller display (1920x1200 or below), then the Radeon HD 4890 is the card for you.
    Anyway, that's comparing a single card rather than the generation as a whole, which is what I meant originally.

  9. #99
    Join Date: May 2007 · Location: Third Rock from the Sun · Posts: 6,583

    Quote Originally Posted by smitty3268
    Anyway, that's comparing a single card rather than the generation as a whole, which is what i meant originally.
    That single card beats everything in that generation, and it wasn't even nvidia's highest offering, yet it beat ati's highest offering (even the 4890). Just google any GTX 275/280/285 benchmarks and you will see the trend.

  10. #100
    Join Date: Aug 2009 · Posts: 2,264

    Quote Originally Posted by deanjo
    That single card beats everything in that generation and wasn't even nvidia's highest offering but yet it beat ati's highest offering (even the 4890). Just google any GTX 275/280/285 benchmarks and you will see the trend.
    Not true. The highest offering was a dual-GPU card with double the amount of RAM, namely the 4890x2, which is a fair comparison because it's a single card. And it kicked the shit out of every nVidia card when it was released.
