
Thread: NVIDIA's Working On A New Driver Architecture?

  1. #11
    Join Date: Oct 2009 · Posts: 167

    Quote Originally Posted by myxal View Post
    Perhaps I missed it, but does the current nvidia driver provide any facility for run-time resolution/orientation/multi-monitor adjustments? As a laptop user, lack of this feature alone makes the driver completely worthless to me.
    Should do... The nVidia settings applet does this for me.....

  2. #12
    Join Date: Oct 2006 · Location: Israel · Posts: 555

    Quote Originally Posted by ethana2 View Post
    I thought KMS was more of a religious or political issue than a technical one. I understand KMS to be something that nVidia simply cannot be reasonably expected to provide in the near future, given that probably 90% of all the IP used in their hardware and software probably isn't even owned by them, but people more like the team behind abominations like Poulsbo.
    You are dead wrong - KMS is anything but political.
    KMS is required in order to ship rootless X.org (read: X.org with no suid). Without KMS, X.org requires root access in order to load and configure the driver(s).
    As it stands, X.org is a fairly big source of privilege escalation exploits.

    As for the "nVidia simply cannot" part, neither you nor I have any means to make this assertion, as neither of us has access to their code. Nevertheless, nVidia has no technical reason not to support KMS (beyond resource issues), as nouveau already supports it.
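The rootless-X point can be checked from userspace: a KMS driver registers its connectors under sysfs, which any unprivileged process can read. A minimal sketch, assuming the modern Linux `/sys/class/drm` layout (the proprietary driver of that era did not register connectors there, which is exactly the gap being discussed):

```shell
#!/bin/sh
# Sketch: detect whether a KMS-capable DRM driver has registered any
# connectors, without root. The /sys/class/drm/card*-* path layout is an
# assumption about modern Linux kernels.
kms_status="no KMS connectors found"
for c in /sys/class/drm/card*-*; do
    # if the glob matched at least one real entry, KMS is in use
    [ -e "$c" ] && kms_status="KMS connectors present" && break
done
echo "$kms_status"
```

On a machine running nouveau or an Intel driver this reports connectors; with only the binary blob loaded it would not.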

    - Gilboa
    DEV: Intel S2600C0, 2xE52658V2, 32GB, 4x2TB, GTX680, F20/x86_64, Dell U2711.
    SRV: Intel S5520SC, 2xX5680, 36GB, 4x2TB, GTX550, F20/x86_64, Dell U2412..
    BACK: Tyan Tempest i5400XT, 2xE5335, 8GB, 3x1.5TB, 9800GTX, F20/x86-64.
    LAP: ASUS N56VJ, i7-3630QM, 16GB, 1TB, 635M, F20/x86_64.

  3. #13
    Join Date: Feb 2008 · Posts: 111

    Quote Originally Posted by kayosiii View Post
    Should do... The nVidia settings applet does this for me.....
    Does the app make adjustments on-the-fly though (which is what I mean by run-time)? Last time I saw it (6 months ago), it was basically a beefed-up xorg.conf editor frontend, and any adjustments beyond resolution changes required restarting X.

  4. #14
    Join Date: Oct 2009 · Posts: 167

    Quote Originally Posted by myxal View Post
    Does the app make adjustments on-the-fly though (which is what I mean by run-time)? Last time I saw it (6 months ago), it was basically a beefed-up xorg.conf editor frontend, and any adjustments beyond resolution changes required restarting X.
    Seems to work that way for me. I use it when I hook my laptop up to an external screen, and it can do this without requiring an X restart. It can also be used to modify your X config settings on startup, but other than that everything works live.

  5. #15
    Join Date: Jul 2007 · Posts: 176

    @myxal and kayosiii: Yes, the nvidia-settings GUI application shipped with the driver does on-the-fly stuff like multi-monitor setups.

    As a side note, if you need a command-line utility that is more convenient and functional (for people like me), check out this awesome tool called "disper": http://willem.engen.nl/projects/disper/ .
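For on-the-fly changes without any vendor applet, the generic RandR extension does the same job from the command line; disper is essentially a friendlier frontend over the same idea. A minimal sketch (the output names `LVDS-0` and `VGA-0` are assumptions; query your own with `xrandr -q` first):

```shell
#!/bin/sh
# Sketch: extend the desktop to an external monitor at runtime via xrandr,
# without restarting X. Guarded so it degrades gracefully when no X display
# or xrandr binary is available.
if [ -n "$DISPLAY" ] && command -v xrandr >/dev/null 2>&1; then
    # place the external VGA output to the right of the laptop panel
    if xrandr --output VGA-0 --auto --right-of LVDS-0 2>/dev/null; then
        randr_result="applied"
    else
        randr_result="skipped: xrandr could not configure the assumed outputs"
    fi
else
    randr_result="skipped: no X display or xrandr not installed"
fi
echo "$randr_result"
```

Reverting is symmetric (`xrandr --output VGA-0 --off`), which is what makes RandR-based tools usable for dock/undock scripts.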

  6. #16
    Join Date: Feb 2008 · Posts: 111

    @hdas: Intriguing. Sounds like I should have given the FX5200 another whirl before ditching it. Ah well...

  7. #17
    Join Date: Dec 2010 · Posts: 20

    Quote Originally Posted by elanthis View Post
    Sounds a lot more like they're just slowly refactoring their existing driver.
    Precisely.

    A whole big article just because an NVIDIA engineer used the word "architecture" in a sentence. Not even "new architecture". Two pages of pure speculation. And then you wonder why officials don't want to comment on various things, when they need to watch every single word they say.

  8. #18
    Join Date: Dec 2010 · Posts: 20

    Sorry for the double post, but apparently I can't edit a post after 1 minute.

    Just wanted to add that the article title suggests some revolutionary changes might be ongoing, while after reading the quote it just looks like they are refactoring, and not necessarily the whole driver; it might be just the part that was mentioned.

  9. #19
    Join Date: Nov 2009 · Location: Europe · Posts: 270

    In some games, e.g. Regnum Online, there was a huge performance regression between the 19x and the 25x/26x NVIDIA drivers. Regnum Online uses shaders, but even with shaders disabled the performance regression exists (there is a very short thread in the NVIDIA forum about this issue; it was closed very early).

    I wonder why, and also whether other, more advanced Linux games are affected by this performance regression.

  10. #20
    Join Date: Oct 2007 · Posts: 912

    Quote Originally Posted by Fenrin View Post
    In some games, e.g. Regnum Online, there was a huge performance regression between the 19x and the 25x/26x NVIDIA drivers. Regnum Online uses shaders, but even with shaders disabled the performance regression exists (there is a very short thread in the NVIDIA forum about this issue; it was closed very early).

    I wonder why, and also whether other, more advanced Linux games are affected by this performance regression.
    If nvidia are internally rewriting the driver on a new architecture, the focus is on getting said new architecture working first and increasing performance afterwards, so I wouldn't really call it a performance regression - it's not a bug. The benefit, of course, is that the driver should be easier to maintain in the future, and can likely handle and provide new features more easily.
