
Thread: Two Hacks For The NVIDIA Linux Graphics Driver

  1. #1
    Join Date
    Jan 2007
    Posts
    13,422

    Default Two Hacks For The NVIDIA Linux Graphics Driver

    Phoronix: Two Hacks For The NVIDIA Linux Graphics Driver

    A Phoronix reader has shared two NVIDIA binary Linux graphics driver "hacks" he's written for overriding some functionality of the NVIDIA binary blob for GeForce hardware...

    http://www.phoronix.com/vr.php?view=MTQxNjM

  2. #2
    Join Date
    May 2012
    Posts
    674

    Default

    What about the problem I've heard about, where NVIDIA cripples double-precision (FP64) performance relative to single-precision floats on their consumer hardware? Can that only be fixed by moving to AMD?

  3. #3
    Join Date
    Dec 2007
    Posts
    2,276

    Default

    an artificial limit on the pixel clock
    I doubt it's an artificial limit. More likely it was the highest value that was validated for the hardware. Going beyond that limit may produce undefined results or non-compliant behavior.

  4. #4
    Join Date
    Sep 2007
    Location
    Connecticut, USA
    Posts
    941

    Default

    Quote Originally Posted by agd5f View Post
    I doubt it's an artificial limit. More likely it was the highest value that was validated for the hardware. Going beyond that limit may produce undefined results or non-compliant behavior.
    Overclocking can cause abnormal behavior as well, but these hacks let gamers and enthusiasts push their hardware further at their own risk.

  5. #5
    Join Date
    Dec 2007
    Posts
    2,276

    Default

    Quote Originally Posted by DeepDayze View Post
    Overclocking can cause abnormal behavior as well, but these hacks let gamers and enthusiasts push their hardware further at their own risk.
    Right, but in both cases, the limits are not artificial. That is the distinction I'm trying to make. Artificial implies that the limit is arbitrary and less than some "real" limit. By overclocking you are going beyond the validated limits of the hardware into undefined territory.

  6. #6
    Join Date
    Jul 2013
    Posts
    8

    Default

    Quote Originally Posted by agd5f View Post
    Right, but in both cases, the limits are not artificial. That is the distinction I'm trying to make. Artificial implies that the limit is arbitrary and less than some "real" limit. By overclocking you are going beyond the validated limits of the hardware into undefined territory.
    I think the limit used to be real... When I tried this on my old 9800 GTX+ rig, dmesg spewed out some sort of hardware error if I tried to request a clock even 1 kHz above the 400 MHz limit.

    Now on Kepler (and maybe Fermi as well?) the hardware allows setting pixel clocks greater than 400 MHz - I have my monitors at 462.84 without a single problem - but the kernel driver will refuse to generate timings if you ask for a clock greater than that... probably a relic from the Tesla days.

    What's silly is the fact that they even bothered to put a pixel clock limit in the kernel driver. The X driver already checks to make sure your pixel clock doesn't exceed 330 MHz, but allows you to disable it with the "NoMaxPClkCheck" ModeValidation option. Anyone trying to set more than 400 MHz will have to include that option in their xorg.conf anyway, so they'll obviously be aware that they're going past the DVI spec already.
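    For anyone wanting to try this, a minimal sketch of the relevant xorg.conf fragment (the Identifier is a placeholder; `NoMaxPClkCheck` is the ModeValidation token named above, and disabling the check is entirely at your own risk):

    ```
    Section "Device"
        Identifier "nvidia-gpu"      # placeholder name, use your own
        Driver     "nvidia"
        # Disable the X driver's maximum pixel clock check
        Option     "ModeValidation" "NoMaxPClkCheck"
    EndSection
    ```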

  7. #7
    Join Date
    Aug 2008
    Posts
    99

    Default

    Quote Originally Posted by CFSworks View Post
    Now on Kepler (and maybe Fermi as well?) the hardware allows setting pixel clocks greater than 400 MHz - I have my monitors at 462.84 without a single problem - but the kernel driver will refuse to generate timings if you ask for a clock greater than that... probably a relic from the Tesla days.
    What monitors are you using? Is there a list of monitors that will accept high-rate signals at high resolution? Ever since I switched from CRTs to LCDs, I've longed for refresh rates greater than 60Hz at maximum resolution. I really miss running at 120-150Hz on a CRT.
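    The limiting factor here is the pixel clock rather than the refresh rate itself: the required clock is just total pixels per frame (active plus blanking) times frames per second. A rough sketch, with hypothetical reduced-blanking-style totals rather than timings from any particular monitor:

    ```python
    def pixel_clock_mhz(htotal: int, vtotal: int, refresh_hz: float) -> float:
        """Pixel clock in MHz: total pixels per frame (including
        blanking intervals) times frames per second."""
        return htotal * vtotal * refresh_hz / 1e6

    # Assumed 1920x1080 totals of 2080x1111 at 120 Hz: roughly 277 MHz,
    # under the 330 MHz check mentioned above; higher resolutions at
    # 120 Hz quickly push past the 400 MHz kernel limit.
    print(pixel_clock_mhz(2080, 1111, 120))
    ```
    
    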

  8. #8
    Join Date
    Nov 2007
    Posts
    1,353

    Default

    Refresh rates on LCDs are a bit different: the duty cycle is much longer. If you're familiar with square waves, you'll know what I'm talking about. Each pixel stays lit for a larger portion of the cycle than it would on an old CRT, so the refresh rate on an LCD matters less, or rather it's perceptually equivalent to a higher rate.

  9. #9
    Join Date
    Aug 2008
    Posts
    99

    Default

    Quote Originally Posted by duby229 View Post
    Refresh rates on LCDs are a bit different: the duty cycle is much longer. If you're familiar with square waves, you'll know what I'm talking about. Each pixel stays lit for a larger portion of the cycle than it would on an old CRT, so the refresh rate on an LCD matters less, or rather it's perceptually equivalent to a higher rate.
    I do understand the differences between LCDs and CRTs. The scan rate from a video card is still called "refresh" rate, though, and I want moar framez.

  10. #10
    Join Date
    Jul 2013
    Posts
    8

    Default

    Quote Originally Posted by unix_epoch View Post
    What monitors are you using? Is there a list of monitors that will accept high-rate signals at high resolution? Ever since I switched from CRTs to LCDs, I've longed for refresh rates greater than 60Hz at maximum resolution. I really miss running at 120-150Hz on a CRT.
