
Thread: NVIDIA 2010 Driver Year In Review

  1. #1
    Join Date
    Jan 2007
    Posts
    14,810

    Default NVIDIA 2010 Driver Year In Review

    Phoronix: NVIDIA 2010 Driver Year In Review

    At the end of each year for the past five years we have delivered "year in review" articles looking at the performance of NVIDIA's (and ATI/AMD's) proprietary Linux drivers, both in terms of the new features introduced in their driver updates during the year and in benchmarks of the driver releases to see how performance has evolved over twelve months. With 2010 coming to an end, it is time for this year's driver reviews, and we are starting by seeing how NVIDIA's performance has matured in 2010.

    http://www.phoronix.com/vr.php?view=15558

  2. #2
    Join Date
    Dec 2009
    Location
    Italy
    Posts
    176

    Default

    Quote Originally Posted by phoronix View Post
    Phoronix: NVIDIA 2010 Driver Year In Review

    http://www.phoronix.com/vr.php?view=15558
    I think the strange VDPAU spike in 260.19.21 was caused by a bug that affected the 260 series since the beginning and made it allocate a massive amount of RAM on VDPAU init. In fact, VDPAU was unusable on relatively low-memory systems (1 GB) until 260.19.26. Details are on nvnews.

  3. #3
    Join Date
    Dec 2009
    Posts
    492

    Default

    The test is a bit flawed. If there were any performance improvements in the NVIDIA drivers, I don't expect them to apply to the 9 series. Hell, they're probably not optimizing for the 200 series anymore.
    In short, you should have used a GTX 400-something instead of the 9800GTX.

  4. #4

    Default

    Quote Originally Posted by bug77 View Post
    The test is a bit flawed. If there were any performance improvements in nvidia drivers, I don't expect them to apply to the 9 series. Hell, they're probably not optimizing for the 200 series anymore.
    In short, you should have used a GTX400-something instead of the 9800GTX.
    Fermi doesn't work with all of the driver releases tested...

  5. #5
    Join Date
    Dec 2009
    Posts
    492

    Default

    Quote Originally Posted by Michael View Post
    Fermi doesn't work with all of the driver releases tested...
    Yes, I forgot about that. How could it? It wasn't even released a year ago.

    Time permitting, maybe stick to the drivers that do support Fermi and retest in a future article?

  6. #6
    Join Date
    Dec 2010
    Posts
    20

    Default

    "In this game, which is more demanding than Warsow/OpenArena, there was also not the performance regression found between the 195.36.15 and 196.36.24 driver updates."

    But the regression in the earlier tests was found between 195.30 and 196.36.15, and 195.30 wasn't tested for ETQW and the later games.

  7. #7
    Join Date
    Jun 2010
    Posts
    219

    Default

    Quote Originally Posted by bug77 View Post
    Time permitting, maybe stick to the drivers that do support Fermi and retest in a future article?
    Testing on the 9800GTX was actually about as valid as you're going to get. I would fit NVIDIA architecture into the following "generations":

    1: GeForce 200,300 Series

    2: GeForce 400 Series

    3: GeForce 5000 Series (There were some PCI Express variants of the 5k series)

    4: GeForce 6000 Series (SLI was introduced, PCI Express became much more mainstream)

    5: GeForce 7000 Series (After this series AGP was more or less phased out)

    6: GeForce 8000, 9000, 200 Series

    7: Fermi

    All the drivers used support "generations" 4-6, and the later drivers also support Gen7 (Fermi). I think any Gen6 card would be a valid test, while anything Gen5 or lower probably isn't being optimized anymore.

  8. #8
    Join Date
    Jan 2010
    Location
    Portugal
    Posts
    945

    Default

    Quote Originally Posted by kazetsukai View Post
    1: GeForce 200,300 Series

    2: GeForce 400 Series

    3: GeForce 5000 Series (There were some PCI Express variants of the 5k series)
    Shouldn't that be:

    1: Geforce 2 series
    2: Geforce 3,4 series (IIRC geforce3 is more close to 4 than to geforce2)
    3: Geforce FX 5000 series

    ?

  9. #9
    Join Date
    Jun 2010
    Posts
    219

    Default

    Quote Originally Posted by devius View Post
    Shouldn't that be:

    1: Geforce 2 series
    2: Geforce 3,4 series (IIRC geforce3 is more close to 4 than to geforce2)
    3: Geforce FX 5000 series

    ?
    You're quite correct, oops. It's been a while.

  10. #10
    Join Date
    Dec 2009
    Posts
    492

    Default

    Quote Originally Posted by kazetsukai View Post
    All the drivers used permit "generations" 4-6, and the later drivers permit Gen7 (Fermi). I think any of Gen6 would be valid tests, while anything Gen5 or lower probably isn't being optimized anymore.
    I've been using nothing but NVIDIA since the 6000 series, and I'm fairly certain you don't get performance improvements six months after the release of a new generation. Maybe for SLI, but for single-card setups I've never noticed anything. Which is fine; it just means we get proper support ASAP, not two years after we buy the card.
