Thread: UT2K4: ATi Vs nVidia IQ.

  1. #11

    Quote Originally Posted by Thetargos
    There seems to be a serious regression in the nVidia 180.x drivers. The changes in the .ini files indeed yield the expected results; however, performance suffers, and a LOT, which is not what you'd expect from a reasonably decent video card. Performance drops to 10 FPS in Onslaught games, granted with all settings on High/Highest, but with my previous card (8500GT) and different drivers performance was reasonable (Ons-Torlan yielded about 40 FPS on average); 10-20 FPS in this game with a 9800GT is certainly not what you'd expect (indeed, the Windows system I borrowed for the other tests is almost identical to my main rig).

    File a bug report with them. I know they are working on SLI improvements and profiles as we speak. Might as well make them aware of it while they are looking at the gaming side of the drivers.

  2. #12

    Will do... Nv News here I come.

  3. #13

    Quote Originally Posted by Thetargos
    Will do... Nv News here I come.
    No, file the bug via email to linux-bugs@nvidia.com. Posting in forums gets few results. That way you also get the name of the contact dealing with the bug.

  4. #14
    Details visible on 4850 + Catalyst 8.11

    I can see the surface details (vines, pond scum, or whatever), running a Radeon 4850 with the Catalyst 8.11 drivers and xorg-server 1.5.3.

  5. #15

    It would be interesting to see if OpenGL performance is down across the board with nVidia's latest drivers, or if it's just a UT2004 thing. I wonder if OpenGL 3.0 support would have anything to do with it as well (who knows what internal changes came along with that).

  6. #16

    Quote Originally Posted by mirv
    It would be interesting to see if OpenGL performance is down across the board with nVidia's latest drivers, or if it's just a UT2004 thing. I wonder if OpenGL 3.0 support would have anything to do with it as well (who knows what internal changes came along with that).
    If anything, OpenGL 3.0 support would only mean that the necessary routines, extensions, etc. have been added to the nVidia ICD (libGL[core].so), and maybe they even keep it in a separate library... I have not checked. I'm not sure whether GL 3.0 support would break GL 1.5/2.0 support (which, AFAIK, is what UT2K4 actually requires); I thought GL 3.0 was backwards compatible with at least 1.5/2.0. (And if anyone knows better than me: wasn't GL 2.0 supposed to only add support for SM3.0 [among other things, of course!], while GL 1.5 had up to SM2.1, so that 2.0 was only a small improvement over 1.5?)
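
    For what it's worth, a quick way to see exactly what the driver exposes to an old GL 1.5/2.0 application is to create a plain legacy context and print the version, renderer and extension strings it reports. Below is a minimal sketch (a hypothetical glcheck.c, not anything UT2K4 ships), assuming a standard GLX/Xlib setup; the tiny window is never mapped and only exists so the context can be made current:

    Code:
    /* glcheck.c: print what a legacy (non-3.0) GLX context reports.
     * Build (assuming GL and X11 development headers are installed):
     *   gcc glcheck.c -o glcheck -lGL -lX11
     */
    #include <stdio.h>
    #include <string.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(int argc, char **argv)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi) { fprintf(stderr, "no RGBA double-buffered visual\n"); return 1; }

        /* A tiny unmapped window, just so the context has a drawable. */
        XSetWindowAttributes swa;
        swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                       vi->visual, AllocNone);
        swa.border_pixel = 0;
        Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, 16, 16,
                                   0, vi->depth, InputOutput, vi->visual,
                                   CWColormap | CWBorderPixel, &swa);

        GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
        glXMakeCurrent(dpy, win, ctx);

        printf("GL_VERSION : %s\n", (const char *)glGetString(GL_VERSION));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

        /* Optional: pass an extension name to check for it,
         * e.g. ./glcheck GL_ARB_vertex_buffer_object */
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (argc > 1 && ext)
            printf("%s: %s\n", argv[1],
                   strstr(ext, argv[1]) ? "present" : "MISSING");

        glXMakeCurrent(dpy, None, NULL);
        glXDestroyContext(dpy, ctx);
        XDestroyWindow(dpy, win);
        XCloseDisplay(dpy);
        return 0;
    }
    Running it once under 177.82 and once under 180.08 would at least show whether the version and extension strings an old game sees changed between the two driver sets.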

    At any rate, this may only be present with the new beta drivers; it would indeed be interesting to see whether this "degradation" in performance is local to UT2K4 or affects other games/apps too, and whether the previous (17x.x) drivers also exhibit the issue... I'm afraid that would have to be tested with a series 8 card, as my 9800GT was only recently officially supported, starting with the 177.82 driver (AFAIK). I will, however, test at least these two driver sets against UT2K4, using the in-game benchmark system, and against other games with PTS (Doom3, Quake4, Nexuiz, Unigine, Lighting... suggestions?)

  7. #17

    Well, I may have found the root of the poor performance in UT2004: lack of AGP support... That's right. In the log I get a warning:

    Code:
    Log: WARNING: Couldn't allocate AGP memory - turning off support for GL_NV_vertex_array_range
    Log: WARNING: This has a serious impact on performance.
    So it would seem that the UT binary expects to find an AGP kernel module or some such; otherwise it does not use GL_NV_vertex_array_range and instead uses a really slow fallback path.
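
    For context (and hedging a bit, since I haven't read the renderer source): the mechanism behind that warning is presumably the GLX_NV_vertex_array_range allocator. A renderer asks the driver for AGP/video memory with glXAllocateMemoryNV() and has to disable vertex_array_range when that call returns NULL. A rough probe for the same condition, meant to be dropped into something like the glcheck.c sketch above after the context is made current; the 4 MB size and the 0.5f priority are illustrative values only, not whatever UT2004 actually requests:

    Code:
    /* Probe for "fast" (AGP/video) memory the way a VAR-using renderer would.
     * Assumes a GLX context is already current. */
    #include <stdio.h>
    #include <GL/glx.h>

    typedef void *(*AllocMemNV)(GLsizei size, GLfloat readFreq,
                                GLfloat writeFreq, GLfloat priority);
    typedef void  (*FreeMemNV)(void *ptr);

    static void probe_var_memory(void)
    {
        AllocMemNV allocNV = (AllocMemNV)
            glXGetProcAddressARB((const GLubyte *)"glXAllocateMemoryNV");
        FreeMemNV freeNV = (FreeMemNV)
            glXGetProcAddressARB((const GLubyte *)"glXFreeMemoryNV");

        if (!allocNV || !freeNV) {
            printf("GLX_NV_vertex_array_range entry points not exported\n");
            return;
        }

        /* Low read/write frequency with a mid priority is the commonly cited
         * recipe for requesting AGP rather than video memory. */
        void *mem = allocNV(4 * 1024 * 1024, 0.0f, 0.0f, 0.5f);
        if (!mem) {
            printf("glXAllocateMemoryNV returned NULL (the same condition "
                   "the UT2004 log is warning about)\n");
            return;
        }
        printf("got fast memory at %p\n", mem);
        freeNV(mem);
    }
    If that probe fails under 180.08 but succeeds under 177.82 (or on a series 8 card), it would point at the driver's memory allocation rather than at anything UT2004 itself is doing.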

    I already have some numbers from the benchmarks I've run with the set of games I mostly play, comparing the 177.82 and 180.08 drivers; when I have all the numbers I'll post them (there is, it would seem, a marginal difference in performance between Q4 and D3).

  8. #18

    Well, that would definitely account for the performance hit. Looking forward to the numbers - can't offer any more ideas though (any attempt at intelligent conversation when I'm really tired is generally a bad idea too).
