
Thread: Nvidia blames Nvidia for driver cheating in H.A.W.X. 2

  1. #31
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146


    Quote Originally Posted by Plombo View Post
    The game *says* it's using 4xAA when this happens, so that would be a completely braindead design decision. Why would they *intentionally* design incorrect behavior into their software?
    Technically, this *is* 4xAA. The name is correct, there are only 4 color samples.

    From Nvidia's own website:
    CSAA mode: 16x
    # of color/z/stencil samples: 4

    [...]

    CSAA can be enabled in one of the two following scenarios:
    - When the application specifically enables CSAA.
    - When CSAA is turned on in the control panel by the end user. [...]

    It is recommended that applications enable CSAA directly. The process of implementing CSAA is incredibly simple (almost exactly the same as implementing MSAA), and the benefit to your users is enormous; they get much higher image quality at almost zero performance loss, without the fuss and hassle of manipulating the control panel.
    Looks like Nvidia is not being very honest here.
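
    For reference, requesting CSAA from the application side really is about as simple as plain MSAA. Here is a minimal D3D11-style sketch based on my reading of Nvidia's CSAA documentation; the Count/Quality mapping (color samples in Count, coverage samples in Quality) is an assumption taken from that doc, not something verified against HAWX itself:

    // csaa_probe.cpp - hypothetical example: ask the driver about 4-sample modes and
    // try to create a render target with 4 color samples + 16 coverage samples,
    // i.e. what Nvidia markets as "16x CSAA". Plain 4x MSAA would use Quality = 0.
    // Build (MSVC): cl csaa_probe.cpp d3d11.lib
    #include <d3d11.h>
    #include <cstdio>

    int main()
    {
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* context = nullptr;
        if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                     nullptr, 0, D3D11_SDK_VERSION,
                                     &device, nullptr, &context)))
            return 1;

        // How many quality levels does the driver expose for 4 color samples?
        UINT levels = 0;
        device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM, 4, &levels);
        printf("quality levels for 4 samples: %u\n", levels);

        D3D11_TEXTURE2D_DESC desc = {};
        desc.Width = 1280;
        desc.Height = 720;
        desc.MipLevels = 1;
        desc.ArraySize = 1;
        desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count = 4;    // 4 color/z/stencil samples, exactly as in the table above
        desc.SampleDesc.Quality = 16; // assumed CSAA convention: the coverage-sample count
        desc.Usage = D3D11_USAGE_DEFAULT;
        desc.BindFlags = D3D11_BIND_RENDER_TARGET;

        ID3D11Texture2D* rt = nullptr;
        HRESULT hr = device->CreateTexture2D(&desc, nullptr, &rt);
        printf("16x CSAA target: %s\n", SUCCEEDED(hr) ? "created" : "rejected");

        if (rt) rt->Release();
        context->Release();
        device->Release();
        return 0;
    }

    If the driver quietly hands the application plain 4x MSAA for that request instead, that is exactly the substitution being complained about in this thread.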

  2. #32
    Join Date
    Jun 2009
    Posts
    28


    Quote Originally Posted by Qaridarium View Post
    I really like your writing, Kano.

    Hopefully Debian 6 comes out soon, because I'd like to test your new Debian 6 based distro.

    About your PhysX statement: last time I played Borderlands, a PhysX game, and I could play it perfectly on AMD hardware in Linux...

    Which means PhysX isn't so magic, you only need a fast CPU - I play it on a Phenom II at 3.6 GHz.

    Which means nobody needs an Nvidia card for PhysX...

    And hey, I don't care about Blu-rays and video acceleration.

    Catalyst 11.1 will be fine for my usage.
    PhysX was designed so that it can fall back on the CPU if you don't have an nVIDIA card. So yes, you don't _need_ an nVIDIA card, but without one there is a lot more strain on the CPU, and the CPU version of PhysX isn't as accurate as when it runs on the GPU.

    Nobody said it's magic, it just works better with an nVIDIA card, which can be extremely helpful when you don't have a Phenom II at 3.6 GHz in your machine.
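
    That fallback is ultimately just a runtime dispatch decision. A purely illustrative sketch (hypothetical names, not the actual PhysX SDK API): probe for a capable GPU, otherwise run the same simulation on the CPU and accept fewer solver substeps per frame:

    // fallback_sketch.cpp - hypothetical illustration of GPU-with-CPU-fallback physics;
    // none of these names come from the real PhysX SDK.
    #include <cstdio>

    struct SolverConfig { const char* backend; int substeps; };

    // Pretend capability probe; a real engine would ask the driver/SDK instead.
    static bool gpu_physics_available() { return false; }  // e.g. no CUDA-capable card

    static SolverConfig pick_solver()
    {
        if (gpu_physics_available())
            return { "gpu", 8 };  // can afford more substeps -> more accurate simulation
        return { "cpu", 2 };      // fewer substeps to stay inside the frame budget
    }

    int main()
    {
        SolverConfig s = pick_solver();
        printf("simulating on %s with %d substeps per frame\n", s.backend, s.substeps);
        return 0;
    }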

  3. #33
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by BlackStar View Post
    Technically, this *is* 4xAA. The name is correct, there are only 4 color samples.

    From Nvidia's own website:
    CSAA mode: 16x
    # of color/z/stencil samples: 4

    Looks like Nvidia is not being very honest here.
    LOL... don't beat Nvidia so hard.

  4. #34
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146


    Quote Originally Posted by admax88 View Post
    PhysX was designed so that it can fall back on the CPU if you don't have an nVIDIA card. So yes, you don't _need_ an nVIDIA card, but without one there is a lot more strain on the CPU, and the CPU version of PhysX isn't as accurate as when it runs on the GPU.

    Nobody said it's magic, it just works better with an nVIDIA card, which can be extremely helpful when you don't have a Phenom II at 3.6 GHz in your machine.
    Yeah, especially since Nvidia has intentionally crippled the CPU implementation of PhysX:

    - You may have heard of that little thing called SSE, which can contribute to a ~4x speed-up when used correctly? PhysX doesn't use it. It relies on the x87 FPU that dates back to 1980 (a tiny sketch right after this list shows the difference).

    - Or that little thing called "multi-threading", which can scale performance with your number of CPU cores? Nope, PhysX is limited to a single thread.

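    To make the SSE point concrete, here is a sketch of my own (illustrative only, not PhysX source): the same position update written as a plain scalar loop, which is roughly what an x87 build boils down to, and with SSE intrinsics processing four floats per instruction:

    // simd_sketch.cpp - illustrative only: scalar vs SSE version of a trivial
    // physics-style update (pos += vel * dt).
    #include <xmmintrin.h>  // SSE intrinsics, available on every x86 CPU since ~1999
    #include <cstdio>

    void integrate_scalar(float* pos, const float* vel, float dt, int n)
    {
        for (int i = 0; i < n; ++i)
            pos[i] += vel[i] * dt;            // one float per iteration
    }

    void integrate_sse(float* pos, const float* vel, float dt, int n)
    {
        __m128 vdt = _mm_set1_ps(dt);         // broadcast dt into all four lanes
        for (int i = 0; i + 4 <= n; i += 4) { // four floats per iteration
            __m128 p = _mm_loadu_ps(pos + i);
            __m128 v = _mm_loadu_ps(vel + i);
            _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
        }
        for (int i = n & ~3; i < n; ++i)      // scalar tail for the leftovers
            pos[i] += vel[i] * dt;
    }

    int main()
    {
        float pos[8] = {0}, vel[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        integrate_scalar(pos, vel, 0.016f, 8);
        integrate_sse(pos, vel, 0.016f, 8);
        printf("pos[7] after two steps: %.3f\n", pos[7]);
        return 0;
    }

    Split that loop across a thread pool on top and the two points above compound; that is the headroom the CPU path leaves on the table.
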
    This goes to show why vendor lock-in to a closed-source product may not be that bright an idea. Now, Nvidia has announced that this will all be fixed in some magical "3.0" release - but there's no roadmap or release date to rely on. In short, bollocks.

    Bullet is superior to PhysX in pretty much everything *except* the Nvidia kickbacks for marring yet another game with the PhysX mark. And to think that Ageia used to be at the forefront of technology and innovation before being swallowed up...

  5. #35
    Join Date
    Apr 2010
    Posts
    1,946


    Quote Originally Posted by BlackStar View Post
    Implying this was a bug in the first place.

    Ever considered the game might be using high-quality AA *by design*?
    Somewhere in the (hypothetical) HAWX code:
    if (vendor == NVIDIA) {
        disable(CSAA);
    }


    Eventually it ended with Nvidia patching its own drivers and disabling its own "improvements", rather than the folks at Ubisoft lifting a single finger.

    This is what you get when drivers carry tons of unrelated legacy patches. It's the very Windows way - covering things up with an antivirus instead of patching the originating software. The only way out is open-source code, patchable by anyone (or at least by someone who is alive and cares).

    The game is only about high-quality pixels *by design* and nothing more. The much older LockOn is ridiculously more realistic and still uses very detailed graphics. HAWX is arcade garbage, now with extra bugs. If that's the answer you were seeking :P

  6. #36
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146


    Quote Originally Posted by crazycheese View Post
    [stuff]
    Sorry, what you wrote doesn't make any sense.

    Nvidia recommends that game developers enable CSAA because, I quote: "the benefit to your users is enormous; they get much higher image quality at almost zero performance loss".

    Ubisoft followed Nvidia's recommendation to the letter.

    No matter how you spin it, Nvidia is the dishonest party here (to put it mildly). If CSAA really carried "almost zero performance loss", they wouldn't hack their drivers to disable it. Now that they did hack it, they should have said so directly ("our cards take a serious performance hit with CSAA, so we disabled it for this game") and offered an option to control this behavior in their control panel, like *Ati* does with Catalyst AI.

    To put it another way, if I code an application to use feature X, I don't want the drivers to second-guess me and sneakily give me lower-quality feature Y instead. It's not the first time Nvidia has removed features from Ubisoft games to make their cards appear better than they actually are (recall DX10.1 in Assassin's Creed?).

  7. #37
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by BlackStar View Post
    Sorry, what you wrote doesn't make any sense.

    Nvidia recommends that game developers enable CSAA because, I quote: "the benefit to your users is enormous; they get much higher image quality at almost zero performance loss".

    Ubisoft followed Nvidia's recommendation to the letter.

    No matter how you spin it, Nvidia is the dishonest party here (to put it mildly). If CSAA really carried "almost zero performance loss", they wouldn't hack their drivers to disable it. Now that they did hack it, they should have said so directly ("our cards take a serious performance hit with CSAA, so we disabled it for this game") and offered an option to control this behavior in their control panel, like *Ati* does with Catalyst AI.

    To put it another way, if I code an application to use feature X, I don't want the drivers to second-guess me and sneakily give me lower-quality feature Y instead. It's not the first time Nvidia has removed features from Ubisoft games to make their cards appear better than they actually are (recall DX10.1 in Assassin's Creed?).
    Oh man, I love you.
