It's true that Crysis favours Nvidia, and this thread confirms it:
There's even a patch in the thread that shows how it sucks on Nvidia cards.
I'm no expert, but when a single 8800 GTS runs Crysis better than my 2x 4870, something is fishy. And for the record, Crysis doesn't scale well, or did everyone forget that you couldn't run Crysis on 64-bit machines until the 1.3 patch or so, and it still crashes? And how can even the next generation of cards struggle with this game while it gets called well-scaling? Name at least one review where a normal PC setup runs it at 60 fps, I dare you.
Well... At high settings I can defo get 60 fps at full res. Setting it to 'Enthusiast' (Crysis Warhead on Windows XP) at full res with everything on high, it runs at 30 fps or so, I think. Please note that Warhead's Enthusiast setting is what was supposed to be DirectX 10 on Vista.
Originally Posted by kUrb1a
Radeon 5770 with 1 GB GDDR5 RAM
8 GB sys RAM
It would cost you next to nothing to buy my setup now...
How the hell did you manage to get 60 fps in Crysis with that setup? An ATI 5770 is something like a 4870 in Crossfire with a 4850, or less. In this link they used an Intel X58 board, a Core i7 and 3x 2 GB DDR3 RAM, and that card barely managed 20+ fps. I'm just curious what voodoo you use.
Not to mention full 8x AA and 8x anisotropic filtering.
Originally Posted by kUrb1a
Windows XP user account (so no admin/root), which loads an uninfected base XP install and nothing else but drivers.
Loaded systray shizzle: Steam, OpenOffice.org quick starter, Realtek driver stuff, Logitech webcam app, Bluetooth and Logitech wireless controller game software.
No overclocking. 32-bit, so effectively 3.79 GB or so of RAM. Service Pack 3 and latest updates. No virus scanner. Two-year-old install of XP. Warhead unpatched.
What's funny is that my onboard HD 3300 IGP with 128 MB of onboard RAM (yes, onboard!) on my motherboard is probably doing something Crossfire-ish.
I had a 4870x2 before I got this card, but it died, and yes, I didn't have that same performance as I have now (it was lower!).
So there you have it...
I don't think that would help (and I doubt the drivers would let you enable that combination, although I'm not 100% sure).
Originally Posted by V!NCENT
The 5770 has maybe 10x the shader power of the 3300 (and that's being generous to the 3300 :)), so chances are good that the overhead of splitting up the work would outweigh any added performance you'd get from the extra GPU.
Hmz... Maybe it's the fact that it is Warhead, which was released later than Crysis.
Originally Posted by bridgman
The CryEngine would probably have been tweaked a lot more (it is of course in constant development), and combine that with the driver speed that has gone up since the release of Crysis... Maybe that has something to do with it?
Also, the fact that Crysis (not Warhead) got nerfed on XP might have something to do with it. For example, no multi-core CPU usage and no ultra-high setting (supposedly DirectX 10 :rolleyes:). Maybe it got de-nerfed in Warhead, because I don't have to hack around in the config file to get the uber-high settings on XP. There might as well be multi-core CPU usage now?
With Crysis plus config file tweaks to get the 'DirectX 10 effects', I had some serious frame dropping on my 4870x2, with and without Crossfire, and I can't remember Warhead running OK with that card on high settings.
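For context, the usual XP workaround was forcing the per-category quality cvars up to the "Very High" level from a user config file, since the XP launcher didn't expose it. A rough sketch of such an autoexec.cfg, written from memory, so treat the exact cvar names as an assumption:

```
-- autoexec.cfg, placed in the Crysis install folder (sketch, cvar names from memory)
-- 4 is the "Very High" level the XP settings UI doesn't offer
con_restricted = 0
sys_spec_ObjectDetail = 4
sys_spec_Shading = 4
sys_spec_Shadows = 4
sys_spec_Texture = 4
sys_spec_VolumetricEffects = 4
sys_spec_Water = 4
sys_spec_PostProcessing = 4
sys_spec_Particles = 4
sys_spec_Physics = 4
sys_spec_GameEffects = 4
```

The shader-heavy groups (Shading, Water, PostProcessing) are the ones that cost the most frame rate, which would fit the frame dropping described above.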
I could film it for proof, but I only have a webcam and a Samsung 8MP phone, so that wouldn't prove much, I guess... :(
I was really amazed though. I saw benchmarks showing that the 4870x2 should have been way faster...