Proof? I haven't had any problems... The best thing the 220 has going for it on Windows is the brand name - lots of people will buy it just because it says NVIDIA on the box, without looking at what they're actually getting.
Originally Posted by GT220
I don't know about you guys, but I don't go a day without running the latest Windows 64-bit drivers on my Ubuntu machine. The latest drivers are required for valid results.
Like I said, I think NVIDIA can make a good case for these cards on Linux. For the same price you've got 3 distinct options:
1. ATI with horrible/buggy 3D performance + OSS drivers
2. ATI with good (relatively) performance with binary drivers that are still incomplete/buggy
3. NV with mediocre performance, but features like VDPAU in its solid binary drivers.
I suspect Nvidia would win the majority of the market share based on the 3 options above. The problem is that on Windows the drivers are fairly even, and NV's hardware just can't match up.
Oh, I know about Charlie. Like anyone, his word should be taken with a grain of salt, but that doesn't change the fact that even he is right on the money sometimes. As for Beyond3D, I wouldn't know; they aren't one of my usual haunts. I prefer HardOCP, Guru3D, MadShrimps, Jonny Guru, Xtreme Systems and OCF for my tech info.
Originally Posted by GT220
Loss of the chipset market is by no means irrelevant, especially since it leaves you with only a delayed series of GPUs and the hope that the next-gen Gameboy doesn't flop. Not everything Nintendo touches is gold: remember that the N64 did nowhere near what was expected, and that the Virtual Boy is the bastard stepchild they keep locked under the basement stairs. Let's also not forget the 800 lb gorilla in the room: Intel is looking to take the GPU market by releasing their own GPU for the low to mid range. I wouldn't expect Larrabee to take the high end until at least its 3rd generation, but Intel having a GPU in the market that is anything better than an IGP means they will be able to leverage themselves further until they force everyone else out.
VIA was driven out by failing at just about everything they did since they made S3TC. Though that's not to say that, with a 3rd-party chipset and GPU, a scaled-up Nano couldn't get them back into the low-end desktop market.
Who says I'm not planning on going with ATI/AMD on my next build? I'm looking for a buyer for this box first.
So... does this mean I win by default or something? Oh, and before I forget: http://wiki.x.org/wiki/RadeonFeature. Just waiting on the prerequisites to fall into place and the driver should come together quite quickly.
Originally Posted by GT220
This card has 48 cores??? Shouldn't it be getting higher benchmarks?
Maybe the drivers or benchmarking software are not taking advantage of this? Not a bad price for a new low end card.
I have a GTX 285 now and I'm pretty sure I'm not using both cores in Jaunty Jackalope with the 185.18.36 driver.
Can't wait to see what the Nvidia open source drivers can do!
Nvidia is starting to look worryingly similar to 3dfx when they tried to release the Voodoo 6000 (albeit vastly larger and more efficient). I'm sure some people are rubbing their hands with glee now - Nvidia acquired a lot of bad karma and ill-will when they bought 3dfx...
Come on, why are you even paying attention to this shill? He registered just to advertise the GT220, you won't find any truth to his words that's not twisted to hell and back.
Originally Posted by smitty3268
For example, this "shitty drivers on Windows" argument. He forgot to mention that Nvidia drivers were responsible for three times as many crashes as Ati drivers in 2007 (28.8% vs 9.3%). (If you have any more recent statistics, I'd love to hear them!)
My personal experience is that NV drivers were beyond shitty on Vista for over a year after its release (Ati drivers worked pretty much fine from Vista RC1 on). Even now, multi-monitor support is a crapshoot, with Nvidia's drivers forgetting or setting the wrong resolution time after time (Ati's drivers work correctly in the same setup).
Simply put, no. That's why you shouldn't listen to anything that comes out of a shill's mouth without a huge barrel of salt (1024 threads, oh wow!)
Originally Posted by cliff
This is a low-end card with low-end performance. For comparison, Ati's flagship (the 5870) has 1600 cores, each more capable than any of this card's paltry 48. Of course, even this doesn't tell the whole story: the 5870 uses GDDR5 memory (vs GDDR2 or GDDR3 for this card), and its architecture is vastly different, both more efficient and more capable (shader model 5 vs 4.1).
As I said, you won't find any truth in a shill's words that's not twisted beyond any recognition.
The bottom line is that this card is a marginal improvement over Nvidia's previous low-end hardware. It's mostly an attempt to increase margins for Nvidia and tide them over until the GT3xx release. There's no point in upgrading if you are already using an 8600/9500/9600, and there's little point in putting this card into an HTPC as long as it has a fan.
I'd only recommend this card if (a) you can find it fanless and (b) you want to build a low-powered Linux box. In all other cases, there are better choices than the GT220:
- Nvidia's 9500 for a fanless, low-power Linux box.
- Ati's 4670 for a fanless, low-power Windows box that can also play some games.
- Ati's 5xx0 series for more serious gaming.
Edit: spelling mistakes...
Last edited by BlackStar; 10-20-2009 at 05:14 AM.
In about 2-3 months Nvidia will know exactly what the market share for low-end video cards is on Linux. No informed Windows user would buy this card.
Last edited by tmpdir; 10-20-2009 at 06:31 AM.
The ATI results look like V-Sync is enabled.
With AMD's Windows drivers, H.264 decoding via DXVA is restricted to level 4.1, while Nvidia's drivers can decode up to level 5.0 (as VDPAU can). I guess this is just a software issue, though.
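For anyone wondering why the level cap matters: an H.264 level fixes the maximum frame size and macroblock rate a compliant decoder must handle. Here's a rough sketch of that check, with the level 4.1 and 5.0 limits hard-coded from my reading of Table A-1 of the H.264 spec (double-check the numbers yourself; the sketch also ignores the bitrate and DPB limits a real decoder would check):

```python
# Rough sketch: does a given resolution/frame rate fit within an H.264 level?
# Limits from Table A-1 of the H.264 spec: MaxFS is the max frame size in
# 16x16 macroblocks, MaxMBPS is the max macroblock processing rate per second.
LEVELS = {
    "4.1": {"max_fs": 8192,  "max_mbps": 245760},
    "5.0": {"max_fs": 22080, "max_mbps": 589824},
}

def mbs(width, height):
    """Frame size in 16x16 macroblocks, rounding each dimension up."""
    return ((width + 15) // 16) * ((height + 15) // 16)

def fits_level(width, height, fps, level):
    """True if the frame size and macroblock rate fit the level's limits."""
    lim = LEVELS[level]
    fs = mbs(width, height)
    return fs <= lim["max_fs"] and fs * fps <= lim["max_mbps"]

# 1080p30 fits within level 4.1, but 2560x1600 needs level 5.0:
print(fits_level(1920, 1080, 30, "4.1"))   # True
print(fits_level(2560, 1600, 24, "4.1"))   # False
print(fits_level(2560, 1600, 24, "5.0"))   # True
```

So a level 4.1 cap is fine for Blu-ray 1080p content, but clips encoded at level 5.0 (very high resolutions or frame rates) would fall back to software decoding on AMD's Windows drivers.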
It's clear that AMD has the superior hardware at the moment, but with the crappy driver situation it simply is a no-go.