
Thread: NVIDIA GeForce GT 220

  1. #21
    Join Date
    Oct 2008
    Posts
    2,904

    Default

    Quote Originally Posted by GT220 View Post
    GT 220 is competitive enough. Even on Windows ATI sucks badly in the driver department; Nvidia's DXVA decoding support and compatibility are far superior to ATI's. There are plenty of H.264 encodes that ATI cannot decode on their shitty UVD/UVD2 that Nvidia's first-generation VP2 can handle with the latest Nvidia drivers.
    Proof? I haven't had any problems... The best thing the 220 has going for it on Windows is the brand name - lots of people will buy it just because it says NVIDIA on the box, without looking at what they're actually getting.

  2. #22
    Join Date
    Apr 2009
    Location
    Toronto/North Bay Canada
    Posts
    877

    Default

    I don't know about you guys, but I don't go a day without running the latest Windows 64-bit drivers on my Ubuntu machine. The latest drivers are required for valid results.

  3. #23
    Join Date
    Oct 2008
    Posts
    2,904

    Default

    Like I said, I think NVIDIA can make a good case for these cards on Linux. For the same price you've got 3 distinct options:

    1. ATI with horrible/buggy 3D performance on the OSS drivers
    2. ATI with (relatively) good performance on binary drivers that are still incomplete/buggy
    3. NV with mediocre performance but features like VDPAU in the solid binary drivers.

    I suspect Nvidia would win the majority of the market share based on the three options above. The problem is that on Windows the drivers are fairly even and NV just can't match up with the hardware.

  4. #24
    Join Date
    Aug 2008
    Posts
    39

    Default

    Quote Originally Posted by GT220 View Post
    So why do you still use Nvidia then? Since you want to be an ATI fanboi so much, go and buy their DX11 card now. I'm sure you'll enjoy using their Linux drivers.

    Failing GPUs are only the old 80nm parts like the 8600 (G84) and 8400 (G86), which are long discontinued, and you believe what the biggest ATI asskisser Charlie writes? LOLOL, dumber than dumb indeed you are. Most of Charlie's BS lies were debunked on the Beyond3D forums.

    Loss of the chipset market is irrelevant; Nvidia is replacing that with revenue from the Tegra SoC, which will be far better. Intel is well known to be greedy, wanting to have the chipset-market cake and eat it too. VIA should know that; they got driven out by Intel's greed.
    Oh, I know about Charlie. Like anyone, his word should be taken with a grain of salt, but that doesn't change the fact that even he is right on the money sometimes. As for Beyond3D, I wouldn't know; they aren't one of my usual haunts. I prefer HardOCP, Guru3D, MadShrimps, Jonny Guru, Xtreme Systems and OCF for my tech info.

    Loss of the chipset market is by no means irrelevant, especially since it leaves you with only a delayed GPU series and the hope that the next-gen Gameboy doesn't flop. Not everything Nintendo touches is gold: remember that the N64 did nowhere near what was expected, and that the Virtual Boy is the bastard stepchild they keep locked under the basement stairs. Let's also not forget the 800 lb gorilla in the room: Intel is looking to take the GPU market by releasing their own GPU for the low to mid range. I wouldn't expect Larrabee to take the high end until at least its 3rd generation, but Intel having a GPU on the market that is anything better than an IGP means they will be able to further leverage themselves until they force everyone else out.

    VIA was driven out by failing at just about everything they did since they made S3TC. Though that's not to say that, with a third-party chipset and GPU, a scaled-up Nano couldn't get them back into the low-end desktop market.

    Who says I'm not planning on going with ATI/AMD on my next build? I'm looking for a buyer for this box first.

    Quote Originally Posted by GT220 View Post
    Since you want to be an ATI fanboi so much, go and buy their DX11 card now. I'm sure you'll enjoy using their Linux drivers.

    LOLOL, dumber than dumb indeed you are.
    So... does this mean I win by default or something? Oh, and before I forget: http://wiki.x.org/wiki/RadeonFeature. Just waiting on the prerequisites to fall into place, and the driver should come together quite quickly.

  5. #25
    Join Date
    May 2009
    Posts
    69

    Default

    This card has 48 cores??? Shouldn't it be getting higher benchmarks?
    Maybe the drivers or benchmarking software are not taking advantage of this? Not a bad price for a new low end card.
    I have a GTX 285 now and I'm pretty sure I'm not using both cores in Jaunty Jackalope with the 185.18.36 driver.
    Can't wait to see what the Nvidia open source drivers can do!

  6. #26
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,099

    Default

    Nvidia is starting to look worryingly similar to 3dfx when they tried to release the Voodoo 6000 (albeit vastly larger and more efficient). I'm sure some people are rubbing their hands with glee now - Nvidia acquired a lot of bad karma and ill-will when they bought 3dfx...

    Quote Originally Posted by smitty3268 View Post
    Quote Originally Posted by GT220
    GT 220 is competitive enough. Even on Windows ATI sucks badly in the driver department; Nvidia's DXVA decoding support and compatibility are far superior to ATI's. There are plenty of H.264 encodes that ATI cannot decode on their shitty UVD/UVD2 that Nvidia's first-generation VP2 can handle with the latest Nvidia drivers.
    Proof? I haven't had any problems... The best thing the 220 has going for it on windows is the brand name - lots of people will buy it just because it says NVIDIA on the box, without looking at what they're actually getting.
    Come on, why are you even paying attention to this shill? He registered just to advertise the GT220; you won't find any truth in his words that isn't twisted to hell and back.

    For example, this "shitty drivers on Windows" argument: he forgot to mention that Nvidia drivers were responsible for three times as many crashes as Ati drivers in 2007 (28.8% vs 9.3%). (If you have any more recent statistics, I'd love to hear them!)

    My personal experience is that NV drivers were beyond shitty on Vista for over a year after its release (Ati drivers worked pretty much fine since Vista RC1). Even now, multi-monitor support is a crapshoot with Nvidia's drivers forgetting or setting the wrong resolution time after time (Ati's drivers work correctly in the same setup).

    Quote Originally Posted by cliff
    This card has 48 cores??? Shouldn't it be getting higher benchmarks?
    Simply put, no. That's why you shouldn't listen to anything that comes out of a shill's mouth without a huge barrel of salt (1024 threads, oh wow!)

    This is a low end card with low end performance. For comparison, Ati's flagship (the 5870) has 1600 cores, each one more capable than this card's paltry 48. Of course, this doesn't tell the whole story either: the 5870 uses GDDR5 memory (vs GDDR2 or GDDR3 for this card) and its architecture is vastly different, both more efficient and more capable (shader model 5 vs 4.1).
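    To put the core-count gap in rough numbers, you can compare theoretical shader throughput. A back-of-the-envelope sketch, assuming the commonly quoted reference clocks (~1360 MHz shader clock for the GT 220, 850 MHz for the 5870) and counting a MAD as 2 FLOPs per core per cycle; real-world performance differs, since the architectures are not directly comparable:

    ```python
    # Theoretical single-precision throughput in GFLOPS.
    # cores * clock (MHz) * FLOPs per core per cycle / 1000
    def gflops(cores, shader_mhz, flops_per_cycle=2):
        return cores * shader_mhz * flops_per_cycle / 1000.0

    gt220 = gflops(48, 1360)    # GT 220: 48 shader cores @ ~1360 MHz (assumed clock)
    hd5870 = gflops(1600, 850)  # HD 5870: 1600 stream processors @ 850 MHz

    print(round(gt220), round(hd5870))          # roughly 131 vs 2720 GFLOPS
    print(round(hd5870 / gt220, 1))             # ~20.8x on paper
    ```

    Even this crude estimate shows why 48 cores can't be expected to post big benchmark numbers, whatever the marketing says.
    
    
    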

    As I said, you won't find any truth in a shill's words that's not twisted beyond any recognition.

    The bottom line is that this card is a marginal improvement over Nvidia's previous low-end hardware. It's mostly an attempt to increase margins for Nvidia and tide them over to the GT3xx release. There's no point in upgrading if you are already using a 8600/9500/9600 and there's little point in putting this card into an HTPC as long as it has a fan.

    I'd only recommend this card if (a) you can find it fanless and (b) you want to build a low-powered Linux box. In all other cases, there are better choices than the GT220:
    - Nvidia's 9500 for a fanless, low-power Linux box.
    - Ati's 4670 for a fanless, low-power Windows box that can also play some games.
    - Ati's 5xx0 series for more serious gaming.

    Edit: spelling mistakes...
    Last edited by BlackStar; 10-20-2009 at 04:14 AM.

  7. #27
    Join Date
    Aug 2008
    Location
    Netherlands
    Posts
    225

    Default

    In about two or three months Nvidia will know exactly what the market share for low-end video cards is on Linux. No informed Windows user would buy this card.
    Last edited by tmpdir; 10-20-2009 at 05:31 AM.

  8. #28
    Join Date
    Jul 2009
    Posts
    88

    Default

    The ATI results look like V-Sync is enabled.

  9. #29
    Join Date
    Jul 2008
    Location
    Germany
    Posts
    558

    Default

    Quote Originally Posted by GT220 View Post
    GT 220 is competitive enough. Even on Windows ATI sucks badly in the driver department; Nvidia's DXVA decoding support and compatibility are far superior to ATI's. There are plenty of H.264 encodes that ATI cannot decode on their shitty UVD/UVD2 that Nvidia's first-generation VP2 can handle with the latest Nvidia drivers.
    You are a cute fanboy. On my system I have no problems with DXVA on ATI with H.264 and VC-1. Or is it possible I'm doing something wrong, because it works?

  10. #30
    Join Date
    Sep 2007
    Posts
    122

    Default

    With AMD's Windows drivers, H.264 decoding via DXVA is restricted to level 4.1, while Nvidia's drivers can decode up to level 5.0 (as VDPAU can). I guess this is just a software issue, though.
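    If you want to check whether a particular encode exceeds that 4.1 limit, you can read the stream's level with ffprobe (e.g. `ffprobe -show_entries stream=level`) and compare. A minimal sketch in Python, assuming the usual integer-encoded level_idc convention (41 means level 4.1, 50 means 5.0); the cap value of 41 here just mirrors the restriction described above:

    ```python
    # H.264 levels are carried as an integer level_idc (roughly level * 10),
    # which is also what ffprobe prints in its "level" field: 41 for 4.1, 50 for 5.0.
    def h264_level(level_idc):
        """Format an integer level_idc as the familiar 'major.minor' string."""
        return f"{level_idc // 10}.{level_idc % 10}"

    def exceeds_dxva_cap(level_idc, cap=41):
        """True if the stream's level is above the (assumed) level 4.1 DXVA cap."""
        return level_idc > cap

    print(h264_level(41), exceeds_dxva_cap(41))  # prints: 4.1 False
    print(h264_level(50), exceeds_dxva_cap(50))  # prints: 5.0 True
    ```

    So a level 5.0 encode would trip the check and fail DXVA decoding on those AMD drivers, while anything at 4.1 or below should be fine.
    
    
    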

    It's clear that AMD has the superior hardware at the moment, but with the crappy driver situation it simply is a no-go.
