
Thread: NVIDIA GeForce GT 220

  1. #1
    Join Date
    Jan 2007
    Posts
    14,309

    Default NVIDIA GeForce GT 220

    Phoronix: NVIDIA GeForce GT 220

    Days prior to AMD's release of the ATI Radeon HD 5750 and Radeon HD 5770 graphics cards, NVIDIA released their GeForce G 210 and GeForce GT 220 graphics cards. Both of these NVIDIA graphics cards are for low-end desktop systems, but part of what makes them interesting is that they are the first NVIDIA GPUs built upon a TSMC 40nm process. To Linux users these graphics cards are also interesting in that they fully support all of the current features of VDPAU for Linux video decoding, including MPEG-4 support. We picked up an XFX GT220XZNF2 GeForce GT 220 1GB graphics card for this round of benchmarking on Ubuntu Linux.

    http://www.phoronix.com/vr.php?view=14273
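    Since full VDPAU support, including MPEG-4, is the headline Linux feature here, one quick way to see which decoder profiles a card's driver actually exposes is to scan the output of the vdpauinfo utility. A minimal sketch in Python, assuming vdpauinfo is installed and that its capability table lists profile names containing "MPEG4" (such as MPEG4_PART2_ASP); the exact output format varies between versions:

    # Sketch: list the MPEG-4 decoder profiles reported by the VDPAU driver.
    # Assumes the vdpauinfo utility is installed; its output format varies
    # between versions, so this is only a rough substring scan.
    import subprocess

    def vdpau_mpeg4_profiles():
        out = subprocess.run(
            ["vdpauinfo"], capture_output=True, text=True, check=True
        ).stdout
        return [line.strip() for line in out.splitlines() if "MPEG4" in line]

    if __name__ == "__main__":
        profiles = vdpau_mpeg4_profiles()
        if profiles:
            print("MPEG-4 decoder profiles reported by VDPAU:")
            for p in profiles:
                print(" ", p)
        else:
            print("No MPEG-4 decoder profiles reported.")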

  2. #2
    Join Date
    Sep 2007
    Posts
    158

    Default

    Crap performance + no open driver: business as usual for NVidia.

  3. #3
    Join Date
    Dec 2008
    Location
    Poland
    Posts
    117

    Default

    What do the graphs on the fifth page show? To me, they show nothing. The scale on the Y axis is wrong.

  4. #4
    Join Date
    Apr 2009
    Location
    Toronto/North Bay Canada
    Posts
    877

    Default

    Quote Originally Posted by remm View Post
    Crap performance + no open driver: business as usual for NVidia.
    Crap performance? This is just a bottom-end card. Look at the other Nvidia solutions that are compared to it.

    8400
    8500
    9500
    8600

    These are only how old? I'll give you a hand: the GeForce 8 launched on November 8, 2006.

    Business as usual for Nvidia? More like business that's 3 years old.

    To compare, the R700 launched on June 25, 2008. You are looking at roughly a 2-year difference between the Nvidia offerings and the ATI ones. That doesn't seem like much when you are purchasing an oven, but in this game it certainly does. This is just a crap offering by Nvidia that seems to be targeted towards HTPCs. I think this deals more of a blow to S3 than anything else.

  5. #5
    Join Date
    Jan 2008
    Location
    Radoboj, Croatia
    Posts
    155

    Default

    Finally, an ATI vs. NVIDIA benchmark. I've been waiting for such a benchmark on Phoronix for as long as I've known about Phoronix.

  6. #6

    Thumbs down

    Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing ... haven't you seen that before posting?
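    For what it is worth, the effect vermaden describes is easy to reproduce. A minimal sketch with matplotlib, using made-up CPU-load numbers rather than the actual Phoronix results, plotting the same bars on a 0-100 axis and on a 0-20 axis:

    # Sketch: identical CPU-load bars on a 0-100 axis vs. a 0-20 axis.
    # The numbers are invented for illustration, not Phoronix results.
    import matplotlib.pyplot as plt

    cards = ["GT 220", "9500 GT", "8600 GTS"]
    cpu_load = [5, 8, 10]  # hypothetical % CPU load during playback

    fig, axes = plt.subplots(1, 2, figsize=(8, 3))
    for ax, ymax in zip(axes, (100, 20)):
        ax.bar(cards, cpu_load)
        ax.set_ylim(0, ymax)  # the axis limit is the whole argument here
        ax.set_ylabel("CPU load (%)")
        ax.set_title(f"0-{ymax} scale")
    fig.tight_layout()
    plt.show()

    On the 0-100 axis the three bars look nearly identical; on the 0-20 axis the spread between them is obvious.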

  7. #7
    Join Date
    Jan 2009
    Location
    UK
    Posts
    331

    Default

    Low end, performance of cards 2 generations behind, a 40nm process, and it STILL needs a fan? What market segment is that possibly going to fill, the deaf HTPC user?

  8. #8
    Join Date
    Jul 2008
    Location
    Berlin, Germany
    Posts
    821

    Default

    Quote Originally Posted by vermaden View Post
    Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing ... haven't you seen that before posting?
    In a way, this is correct. The difference between 5% and 10% CPU load is as insignificant as the chart shows.
    Quote Originally Posted by http://www.phoronix.com/scan.php?page=article&item=nvidia_gt_220&num=9
    Additionally, one of the advantages of this budget graphics card is its VDPAU capabilities, which includes MPEG-4 ASP support along with the features of earlier PureVideo generations.
    The benefit of this I don't quite get, and it has also left reviewers at other hardware sites scratching their heads. MPEG-4 ASP is not a very computationally intensive codec; today's low-end CPUs can decode HD MPEG-4 ASP video without problems.
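    That claim is easy to sanity-check: time a software-only decode of an MPEG-4 ASP clip and compare it against the clip's duration. A rough sketch, assuming ffmpeg is installed; "sample_asp.avi" is a hypothetical stand-in for an MPEG-4 ASP test clip:

    # Sketch: time a software-only decode of an MPEG-4 ASP clip with ffmpeg.
    # "sample_asp.avi" is a hypothetical test file; substitute your own clip.
    import subprocess, time

    clip = "sample_asp.avi"
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-i", clip, "-f", "null", "-"],
        check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    print(f"Decoded {clip} in {time.time() - start:.1f}s of wall time")

    If the wall time comes out well under the clip's running time, the CPU alone is decoding faster than real time, which is the point being made above.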

  9. #9
    Join Date
    Aug 2008
    Posts
    39

    Default

    Quote Originally Posted by Ant P. View Post
    Low end, performance of cards 2 generations behind, a 40nm process, and it STILL needs a fan? What market segment is that possibly going to fill, the deaf HTPC user?
    You are aware that most people buying this card will have poorly ventilated cases, right? Not to mention that card makers choose what cooler to put on these low-end models; I'm sure you'll be able to find a fanless one on any site that has a decent selection. And there are always much better aftermarket coolers out there; I've seen some that can cool even an 8800GTX without a fan.

    Most of the market for these is the low-end desktop crowd: those who want light gaming capability for WoWcrack in their underpowered bargain-basement OEM box (a.k.a. Dell, HP/Compaq, Gateway, Acer/eMachines), anything that will accept a GPU that won't require them to buy a new PSU to replace the anaemic one their junker came with.

    Quote Originally Posted by L33F3R View Post
    Crap performance? This is just a bottom-end card. Look at the other Nvidia solutions that are compared to it.

    8400
    8500
    9500
    8600

    These are only how old? I'll give you a hand: the GeForce 8 launched on November 8, 2006.

    Business as usual for Nvidia? More like business that's 3 years old.

    To compare, the R700 launched on June 25, 2008. You are looking at roughly a 2-year difference between the Nvidia offerings and the ATI ones. That doesn't seem like much when you are purchasing an oven, but in this game it certainly does. This is just a crap offering by Nvidia that seems to be targeted towards HTPCs. I think this deals more of a blow to S3 than anything else.
    Agreed. Do we really need more Nvidia rebadged cards? I don't know if any of you have been following the Nvidia vs. ATI internet fight, kicked off by Nvidia complaining that ATI beat them to the punch with D3D11 hardware while their own GT300/Fermi hardware is currently so nonexistent that they had to show a fake card at a recent expo and pass off a prerendered video as their new chip's work.

    Sources for both companies' idiocy: http://www.xbitlabs.com/news/video/d...ics_Cards.html

    http://www.tweaktown.com/news/13199/...ions/index.htm

    http://www.semiaccurate.com/2009/10/...mi-boards-gtc/

    And the Twitter fight... http://www.hardocp.com/news/2009/10/...ng_on_twitter/



    I guess it means the rumours about GT300 chip yields being only 1.7% are true. That's a shame; since they've already been kicked out of the chipset business, they may go belly up if they can't pull their act together. If that happens, hopefully it won't be Intel that snags them up. Maybe a merger between VIA and Nvidia would work; it would at least give us a decent 3-way fight in the x86 market.

  10. #10
    Join Date
    Jun 2006
    Location
    Portugal
    Posts
    521

    Default

    Quote Originally Posted by vermaden View Post
    Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing ... haven't you seen that before posting?
    I agree. These kinds of tests would be especially interesting to run with a crappy CPU, to demonstrate the advantage of this approach, not with a Core i7 monster that most people don't have.
