Most people don't go through 5 video cards per year.
The median from the stats I've seen is about 2-3 years, yes. Five times a year is ridiculous. Most people upgrade their entire machine every 2-3 years though, not just the graphics card, and most of those are upgraded too soon; people mistake screwed-up Windows installs and trojan-infested machines for hardware that's slow and out of date, when it just needs a cleaning. Likewise, a lot of people's computers break down or start acting up because they don't clean out the dust buildup in the internal components and things start to overheat. If people learned that a computer needs regular maintenance just like a car does, they'd probably be able to squeeze a good 5 years out of a machine, even if they do median levels of gaming.
Replacing a 2900XT with a 5670 makes sense for a gamer. That's about a 3 generation gap, and the 2900XT is starting to hit the end of its ability to play new games on high settings. My HD4770 should last me another 2 years, although I may upgrade sooner _as a developer_ (and not as a consumer) to be able to goof around with DX11/GL4 in my spare time.
In my experience, even as a hardcore gamer, replacing a video card once a year is sufficient. And never, ever spend more than $150 on a GPU. Those $400 beasts cost 3x-4x as much but only last 2x as long; the mainstream $100 card will play all the new games nearly as well as the beast. You're better off spending $100 every year than $400 every two.
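Just to put rough numbers on that argument, here's a back-of-the-envelope sketch. The prices and lifespans are the figures from the post above (the poster's assumptions, not real market data):

```python
# Amortized cost comparison: a $100 mainstream card replaced every
# year vs. a $400 high-end card replaced every two years.
# Figures are the post's assumptions, not measured prices.

def cost_per_year(price_usd, lifespan_years):
    """Amortized cost of a GPU over its useful life."""
    return price_usd / lifespan_years

mainstream = cost_per_year(100, 1)  # $100 card, replaced yearly
high_end = cost_per_year(400, 2)    # $400 card, replaced every two years

print(f"mainstream: ${mainstream:.0f}/year")  # prints $100/year
print(f"high end:   ${high_end:.0f}/year")    # prints $200/year
```

So under these assumptions the high-end strategy costs twice as much per year, which is the whole point of the "never spend more than $150" rule.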
Remember, those $400 GPUs and $1000 CPUs are NOT MEANT FOR REAL PEOPLE. Uber nerds who live in their parents' basement and don't date anybody buy those things. Benchmark sites and magazines buy those things. Suckers who get talked into it by uber nerds who don't really know any better buy those things. Nobody else does. And Intel/AMD/NVIDIA know it. Those parts are there for subliminal advertising. Vendors put out the $700 5970 because then Joe Schmoe sees the benchmarks, thinks "wow ATI+Diamond makes the fastest video card!" and then when he goes to the store to buy a $100 card he can actually afford he reaches for an ATI instead of an NVIDIA, despite the fact that the $100 NVIDIA offerings might be faster than the $100 ATI parts (or not). Same goes for the EE/FX series CPUs Intel and AMD put out. People buy $150 Intel CPUs because the $1000 Intel CPU beats the living hell out of the fastest AMD CPUs by a huge margin, even though at those low price points the Phenom IIs are a better deal. Those big cards and their benchmarks are for suckers, not intelligent consumers.
Your last paragraph, although it has a point, is really just a 'sour grapes' attitude.
Without high-end R&D, there would be no mainstream derivatives. It's the development of the high end that burns most of the development cash, so that we average Joes can afford some cut-down version of the new technology.
Also, you are way off topic. What were we discussing? Oh, right: R800 OSS support. My point still stands: by the time R800 OSS support comes out, R800 will be outdated. It doesn't matter whether you buy a 5970 or a 5430; they are both in the same boat.
I am just trying to look at the OSS support timing problem from an objective point of view, not an 'I bought it, so it must be good' one. Outdated is outdated, no matter how hard you try to cover it up. You can still use it, yes, but it is outdated. lol
Where do you see the cutoff, i.e. "if Evergreen OSS support comes out after <month>/<year>, the Evergreen parts would be outdated"?
I have an HD 5670 and it is too slow for DX11/OpenGL 4 with tessellation. Only without tessellation, and only at 1280x1024, does it manage more than 25 fps in Unigine Heaven. That's anything but a card for gamers.
I didn't buy an Evergreen card to have the latest and greatest, and frankly I couldn't care less about OpenGL 4. For me they were simply the first cards where you didn't have to buy the lowest end of the spectrum to get decent power consumption. My card probably spends 95% of its time just displaying a normal desktop, and with Evergreen it looks like they finally got a clue about power consumption.
So your point fails completely... the biggest card supported by the open-source driver is the 5870!
And hey... the 5870 is the fanboy-edition card ;-)
The HD 5670 is better because it gets more fps per WATT!! than the HD 4000, HD 3000, or NVIDIA GTX 400 series!!!
And hey, Catalyst 10.8 will speed up tessellation by 10-20% ;-)
And the HD 5670 is the better card because of its very good OpenCL support ;-)
That's because the HD 4000 series emulates a GPU cache in VRAM, which is very slow...
...and the HD 3000 series doesn't support OpenCL at all...
But hey, Kano, we all know your LOVE for AMD ;-)
PLEASE buy an NVIDIA GTX485X2 super-duper and be happy.
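On the fps-per-WATT point above: the metric itself is just average frame rate divided by board power, so it's trivial to compute. A minimal sketch, using purely hypothetical numbers (these are NOT real benchmark figures for any of the cards mentioned):

```python
# fps-per-watt = average frame rate / board power draw.
# The card names and numbers below are placeholders for
# illustration only, not real benchmark results.

def fps_per_watt(avg_fps, board_power_watts):
    """Performance-per-watt efficiency metric."""
    return avg_fps / board_power_watts

# Hypothetical example: a low-power card can beat a faster,
# hungrier card on efficiency even while losing on raw fps.
low_power = fps_per_watt(30, 60)    # 30 fps at 60 W
high_power = fps_per_watt(60, 180)  # 60 fps at 180 W

print(f"low-power card:  {low_power:.2f} fps/W")
print(f"high-power card: {high_power:.2f} fps/W")
```

With these made-up numbers the low-power card delivers 0.50 fps/W versus about 0.33 fps/W for the faster card, which is the kind of comparison the post is gesturing at.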