This is not necessarily true for workstations. Our lab recently purchased a workstation with a decent NVIDIA graphics card for GPU computing, and most of the time we run Linux on it. One reason is that far more scientific computing libraries run on Linux, so it is much easier to run our programs there than on Windows. Furthermore, Windows 7 seems to have problems on our machine: whenever it wakes from sleep mode, it blue-screens. As a result, we have to disable power management entirely if we want to use Windows, which is ridiculous given that workstations draw more power than ordinary PCs.
Originally Posted by Vorzard
I have to agree
Bought an SB i3 laptop for $400, installed Debian, and the graphics simply worked. No futzing around with proprietary drivers. Video is perfect.
Hopefully this will be a kick in the butt for AMD and nVidia. FOSS support for HD8000 from the start?
Oh, and I'm running KDE4. It's come a long way and just works too - I switched after hitting some showstopper bugs (like hung terminals) in XFCE4.
I think Arch is labeled as "Linux" on the graph; however, it's probably a tainted stat if it's all unnamed Linuxes. Gaming on Linux doesn't usually require a massive amount of GPU unless you're using Wine. I have a moderate system, but all the "native" type games (HiB and Desura) don't even begin to push it. One exception might be Xonotic, but I can still play that at max settings, 1920x1080, at ~200 fps. For what we have now, any Intel chip with HD graphics onboard should suffice, and it's cheaper than adding a dedicated video card, so the trend doesn't surprise me much - with the possible exception of AMD's APUs not being more popular. Driver issues aside, of course.