No. AMD has just dropped all support for those chips, on both Windows and Linux. This is not related to r300g - you don't even have r300g on Windows. The r300g is maintained by the Mesa people, not by the Catalyst crew.
That's a pretty liberal use of the word "maintain." r300g is done. This card IS at 100%. The only thing left on the table is an extra 20-60% in general performance.
R300 support isn't just complete. Look at where the design came from:
"ArtX paved the way for the development of ATI's R300 graphics processor (Radeon 9700) released in 2002 which formed the basis of ATI's consumer and professional products for three years afterward."
Dolphin, a completely separate emulation project, can run damn near 90% of all GameCube games. The Flipper chipset is nearly the same as the R300.
Of all the graphics chipsets with open source drivers, r300 is the closest to matching the quality of the proprietary drivers. All of this, with no cease-and-desist letters? Support from people that *just happen* to know what they're doing? AMD hasn't dropped support for sht. They just don't want your phone calls.
Extremely lucky? Losing video performance and proper sleep/suspend in exchange for USB audio support isn't what I would call lucky.
Only someone with old hardware? You might as well say "anyone who doesn't own a new computer." Stop right there.
I've seen the changelogs at Kernel Newbies, and I've seen the articles on this website. Only "someone who isn't discerning enough to know better" upgrades just because of a version number. That's just blind faith.
What exactly is this huge advantage? Some incomplete filesystem? Broken ACPI? A lower average FPS? Is that why people should upgrade? Even Linus himself has doubts about the newer kernels. Notice how everything is on a support/fix/break cycle?
I have a theory, and feel free to prove me wrong: desktop users don't need any kernel newer than 2.6.33.
Give me something practical. Really practical. Ext4 scaling to 48 CPUs is nice, for example. Personally I'd rather have better/faster graphics. But USB audio support? Pffft. That's weak. Weaker than weak. Whatever.
How do you know it's the kernel causing the FPS issues? To me it seems like newer versions of X have gotten in the way more than the kernel has. They changed things for whatever reason and broke driver features, and it seems like those will get resolved right after Hurd is completed.
I don't know if the Intel devs are still reading this thread, but the main reason I have been paying the premium for Intel products so far is power efficiency. I can justify the increased cost of Intel over AMD knowing that, in the long run, the power savings from lower idle draw and more efficient processing under load will more than pay for the premium Intel chip. Viewed this way, I pay less overall to own the better-performing Intel chip.
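Whether the premium actually pays off depends on the numbers, which are easy to sketch. Everything below (idle-power delta, duty cycle, electricity price, ownership period) is an assumed figure for illustration, not a measurement of any particular chip:

```python
# Back-of-envelope electricity savings; all inputs are assumptions.
idle_delta_w = 30     # assumed idle power difference between chips, watts
hours_per_day = 12    # assumed daily on-time
price_per_kwh = 0.12  # assumed electricity price, USD per kWh
years = 4             # assumed ownership period

kwh = idle_delta_w * hours_per_day * 365 * years / 1000
savings = kwh * price_per_kwh
print(f"{kwh:.0f} kWh saved, about ${savings:.2f} over {years} years")
```

Swap in your own wattage delta and tariff; the break-even point against the CPU price difference moves linearly with each input.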
I am anxiously awaiting the release of Ivy Bridge and its newer, more efficient 3D transistors. I'm currently using a Nehalem i5-750.
Most of my system load comes from two applications: Firefox and VLC. The other things I use (Thunderbird, Geany, MonoDevelop) don't put nearly the strain on the system that graphics-heavy site browsing and HD video playback do. My current desktop isn't slow, but I can notice the heat and the fan speed when it is under load.
My upcoming Ivy Bridge system may be the first I build without a discrete graphics card. Integrated GPU performance has increased over the years to the point where normal desktop usage is no longer hindered by onboard graphics. Gaming rigs will still benefit from discrete cards, but for non-gaming use the cost and power consumption are no longer worthwhile.
Top-performing drivers are critical to this goal. Using hardware acceleration to the fullest, for everything from the desktop environment to web browsing to video decoding, keeps the load off the CPU so it can remain in a lower power state.
I hope the days of Linux video drivers being second-class citizens of the driver world are finally over. It's also a bit neat to think that a team of driver developers can do so much to reduce power consumption. Hopefully these software improvements will assist in reducing the need for the construction of new power plants.
If the DE, Firefox, and VLC can all make full use of the Ivy Bridge GPU capabilities then I will be a satisfied customer.
Tight integration from the app down to the hardware is essential.
I can't wait for my apps to work well together with the drivers, not in spite of them.
I'm not sure if this second bit of feedback is too off topic, but I don't know where else I might find Intel developers in a feedback-soliciting mood. Has Intel released details of exactly how the Bull Mountain RNG in the upcoming Ivy Bridge chips works? Will it be a true cryptographically secure RNG or just a high-speed PRNG implemented in hardware? If it is the real deal, how will it be accessed? Will each app need to be coded to make the required hardware calls, or will the kernel poll the entropy pool and keep it topped up with hardware-sourced bits as necessary?
I'm looking forward to this feature even more in light of the recent attacks against certificates generated with crappy entropy sources. True hardware random number generation should have been on die as soon as the vitally needed AES-NI instruction set was implemented. It's good to finally see more attention paid to security by hardware manufacturers.
Most people might not care, but I consider both of these killer features because I use full disk encryption on all my machines. Disk writes to my SSD array are currently CPU bound because of the full disk encryption. I also make heavy use of VPNs for work, OpenPGP for my email, and AES for large backups. With a heavy volume of data through user content and disk writes, I am looking forward to the efficiency improvements from a hardware implementation of AES. I hope the RNG will also be up to the task.