Ex-Cyber: that's the technical issue here, but it doesn't change the fact that it's Intel's responsibility. PowerVR don't have any commitment to being a good Linux citizen, they're a hardware company, they make chips and that's more or less it.
However, Intel does have a very public commitment (which, to their credit, they mostly keep) about being a good citizen for Linux and open source in general. Which they've completely stuffed up in this case. Yes, the actual problem is that the hardware is something completely new, but it was Intel's decision to sub-license this hardware and release it - under the 'GMA' product line, no less - without, apparently, giving any thought to the ramifications for Linux and open source support.
From what I know, it seems there was basically a lack of communication within Intel between the side responsible for hardware decisions like this and the 'open source' folks. The open source side was not in a position to provide feedback to the hardware side about the problems this chip would present, and so it just went ahead regardless, and now the Intel open source folks are stuck playing catch-up. Bad organization.
A couple of things here. I don't recall Intel using PowerVR in any previous graphics chipset, though I think it was used in an earlier server system; I may be wrong. Intel's graphics chipsets have always been based on the technology they acquired when they bought Chips and Technologies back in the mid-'90s, their first product being the i740 graphics card. And yes, it sucked badly.
As for the GMA 500, the group that designed it was aiming for small size, low power, and low cost. Intel's graphics team is more focused on the desktop and laptop markets, which can afford more on all three counts. By using the PowerVR IP, they were able to take some of the best technologies available and combine them into a single chip roughly the size of a dime. That one chip has USB, PATA IDE (SATA would actually have increased the footprint), SSD, PCIe, a scaled-down HD Audio bus (it supports only two codec streams, versus four on the ICH), and a few other bells and whistles. The power consumption is phenomenally low compared to the GMA 900 series chipset.
As for the driver, Intel's driver development team had their hands full rewriting the 900 series driver, so the work was outsourced to Tungsten Graphics, the people who did the initial 900 series driver that everyone loved (remember the i915bios program for modifying the video shadow BIOS to get non-standard graphics modes like 1280x800?). The only problem was that Tungsten had either let go or reassigned their Linux developers, so they had to pull two engineers from the Vista development team to support the Linux driver (yes, they did Vista support for Poulsbo; imagine the pain).

This new team based their work on the kernel that shipped with Fedora Core 6, even though Fedora 9 was available and the official development platforms were Ubuntu Gutsy and Moblin.org. Every release had to be updated with minor patches just to build on Gutsy, and all patches had to be approved by PowerVR before being released. The latest driver has drifted so far from the new DRM kernel module core that it will take a top-down rewrite just to make it work, let alone to see any improvement. On top of that, given the licensing, I'm not sure Intel's developers even have access to the hardware documentation (pure speculation). It makes for a development nightmare all around.
I did get a chance to see one of these systems in action. It was playing HD video (1080i samples downloaded from Microsoft) with remarkable quality, and the CPU load was minimal according to top, run over ssh from another system: utilization averaged below 10% while the video was playing. There was also virtually no heat coming off either the Atom or the Poulsbo (neither of which had a heatsink during the demonstration).
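For anyone curious, the kind of number quoted above (overall CPU utilization, as top reports it over ssh) ultimately comes from sampling the aggregate "cpu" line in /proc/stat twice and looking at the idle share of the delta. Here's a minimal sketch, assuming Linux; the function names are my own, not from any of the drivers discussed:

```python
import time

def read_cpu_times():
    """Return (idle, total) jiffies from the aggregate 'cpu' line of /proc/stat."""
    with open("/proc/stat") as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    # Field 3 is idle; field 4 (if present) is iowait, which top also
    # treats as not-busy time.
    idle = fields[3] + (fields[4] if len(fields) > 4 else 0)
    return idle, sum(fields)

def cpu_utilization(interval=0.5):
    """Percent of non-idle CPU time over the sampling interval."""
    idle1, total1 = read_cpu_times()
    time.sleep(interval)
    idle2, total2 = read_cpu_times()
    dt = total2 - total1
    if dt == 0:
        return 0.0
    return 100.0 * (1.0 - (idle2 - idle1) / dt)

if __name__ == "__main__":
    print(f"CPU utilization: {cpu_utilization():.1f}%")
```

Run it over ssh on the target box while the video plays and you get the same kind of figure the poster describes, without needing a full top session.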
My personal feeling is that if Intel plans to go forward with its push into extreme mobile (MIDs, cell phones, etc.), it either needs to find another source of graphics IP or buy out PowerVR. But in today's economy, I don't think they're ready to spend the money.
I bought a Dell Mini 12 with Ubuntu. After it arrived, it turned out it didn't run standard Ubuntu; Ubuntu couldn't even display its graphical installer. I returned the thing for a refund, and my company will not be buying any of those devices; we'll be looking at other 12" netbooks, maybe based on VIA or AMD.
Ow... that just completely crushed my enthusiasm for getting a Pandora. My eee may be technically inferior, but at least I know it's supported...
Originally Posted by HyperDrive
First, to keep hopes from getting too high, there is not any work on this, nor is any work planned. That said...
My employer, Oregon State University, is deploying a series of handhelds, called the OSWALD, which use a similar graphics core (PowerVR SGX, the same family as the Beagle Board). We have discussed the matter, and if we take care of the basic obligations, such as distro management and package generation, and get all the other chips up and running in a timely manner, we can talk to the legal advisors and see whether reverse engineering the Texas Instruments blob for that chip is something the university could fund.
This is all still theoretical, of course. But it is possible.
This certainly hurts the established "Everything Intel Just Works (TM)" mentality in the Linux community. It's sometimes incredible what short-sighted moves such huge companies can make.
I think the Sega Dreamcast uses a PowerVR Series 2 chipset, whereas the PowerVR SGX 535 used by Intel is a Series 5 chipset. I don't know how similar they are.
Originally Posted by Ex-Cyber
Oh, it seems the Palm Pre and Nokia's successor to the N810 are also expected to have PowerVR SGX cores.
Maybe if there is enough interest and there are enough developers here, we could form a group to create fully free drivers for the hardware? It would probably help to have willing people who have done X.org driver development, and someone familiar with clean-room reverse engineering.
Since this driver seems to be in such poor shape, perhaps they should cease work on it and instead start from scratch on a Gallium3D driver.
Unless, of course, the idea from MostAwesomeDude pans out.
Still, the more Gallium3D drivers we have, the less driver maintenance will be needed in the future.