I submitted about 25 different, brand-new laptops, but there was no way of knowing whether a submission was successful or how far it had gone in the process. After weeks of waiting and multiple exchanges with the maintainers, still none of the laptops had ended up on Ubuntu Friendly, and I just gave up on the whole thing.
And it provides a composite index based on massive amounts of data - http://openbenchmarking.org/index - though no index based on power use, since there are many factors involved and not enough power-usage tests done by the community.
I think they need to add a hardware feedback tool and also enhance software center compatibility feedback. The hardware compatibility tool could just collect your hardware configuration at install, run a quick array of basic tests, and then ask if you want to submit it anonymously. Sorta like the "Windows Experience Index" does by benching your system at install, but instead of listing some relatively meaningless scores, just determine if your config passed. At least it would be more informative to a new user troubleshooting the install.
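The install-time flow described above (collect the hardware configuration, run a quick array of basic pass/fail tests, then ask before submitting anonymously) could look roughly like this. This is a minimal sketch, not any real tool: every function name and test here is made up for illustration, and a real implementation would probe graphics, sound, network, suspend, and so on.

```python
# Hypothetical sketch of the proposed hardware feedback tool:
# collect an anonymous hardware snapshot, run simple pass/fail
# checks, and build a report the user can choose to submit.
import json
import platform


def collect_hardware_profile():
    """Gather a minimal, anonymous hardware/OS snapshot."""
    return {
        "machine": platform.machine(),
        "processor": platform.processor(),
        "kernel": platform.release(),
    }


def run_basic_tests():
    """Each check yields pass/fail, not a benchmark score --
    the point is 'did your config pass', not a meaningless number."""
    return {
        # A real tool would test graphics, sound, network, suspend, etc.
        "cpu_arch_detected": bool(platform.machine()),
        "kernel_detected": bool(platform.release()),
    }


def build_report():
    """Assemble the report that would be submitted anonymously."""
    results = run_basic_tests()
    return json.dumps({
        "profile": collect_hardware_profile(),
        "results": results,
        "passed": all(results.values()),
    })


if __name__ == "__main__":
    print(build_report())
```

The single `passed` flag is the key design choice: a new user troubleshooting an install gets "your config passed" or "this check failed" rather than a Windows Experience Index style score.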
As for the Software Center: the feedback tool is lacking. Rather than just providing a few lines for comments and a star rating, have reviewers say which version of Ubuntu they are running and how well the app runs for them. That feedback could help determine which apps are functional and which are deprecated and should be dropped or hidden from the SC based on the release you are running. Allow users to review third-party apps, and, if feedback is favorable, consider adding them to the SC by default.
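A review record along the lines proposed above might carry the release and a functional rating alongside the star rating, so the SC could filter per release. This is purely illustrative; all field names and the hiding rule are assumptions, not anything the Software Center actually does.

```python
# Illustrative sketch of the richer review record proposed above:
# each review carries the Ubuntu release and whether the app
# actually functions, so the store could hide apps with mostly
# negative reports on the release the user is running.
from dataclasses import dataclass


@dataclass
class AppReview:
    app: str
    release: str      # e.g. "11.04"
    stars: int        # the existing star rating
    runs_well: bool   # new: does the app actually function?
    comment: str = ""


def usable_on(reviews, app, release, threshold=0.5):
    """Hide an app when most reviews on this release say it's broken."""
    relevant = [r for r in reviews if r.app == app and r.release == release]
    if not relevant:
        return True  # no data: don't hide by default
    working = sum(r.runs_well for r in relevant)
    return working / len(relevant) >= threshold


reviews = [
    AppReview("foo-editor", "11.04", 4, True),
    AppReview("foo-editor", "11.04", 1, False),
    AppReview("foo-editor", "10.10", 5, True),
]
print(usable_on(reviews, "foo-editor", "11.04"))  # True at the default threshold
```

The per-release filter is the point: the same app can be perfectly usable on one release and deprecated on the next, and a flat star average hides that.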
AIUI, Smolt isn't being decommissioned; it's being rewritten without the single interface. The core Smolt will essentially be a specialized database plus an interface; you write your own way to talk to it. So instead of one UI that decidedly doesn't fit all, you can have multiple tools for submitting hardware, different tools for querying the data, and so on.
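That split - a core that only stores and queries, with interchangeable front-ends on top - can be sketched as below. To be clear, this is not the actual Smolt API; every class and function here is hypothetical, just showing the shape of the architecture described.

```python
# Rough sketch of the described architecture: a minimal core that
# stores hardware submissions and answers queries, with separate,
# swappable front-ends built on top. Entirely hypothetical.
from collections import Counter


class HardwareStore:
    """The 'specialized database' core: store submissions, answer queries."""

    def __init__(self):
        self._submissions = []

    def submit(self, record):
        self._submissions.append(record)

    def count_by(self, field):
        return Counter(r.get(field, "unknown") for r in self._submissions)


# One possible front-end: a CLI-style submission tool.
def cli_submit(store, device, status):
    store.submit({"device": device, "status": status})


# Another: a statistics query tool, for questions like
# 'how many people are using this specific card model?'
def device_popularity(store):
    return store.count_by("device")


store = HardwareStore()
cli_submit(store, "SuperGraphics 5000", "works")
cli_submit(store, "SuperGraphics 5000", "crashes")
print(device_popularity(store)["SuperGraphics 5000"])  # 2
```

Because submitters and query tools only depend on the core's store/query surface, anyone unhappy with one UI can write another without touching the database.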
Of course, it has to actually _happen_ first.
Every crowdsourced HCL type effort like this that I've seen has failed. I'm not sure it's really possible to make a good one. There are several virtually insurmountable problems:
* The data get stale really, really quickly
* One person's 'works' is another person's 'barely functional'
* There's _so much_ damn hardware out there
Point one - so you ship WhizzyLinux 1.0 with a bug that makes systems with SuperGraphics 5000 cards crash every hour. Then three months later you ship an update that fixes it. Does WhizzyLinux 1.0 work with SuperGraphics 5000 cards? What do you think the answer provided by your crowdsourced HCL will be?
Point two - WhizzyLinux 1.0 renders the desktop just fine on SuperGraphics 6000 cards. But it doesn't have any 3D support. Except with the proprietary Super driver, of course. Is that 'works'? 'Works somewhat'? 'Doesn't work'? 6, on a scale of 1 to 10? 9? 2? Remember, you probably have the same generic scale for every type of hardware! If you don't, you have to pay someone to sit there coming up with ever more specific categorizations of 'works' and 'doesn't work'!
Point three - so HP make this laptop, right. You can order it to come with any one of three processors, four graphics cards, two screens, five hard disks, three sound cards, and two wireless adapters. Then there's the Canadian version, the French version, the Japanese version, the U.K. version, the French Canadian version. And that's just one model! In the 'budget home office' range! Not the 'mid-price home' range, or the 'high-end enterprise' range! Or the specialist gaming range! Or the second label!
Even if you have the biggest user base out there, the info you get is going to be an absolute drop in the ocean compared to the sheer number of configurations available. The chances of any one user going to your crowdsourced HCL and finding a report from someone who has their _precise_ same system, running the precise same version of your distribution that that user wants to run, are pretty small, outside of some really established ranges of hardware like Thinkpads and Macs. But you don't need a crowdsourced HCL for Thinkpads and Macs. There are dedicated sites out there for them. The only scenarios crowdsourced HCLs are capable of covering very well are ones for which they are of very little use.
So, Mandriva's crowdsourced HCL petered out and was never much use. Canonical's, apparently, did the same. The others I recall went the same way. Smolt hasn't, exactly, but that's because it isn't really just an HCL. It has device and system ratings, sure. As an HCL it's terrible, probably worse than the others. I don't know of anyone who uses it as one. What it's somewhat useful for is as a source of statistical information; we use it to answer questions like 'crap, it seems this specific network card is broken; just how many people seem to be using this specific network card model?' For limited scenarios like that, it's fine. But yeah, seems to me, if you try to set up a friendly, user-facing crowdsourced HCL, you're going to wind up with a messy failure.
Maybe you could use OpenBenchmarking to list the most popular brands, most popular models, etc. - which models perform best, use the least power, and work well.
Mark Shuttleworth uses a Dell.
Linus Torvalds uses a MacBook Air.
Richard Stallman uses a Lemote (Chinese, MIPS-based).
I wonder what other developers use.
A lot of people at Red Hat use Thinkpads, because that's what you get as a corporate laptop by default. And they're pretty nice systems. There are exceptions, though. Jesse Keating uses a Mac. I run a Vaio Z. I know that back in 2009, Jim Whitehurst had an Acer Aspire One (in addition to a few other boxes, obviously), but I've no idea what he's got now. =)