Shaders vs. ASIC
To me, using the "all-purpose" shaders for video decode is only an intermediate solution: it keeps them busy with work that a dedicated ASIC was built for. The ASIC will always do it better, especially with far lower power consumption (probably down in the mW range).
And: Some flamers here seem to have way too much time.
Nvidia? They don't support free drivers at all. They just "won't sue nouveau".
Intel: not much GPU power here, and the driver is so-so. But do they actually have a dedicated video decode unit, i.e. something that also touches content decryption of "commercial media"?
Matrox, VIA, SiS etc. : lol.
Imagination Technologies: ROFL.
No company can just hand out specs that touch these things without legal review. Otherwise they could easily be attacked by legal means, e.g. with an injunction stopping sales.
And most of them don't publish anything at all.
I'd certainly welcome any UVD functionality, but I accept that this isn't an easy topic and needs careful handling first.
So +1 from me for archibald.
And to those flaming AMD/ATI: regardless of me being a fan of theirs, think about it and ask whether any other GPU vendor is better in this regard. Hint: they aren't. So please stop flaming AMD for something that is clearly NOT their fault. Flame Disney, GEMA, VG Wort, the MPAA, the RIAA, Warner, Universal and the like for making devices more expensive than necessary and not fully usable in a free OS environment.
Stop TCPA, stupid software patents and corrupt politicians!