An accelerator could be written using shaders like any GPGPU application, but no one has a working implementation, only a few unfinished Summer of Code projects. Everyone who could is busy trying to make 3D acceleration work instead.
The general consensus is that implementing shader-based video decode on top of Gallium3D makes more sense than doing a hardware-specific implementation for every different GPU family. The current focus on 3D work reflects that:
Classic 3D => Gallium3D => video decode over Gallium3D
One thing I haven't had time to check is whether XvMC could be useful for H.264 and VC-1 if only the MC portion of the API were used (i.e. not IDCT, where the MPEG-2 coefficients definitely won't work). My guess is that XvMC in its current form simply won't fit the newer video formats, so there's a strong argument for implementing a Gallium3D backend directly in the existing decoder stacks.
What about dropping all the packages for Jaunty and older Ubuntu releases from the xorg-edgers PPA? Or do you want to keep them for historical purposes?
Yes, we have been thinking about it. But if someone is still happily using the last updates we pushed for Hardy or Intrepid, it doesn't hurt to keep them there. Personally, I used the Hardy packages for a really long time on my work machine with an r500.
With 99%/50%, that means you've got a single thread pegged decoding 1080p. Since there's no hardware H.264/VC-1 decode acceleration for ATI under Linux yet, if you have a multi-core CPU I'd suggest hitting up mplayerhq.hu for the multithreaded mplayer. I ran into the same problem... if the 720p plays smoothly, that's almost certainly your problem, and not Xv.
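For reference, a minimal sketch of how one might run the threaded build once it's installed. This assumes a Linux/POSIX system; the video filename is a placeholder, and `-lavdopts threads=N` is MPlayer's libavcodec option for spreading decode across cores:

```shell
#!/bin/sh
# Detect the number of online CPU cores (POSIX getconf; assumption: Linux-like system).
CORES=$(getconf _NPROCESSORS_ONLN)
echo "Detected $CORES cores"

# With the multithreaded (ffmpeg-mt) MPlayer build, tell libavcodec to use them all.
# "video-1080p.mkv" is a hypothetical filename; uncomment to actually play:
# mplayer -lavdopts threads="$CORES" video-1080p.mkv
```

With a stock single-threaded build the `threads` option has no effect on H.264, which is why grabbing the ffmpeg-mt version matters here.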
I missed your post earlier. I'll try it out next week and let you know.