I'm currently running an HD 3200 on a 780G motherboard with MythBuntu 9.04 64-bit to power a MythTV setup (1024x768 output over a DVI->HDMI cable, since I was out of straight HDMI cables). The CPU is an X2 5000+ with 4GB of DDR2-800.
On the graphics side, I'm using Mesa from git master, agd5f's r6xx-r7xx-3d drm branch, and xf86-video-ati from git master. I've specified Option "AccelMethod" "EXA" in my xorg.conf, but other than that it's mostly blank.
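For reference, the relevant piece of my xorg.conf looks roughly like this (the Identifier string is just a label I picked; the rest is the stock radeon setup):

```
Section "Device"
    Identifier "Radeon HD 3200"
    Driver     "radeon"
    Option     "AccelMethod" "EXA"
EndSection
```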
I've got an ATSC tuner in the machine, and when watching live TV (MPEG-2?), one of my cores is pegged at around 85-99% the whole time. The process eating all of the cycles is X itself, with mythfrontend only occasionally popping up to take a share.
My main reason for posting is curiosity. Is there any reason to believe that switching to KMS/DRI2 would decrease my CPU utilization? Or has anyone dealt with a similar situation (high playback CPU utilization) and have recommendations for reducing the load?
I understand that eventually there will probably be shader-based decoders (and XvMC) sitting on top of Gallium on top of r600g, which I'm definitely looking forward to. Playback is still watchable even with the CPU running flat out; I'm just curious whether there's an easy fix right now to give it a little more breathing room.
You were right about the vsync issue. I turned off vsync using xvattr, and that eliminated essentially all of X's CPU utilization (it averaged under 5%, if even that); only the mythfrontend process was really doing decoding work.
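In case it helps anyone else, this was just a one-liner with xvattr. The XV_VSYNC attribute name is what the radeon textured video adaptor exposes here as far as I can tell; other drivers may name it differently, so check xvinfo first:

```shell
# List the Xv adaptors and see which vsync-related attribute
# your driver actually exposes (name varies by driver)
xvinfo | grep -i vsync

# Disable vertical sync for Xv playback on radeon
xvattr -a XV_VSYNC -v 0
```

Setting the value back to 1 re-enables vsync, so it's easy to flip between the two and compare CPU usage and tearing.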
The tearing was somewhat noticeable without vsync, though, so I'll probably leave vsync enabled until the driver picks up the changes to use interrupts for vsync in the future.