A question that has me baffled: when I was running an HD3850, CCC reported the card's clocks as 668/825. I'm currently running an HD2900PRO, and CCC reports the clocks as 507/514. Should I assume the card switches to its actual frequencies (600/800) when I run 3D apps? The card doesn't support PowerPlay or any sort of frequency scaling (as far as I know) with the current drivers, so I don't see how it could be switching clocks like that. Where else could I look at my card's frequencies besides what the driver reports?
Is your 2900PRO also identified as a 2900XT? Mine is, and the clocks are reported as you stated. Also, some apps have issues correctly identifying the card's available RAM. I've got the 1GB GDDR4 version, and the ETQW demo, specifically, often reports it as having only 256MB. Luckily they allow you to override the setting. CCC reports the correct amount, however.
I would like to hear more about the clock increase for OpenGL stuff, though, mostly so I can verify that it does indeed clock up.
Yes, the card is identified as an XT. I think the clocks CCC shows correspond to the "2D clocks". The same thing happened with my X1950XT: 500/500 corresponded to the low-power mode Windows used while in 2D (600/900 being the actual "3D clocks"). With the X1950XT, aticonfig showed only one power state (500/500), which it labeled "Overdrive"; one of the logs (I think it was message.log) did show all the correct speeds, but gave no indication that the frequencies were switching. With the HD2900 there are no log entries, and aticonfig just complains that PowerPlay is not supported. As I said, the clocks for the HD3850 are displayed correctly.
A theory I have is that the X1950XT and HD2900 have driver support for frequency scaling, regardless of what aticonfig or CCC says, and switch automatically, while the HD3850 doesn't have such support from the driver yet, so it runs at full speed all the time. Just a theory, but it would make some sense. I wish there were at least some kind of benchmark for graphics cards so I could at least guess at what is happening with the frequencies.
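In the meantime, glxgears makes a very crude indicator: if the clocks really do ramp up under 3D load, its frame rate should differ noticeably between the "2D" and "3D" states. Here's a small sketch that runs glxgears for a while and averages the "N frames in 5.0 seconds = X FPS" lines it prints; the `run_glxgears` helper and the 30-second window are my own choices, not anything from the driver, and vsync or CPU load will skew the number:

```python
import re
import subprocess

# glxgears periodically prints lines like:
#   3000 frames in 5.0 seconds = 600.000 FPS
FPS_RE = re.compile(r"=\s*([\d.]+)\s*FPS")

def parse_fps(line):
    """Pull the FPS figure out of one glxgears report line, or None."""
    m = FPS_RE.search(line)
    return float(m.group(1)) if m else None

def average_fps(lines):
    """Average the FPS figures found in a stream of glxgears output lines."""
    vals = [v for v in (parse_fps(l) for l in lines) if v is not None]
    return sum(vals) / len(vals) if vals else 0.0

def run_glxgears(seconds=30):
    """Run glxgears for roughly `seconds` and average its FPS reports.
    Crude: vsync, window size, and background load all affect the result."""
    proc = subprocess.Popen(["glxgears"], stdout=subprocess.PIPE, text=True)
    try:
        out, _ = proc.communicate(timeout=seconds)
    except subprocess.TimeoutExpired:
        proc.kill()
        out, _ = proc.communicate()
    return average_fps(out.splitlines())
```

It won't prove the clocks are switching, but a run that's no faster on a supposedly faster state is at least a hint.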
Still, the whole thing has me confused and annoyed, especially if it turns out that the card can't perform at full speed. No one seems to have a straight answer, not even on the beta mailing list.
Hm. Flashed my PRO to an XT. One would expect some form of performance increase in glxgears, yet I keep getting the same numbers. What makes it even stranger is that the old HD3850 I had in this PC got about 10K more in glxgears than this XT, and there is no way a stock HD3850 is faster than a 2900XT.
I'll add this question to our "ask ATI" list if it's not already there.
While I'm at it, a quick request: if anyone feels the urge to reflash their BIOS, *please* be sure to reflash with an image from a card that has the same connectors and connector/GPU wiring. Otherwise, as you add monitors and do more display-related things down the road (hot-plugging, etc.), you and the driver are both going to be very confused.