
Thread: Wrong frequencies for HD2900PRO?

  1. #1
    Join Date
    Dec 2007
    Location
    Merida
    Posts
    1,114

    Default Wrong frequencies for HD2900PRO?

    A question that has me baffled. When I was running an HD3850, CCC reported the card's clocks (668/825). I'm currently running an HD2900PRO, and when I check CCC it reports the clocks as 507/514. Should I assume that the card switches to its actual frequencies (600/800) when I run 3D apps? As far as I know, the card doesn't support PowerPlay or any sort of frequency scaling with the current drivers, so I don't see how it could be switching clocks like that. Where else could I look at my card's frequencies besides what the driver reports?
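    One thing I could try from a terminal (not sure these aticonfig options are even available for this card/driver combination, so this is just a guess) is to ask the driver for its clocks and power states directly instead of going through CCC:

        # query the clocks the driver currently reports (Overdrive interface)
        aticonfig --od-getclocks
        # list whatever PowerPlay states the driver knows about, if any
        aticonfig --list-powerstates

    If the card really does switch to 600/800 under load, the reported clocks should change while a 3D app is running in another window.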
    Last edited by Melcar; 01-11-2008 at 03:05 AM.

  2. #2
    Join Date
    Jan 2008
    Posts
    2

    Default

    I'm wondering...
    Is your 2900 Pro also identified as a 2900 XT? Mine is, and the clocks are reported the same way you describe. Some apps also have trouble correctly identifying the card's available RAM. I've got the 1GB DDR4 version and it is often reported as having only 256MB, by the ETQW demo specifically. Luckily they let you override the setting; CCC reports the correct amount, however.

    I would like to hear more about the clock increase for OpenGL stuff, though, mostly so I can verify that the card does indeed clock up.

  3. #3
    Join Date
    Dec 2007
    Location
    Merida
    Posts
    1,114

    Default

    Yes, the card is identified as an XT. I think the clocks CCC shows correspond to the "2D clocks". It was the same with my X1950XT: 500/500 corresponded to the low-power mode Windows used while in 2D, with 600/900 being the actual "3D clocks". With the X1950XT, aticonfig showed only one power state (500/500), which it labeled "Overdrive"; one of the logs (I think it was message.log) did show all the correct speeds, but gave no indication that the frequencies were switching. With the HD2900 there are no log entries, and aticonfig just complains that PowerPlay is not supported. As I said, the clocks for the HD3850 are displayed correctly.
    My theory is that the driver supports frequency scaling on the X1950XT and HD2900, regardless of what aticonfig or CCC says, and switches clocks automatically, while the HD3850 doesn't have such support from the driver yet and so runs at full speed all the time. Just a theory, but it would make some sense. I wish there were at least some kind of benchmark for graphics cards so I could at least guess at what is happening with the frequencies.
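    For lack of a real benchmark, one rough check (glxgears is only a sanity test, the numbers are really only comparable against the same machine) would be to grab a few FPS samples and see whether they move between configurations:

        # glxgears prints a frame count roughly every 5 seconds;
        # take the first few samples and let it exit
        glxgears | head -n 5

    If the numbers stay identical no matter what, either the clocks aren't changing or something else is the bottleneck.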
    Still, the whole thing has me confused and annoyed, especially if it turns out that the card isn't able to perform at full speed. No one seems to have a straight answer, not even on the beta mailing list.

  4. #4
    Join Date
    Dec 2007
    Location
    Merida
    Posts
    1,114

    Default

    Hm. Flashed my Pro to an XT. One would expect some form of performance increase in glxgears, yet I keep getting the same numbers. What makes it even stranger is that the old HD3850 I had in this PC got about 10K more in glxgears than this XT, and there is no way a stock 3850 is faster than a 2900XT.
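    Before reading too much into the glxgears numbers it's probably worth double-checking that rendering is actually going through fglrx and not falling back to indirect/software rendering (just a generic sanity check, nothing card-specific):

        # should report "direct rendering: Yes" and name the ATI/fglrx renderer
        glxinfo | grep -i "direct rendering"
        glxinfo | grep -i "renderer string"
        fglrxinfo

    If glxgears is vsync-limited or CPU-bound it will report roughly the same numbers no matter what the GPU clocks are, which might also explain the 3850 vs. 2900XT oddity.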

  5. #5
    Join Date
    Dec 2007
    Location
    Merida
    Posts
    1,114

    Default

    No one has any insights on this? I would really like to know if the driver does indeed ramp up the clocks. I see no mention of anything that would indicate such a thing in the system logs.
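    One thing I could still try (assuming the Overdrive query from my first post actually works on this card) is to poll the reported clocks once a second while a GL app runs in another terminal, and watch whether they ever move off 507/514:

        # sample the reported clocks repeatedly while something 3D is running
        while true; do aticonfig --od-getclocks | grep -i clock; sleep 1; done

    If they never change under load, that would at least answer the question one way.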

  6. #6
    Join Date
    Dec 2007
    Location
    Merida
    Posts
    1,114

    Default

    No one has any idea? I just want some form of feedback on this.

  7. #7
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,575

    Default

    I'll add this question to our "ask ATI" list if it's not already there.

    While I'm at it, a quick request: if anyone feels the urge to reflash their BIOS, *please* be sure to reflash with an image from a card that has the same connectors and connector/GPU wiring. Otherwise, as you add monitors and do more display-related things down the road (hot-plugging, etc.), you and the driver are both going to be very confused.
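    If you do go ahead and flash anyway, at least keep a copy of the original image around first. On Linux you can usually dump it through the card's PCI sysfs "rom" file as root (the bus address below is only an example, check lspci for the right one):

        # find the card's address first, e.g.:  lspci | grep -i vga
        echo 1 > /sys/bus/pci/devices/0000:01:00.0/rom    # allow reading the ROM
        cat /sys/bus/pci/devices/0000:01:00.0/rom > vbios-backup.rom
        echo 0 > /sys/bus/pci/devices/0000:01:00.0/rom    # turn it back off

    That way there is always a known-good image to go back to if things get weird later.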

    And somehow it always ends up as my fault.
    Last edited by bridgman; 01-15-2008 at 10:07 PM.

  8. #8
    Join Date
    Dec 2007
    Location
    Merida
    Posts
    1,114

    Default

    Quote Originally Posted by bridgman View Post
    I'll add this question to our "ask ATI" list if it's not already there.

    ...

    It should be there already. Been asking this all over the place in the hopes of some kind of feedback.
