
Thread: Catalyst 10.1 and Xorg 7.5 / 1.7.x?

  1. #41
    Join Date
    Jul 2008
    Location
    Greece
    Posts
    3,801

    Default

Lowering clock speed in Catalyst for Windows and keeping the fan speed constant doesn't really offer much improvement. Maybe 2 or 3 °C, but that's it. Lowering voltage is what gives 15 °C lower temps. We're talking 90 °C vs 75 °C here.

    Maybe the effect is not as high with other cards, I don't know, but the 4870 gets *HOT*. It uses AMD's reference cooler, btw.
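The clocks-versus-voltage argument running through this thread follows from the standard first-order model of CMOS dynamic power, P ≈ C·f·V²: frequency enters linearly but voltage enters squared, which is why a voltage drop buys disproportionately more cooling (and a lower clock is usually what makes a lower voltage safe in the first place). A rough back-of-the-envelope sketch in Python, with the capacitance constant and the clock/voltage figures made up purely for illustration:

```python
# Dynamic power approximation: P = C * f * V^2
# C (effective switched capacitance) is an arbitrary made-up constant here;
# the clock and voltage numbers are also illustrative, not real 4870 specs.
def dynamic_power(freq_mhz, volts, c=1.0e-3):
    return c * freq_mhz * volts ** 2

stock = dynamic_power(750, 1.26)         # hypothetical stock clock/voltage
clocked_down = dynamic_power(500, 1.26)  # lower the clock only
volted_down = dynamic_power(500, 1.05)   # lower the clock *and* the voltage

print(f"stock:      {stock:.2f}")
print(f"clock only: {clocked_down:.2f} ({clocked_down / stock:.0%} of stock)")
print(f"clock+volt: {volted_down:.2f} ({volted_down / stock:.0%} of stock)")
```

Note that in this model lowering the clock alone still cuts dynamic power in proportion to the frequency ratio, which is bridgman's point below; it is the additional V² term that accounts for the much larger drop RealNC measures.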

  2. #42
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,578

    Default

    Before you lower voltage you normally need to drop engine and memory clocks, so you would presumably be seeing the accumulated benefit from all of those changes.

    Just to be clear, I'm not saying that lowering clocks alone is all you ever need, just disagreeing with the statement that lowering clocks alone makes no difference or is not worth doing. Lowering clocks is what the driver can do today, and if your GPU is running too hot you should be doing it even if *additional* power saving functionality can be added to the driver in the future.

  3. #43
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by Darksurf View Post
Forgive me, but have you tried using Mesa 7.7 + Xorg 7.5 + 2.6.32 kernel + xf86-video-ati-9999? I'm getting great results. I'm using Sabayon Linux (Gentoo-based), have added the x11 overlay, and have been using the xf86-video-ati-9999 ebuild to stay on top of git, and have found this combination to work better than I would have ever imagined! I'm sure the 2.6.33/34 kernel will be even better! It's crazy how well the drivers have progressed and cleaned up over time! I'm not sure why the last release is 6.12.4, though, considering that it's kind of an old release by now and they have a far better working version in git right now... I'm not ever going back to the fglrx drivers now.
(Working on X200m (better than windoze), HD2600, HD3300, HD3870, HD4650.) Yeah, I know. I'm an ATI fanboy with too many ATI cards (and computers) and no Nvidia cards (except for my PS3, but that doesn't count).
jihhhaaa :-) A PS3? Shame on you ...

  4. #44
    Join Date
    Jul 2008
    Location
    Greece
    Posts
    3,801

    Default

    Quote Originally Posted by bridgman View Post
    Just to be clear, I'm not saying that lowering clocks alone is all you ever need, just disagreeing with the statement that lowering clocks alone makes no difference or is not worth doing. Lowering clocks is what the driver can do today, and if your GPU is running too hot you should be doing it even if *additional* power saving functionality can be added to the driver in the future.
My point is that the OSS driver is not good for my card because of this, and it's why I keep using Catalyst. The OSS drivers don't do the stuff AMD's/ATI's hardware designers intended a driver to do, and correct driver-level power management is an important part of that. I do not want the card to be useless a year from now; I intend to use it in older machines when I get a new card at some point. I do not trust the OSS drivers at this point; I believe they will result in hardware damage in the long run unless they implement power management correctly.

  5. #45
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,578

    Default

    No question, the current power management code imposes a performance penalty in exchange for reducing power. The point here is that some power saving features are available today, and advising someone not to use those features and saying they don't make any difference just because you don't see a big change on *your* system is not helping anyone.
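For reference, the low-power mode under discussion was exposed as a driver option in the Device section of xorg.conf on the pre-KMS radeon driver. Exact option names and availability depend on the driver version, so treat this as an illustrative sketch rather than a guaranteed recipe:

```
Section "Device"
    Identifier "ATI Radeon"
    Driver     "radeon"
    # Force the lowest power state at all times
    # (trades performance for lower power draw and temps).
    Option     "ForceLowPowerMode" "on"
    # Optionally also let the driver downclock when idle.
    Option     "DynamicPM"         "on"
EndSection
```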

  6. #46
    Join Date
    Jul 2008
    Location
    Greece
    Posts
    3,801

    Default

No advisories given by me on whether to enable low power mode or not. I just pointed out that power management in the OSS drivers is lacking big time, and if that's important to someone, they should stick to fglrx for now.

  7. #47
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,578

    Default

    Quote Originally Posted by RealNC View Post
It doesn't do anything other than downclocking (last time I checked). Voltages stay at default values. Since only voltage matters for temps, downclocking doesn't do anything helpful. Temps stay the same if the voltage doesn't go down, no matter how much you downclock the card.
Perhaps I misread your comment above, but it seemed to me you were saying that the ForceLowPowerMode option would not have any effect on temps.

  8. #48
    Join Date
    Jul 2008
    Location
    Greece
    Posts
    3,801

    Default

It's like a dude I know here who tries to sell me stuff with a 1% rebate compared to other stores. Yes, 1% is a number, but a rather insignificant one.

  9. #49
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,578

    Default

Sure, so is zero, but that's not the point. The ForceLowPowerMode option usually makes a useful difference in power consumption and GPU temps, a lot more than 1%, even if you are measuring in kelvins rather than degrees Celsius.
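The Kelvin remark checks out arithmetically: taking the 90 °C vs 75 °C figures quoted earlier in the thread, the drop is well above 1% on either scale:

```python
# The 90 °C -> 75 °C drop quoted earlier, as a percentage on both scales.
hot_c, cool_c = 90.0, 75.0
drop = hot_c - cool_c                  # 15 degrees on either scale

pct_celsius = drop / hot_c             # relative to 90 °C
pct_kelvin = drop / (hot_c + 273.15)   # relative to 363.15 K

print(f"{pct_celsius:.1%} in Celsius terms, {pct_kelvin:.1%} in Kelvin terms")
```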

  10. #50
    Join Date
    Jun 2009
    Posts
    23

    Default

    Quote Originally Posted by RealNC View Post
My point is that the OSS driver is not good for my card because of this, and it's why I keep using Catalyst. The OSS drivers don't do the stuff AMD's/ATI's hardware designers intended a driver to do, and correct driver-level power management is an important part of that. I do not want the card to be useless a year from now; I intend to use it in older machines when I get a new card at some point. I do not trust the OSS drivers at this point; I believe they will result in hardware damage in the long run unless they implement power management correctly.
Don't take this the wrong way, but have you paid much attention to the temperature tolerances of AMD and ATI hardware? Hell, the Turion64 single-core CPU was allowed to get up to 80C without a second thought, and was set to power off at 90C. AMD and ATI parts have always run a little hot, but that's due to lots of power in a little area.

It's not like they didn't expect the GPU to get that hot. Try playing high-end video games for hours; they didn't make this GPU expecting you to not put a load on it, or to constantly run it in performance mode. Otherwise you would be running an X200m lol.

