
Thread: Linux 3.5 Can Massively Boost AMD Radeon Graphics

  1. #31
    Join Date
    Feb 2010
    Posts
    28

    Default

    Quote Originally Posted by Wyatt View Post
    Ah, I see. Thanks for clearing that up, though I really wish this had been addressed in the article in the first place.

    Thanks, I'm just speculating. My personal dream from these changes: a killer APU with OpenCL for my Gentoo box. My Phenom II X6 eats emerge @system on a budget.

    ~Jux

    "CFLAGS=-march=native -Os -pipe -ggdb"

  2. #32
    Join Date
    Dec 2007
    Posts
    2,329

    Default

    Quote Originally Posted by crazycheese View Post
    That's a circle, because if the driver is bad, no one will purchase the GDDR-equipped card. Something has to come first, and I presume it has to come from the development side.
    Those cards were not affected by these commits. There are certainly more tweaks that can be enabled to improve performance on them, as I mentioned in another reply, but in this case these commits will not affect them. Also, GDDR cards will always perform better than DDR cards, so if you want better performance (regardless of driver improvements) from a 5450 or 6450, get the GDDR version.

  3. #33
    Join Date
    Nov 2009
    Location
    Italy
    Posts
    907

    Default

    Quote Originally Posted by evolution View Post
    Furthermore, I'm also expecting that the "default" profile will stop always running the GPU at its maximum frequency, because that kills the GPU's lifetime. (Nouveau does the opposite, btw...)
    That's a fucking stupid thing to worry about; my HD5870 is under full OpenCL load 24/7. Maybe it will die in 15 years instead of 20, but who cares? People keep saying the same thing about CPUs too; I have an old P4 that has been overclocked for 12 years and which I still use daily.
    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)

  4. #34
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,386

    Default

    Quote Originally Posted by evolution View Post
    Furthermore, I'm also expecting that the "default" profile will stop always running the GPU at its maximum frequency, because that kills the GPU's lifetime. (Nouveau does the opposite, btw...)
    The "default" profile uses the settings marked as "default" in the VBIOS. Some cards have high default settings in the VBIOS, others have low default settings.

  5. #35
    Join Date
    Dec 2007
    Posts
    2,329

    Default

    The default profile uses the clocks that are set by the VBIOS at boot, i.e., the clocks in effect when you turn on the computer, before the OS loads. As Bridgman noted, on some boards they are higher, on others they are lower; whether they match the high clocks varies from board to board. Nouveau does the same thing. The only difference is that NVIDIA tends to set the boot-up clocks lower.
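    For reference, a quick way to inspect and change this with the radeon KMS driver is through sysfs; a sketch, assuming the card shows up as card0 (run as root, and the last read needs debugfs mounted):

        # which power management method is active: "profile" or "dynpm"
        cat /sys/class/drm/card0/device/power_method
        # which profile is selected: default/auto/low/mid/high
        cat /sys/class/drm/card0/device/power_profile
        # force the low clocks, for example
        echo low > /sys/class/drm/card0/device/power_profile
        # current engine/memory clocks as the driver sees them
        cat /sys/kernel/debug/dri/0/radeon_pm_info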

  6. #36
    Join Date
    Nov 2009
    Location
    Italy
    Posts
    907

    Default

    By the way, just the usual reminder:
    Michael, please stop benchmarking APUs with low clocks.
    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)

  7. #37
    Join Date
    Jul 2008
    Posts
    1,720

    Default

    Quote Originally Posted by evolution View Post
    Well, in opposition to the hype I've seen in the article, I'll take a more "conservative" approach: when 3.5 stable arrives, I'll give the FOSS ATI drivers another try (I went back to Catalyst because of VAAPI and proper PM). From what I've seen, we'll have better 3D performance from now on (but does that apply to r600/r700 cards? I didn't see any card of that generation tested in the article...).

    Furthermore, I'm also expecting that the "default" profile will stop always running the GPU at its maximum frequency, because that kills the GPU's lifetime. (Nouveau does the opposite, btw...)

    Finally, in the medium/long run, it'd be nice to have H.264 VDPAU/VAAPI/UVD acceleration. That would help those who still have weak CPUs (e.g. AMD E-350s, low-end Llanos, Nehalem or Core 2 Duo chips).

    Cheers
    Unlike with NVIDIA (google "bumpgate"), always running at maximum clock has no influence on the lifetime of the GPU.

    It might have an influence on the power circuitry... but you said GPU.

  8. #38
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    4,995

    Default

    Quote Originally Posted by agd5f View Post
    Also, gddr cards will always perform better than ddr cards, so if you want better performance (regardless of driver improvements) from a 5450 or 6450, get the gddr version.
    The GDDR versions also use 5-15 W more power, hint hint.

  9. #39
    Join Date
    Jun 2012
    Posts
    293

    Default

    Quote Originally Posted by energyman View Post
    unlike nvidia (google bumpgate), always maximum clock has no influence on the lifetime of the gpu.

    It might have an influence on the power circuitry... but you said gpu
    At the very least it produces a lot of heat and plenty of cooler noise on any reasonably powerful GPU, which isn't good either. Why not enable something like dynpm by default? CPUs have scaled their frequency by default almost everywhere for ages. Maybe it's time for GPUs as well, especially powerful ones?

  10. #40
    Join Date
    Oct 2008
    Posts
    3,038

    Default

    Quote Originally Posted by 0xBADCODE View Post
    At the very least it produces a lot of heat and plenty of cooler noise on any reasonably powerful GPU, which isn't good either. Why not enable something like dynpm by default? CPUs have scaled their frequency by default almost everywhere for ages. Maybe it's time for GPUs as well, especially powerful ones?
    Because it's still buggy, and someone has to fix it.
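    For anyone who wants to try it today anyway, dynpm can be switched on by hand; a sketch, assuming the radeon KMS driver and card0 (it is known to cause flicker on some setups, which is part of why it is not the default):

        # switch from the profile-based method to dynamic reclocking
        echo dynpm > /sys/class/drm/card0/device/power_method
        # revert to the profile method if it misbehaves
        echo profile > /sys/class/drm/card0/device/power_method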
