Thread: Power hunger of AMD processors / AMD-based systems

  1. #1
    Join Date
    Apr 2010
    Posts
    1,946

    Default Power hunger of AMD processors / AMD-based systems

    I'm a big AMD supporter, in large part because of their open-source policy. However, no matter which processor comparison test I look at, AMD systems are always drawing a lot more power.

    For desktops it's about 15-20 watts more at idle, and about 20-40 watts more under load.


    For example, let us take the Athlon II X4 630 and the Intel Core i5-750. The almost cache-less 630 consumes way more juice than the 750 while also performing worse. In my country electricity is not cheap, and over a long 2-3 year window that makes the i5-750 cheaper in total cost than the Athlon II X4.

    I found a way to undervolt my Athlon II X4 630 from the ridiculously high 1.4 V Vcore (otherwise only found on full-blown Phenom IIs) to 1.25 V, leading to a consumption drop of around 25 watts under load (120 W instead of 145 W) and 10 watts at idle (100 W -> 90 W), with zero impact on stability. My mainboard's logic lets me reduce the voltage by a percentage rather than by a fixed value, so the reduction scales down very well when the CPU drops into Cool'n'Quiet mode.

    Prior to my switch from an Intel E5300/GF9800GT to a full AMD system, I had the opportunity to play with Intel's SpeedStep, which basically reduced the CPU multiplier to 6, so both cores still ran at 1.2 GHz. On the Athlon II X4, thanks to Cool'n'Quiet (and the ondemand governor), all four cores drop to just 800 MHz with the Vcore reduced as well.
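    For reference, on Linux you can watch what Cool'n'Quiet plus the ondemand governor are actually doing by reading the standard cpufreq files in sysfs; a minimal sketch (the printout format is mine, the paths are the stock cpufreq ones):

    Code:
    # Minimal sketch: print the active governor and current frequency per core on Linux.
    # Reads the standard /sys/devices/system/cpu/cpuN/cpufreq files (values are in kHz).
    import glob

    for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq")):
        with open(path + "/scaling_governor") as f:
            governor = f.read().strip()
        with open(path + "/scaling_cur_freq") as f:
            mhz = int(f.read()) / 1000
        core = path.split("/")[-2]
        print(f"{core}: governor={governor}, {mhz:.0f} MHz")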

    My main questions are:

    1) Why is AMD's K10 so much worse in performance per watt than Core or even Core 2 Duo? What's the reason behind such a big difference?
    2) Will there be any change with Bulldozer?
    3) Why is the Athlon II X4 spec'ed at 1.4 V when it runs just fine at 1.25 V (or even 1.20 V, if you search around)?

    Please, no Intel fanboyism. Ty.

  2. #2
    Join Date
    Oct 2009
    Posts
    2,110

    Default

    Quote Originally Posted by crazycheese View Post
    For example, let us take the Athlon II X4 630 and the Intel Core i5-750. The almost cache-less 630 consumes way more juice than the 750 while also performing worse. In my country electricity is not cheap, and over a long 2-3 year window that makes the i5-750 cheaper in total cost than the Athlon II X4.

    I found a way to undervolt my Athlon II X4 630 from the ridiculously high 1.4 V Vcore (otherwise only found on full-blown Phenom IIs) to 1.25 V, leading to a consumption drop of around 25 watts under load (120 W instead of 145 W) and 10 watts at idle (100 W -> 90 W), with zero impact on stability. My mainboard's logic lets me reduce the voltage by a percentage rather than by a fixed value, so the reduction scales down very well when the CPU drops into Cool'n'Quiet mode.
    Your numbers don't make sense. That is a 125 watt chip. It can't eat 145 unless you are overclocking/overvolting. If it is, then something else is really nuts.

    Idle power consumption of that chip should be around 10-15 watts, not 90-100. That's just crazy.

    Unless you're measuring full system power consumption... in which case you have other things to think about than just the CPU... the chipset and graphics card, for example. An Intel system will typically have an Intel GPU, which is super weak and probably doesn't eat much power... so on a wall-socket power measurement, an Intel system, even with a CPU that eats more power, might still have a lower "full system" power consumption.

    Note: according to this: http://www.behardware.com/articles/7...0-and-630.html --- that Athlon 630 draws 12.6 W at idle and 80.4 W flat out.

    1) Why is AMD's K10 so much worse in performance per watt than Core or even Core 2 Duo? What's the reason behind such a big difference?
    The fact that there is more to the story than the CPU.

    2) Will there be any change with Bulldozer?
    Again, more to the story than just the CPU!!!

    3) Why is the Athlon II X4 spec'ed at 1.4 V when it runs just fine at 1.25 V (or even 1.20 V, if you search around)?
    Error margin to ensure maximum stability for everyone, including chips that are a little "out"... improves yields and saves money (for them).

  3. #3
    Join Date
    Apr 2010
    Posts
    1,946

    Default

    Quote Originally Posted by droidhacker View Post
    Your numbers don't make sense. That is a 125 watt chip. It can't eat 145 unless you are overclocking/overvolting. If it is, then something else is really nuts.

    Idle power consumption of that chip should be around 10-15 watts, not 90-100. That's just crazy.

    Unless you're measuring full system power consumption... in which case you have other things to think about than just the CPU... the chipset and graphics card, for example. An Intel system will typically have an Intel GPU, which is super weak and probably doesn't eat much power... so on a wall-socket power measurement, an Intel system, even with a CPU that eats more power, might still have a lower "full system" power consumption.

    Note: according to this: http://www.behardware.com/articles/7...0-and-630.html --- that Athlon 630 draws 12.6 W at idle and 80.4 W flat out.


    The fact that there is more to the story than the CPU.


    Again, more to the story than just the CPU!!!


    Error margin to ensure maximum stability for everyone, including chips that are a little "out"... improves yields and saves money (for them).
    Hi, thanks for the reply! Of course these are whole-system figures, both at idle and under CPU load. The thing is, Intel draws way less, especially at idle but also under load. If you take a look at the link I provided in the first post, you will see that the Core i5-750 draws less than the Athlon II X4 at full load and way less at idle. In fact, the Core i3-530 and Core i7-870 draw the same at idle, while the i7-9xx draws more due to the additional memory channel (if my reading is correct).

    No way can a chip be overvolted that much just for stability reasons. Mind you, the Athlon II X4 started out as a normal Phenom II with the cache disabled, but later they introduced new, smaller dies, not just old ones with the cache disabled: the newer dies were physically Phenom II but with the L3 cache physically absent. I think they either run this high drain on purpose, or they just forgot and don't care.

    I think the reason for the lower Core i-series drain lies in the ability to shut off individual cores entirely instead of just driving them at lower settings.

    It would be nice if another Bulldozer feature were a revamped idle and load power management scheme.

    It's just that burning electricity is not as much fun as burning rubber.

  4. #4
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    One part is die selection: the desktop users always get the worst/bad dies...

    The notebook and server customers get the best dies.

    For example, you can buy an 8-core Opteron with an 80-watt TDP,

    but a desktop dual-core burns more than 80 watts...

    and the Opteron is 3-4 times faster in "7zip".

    Only the chipsets are sometimes better on the desktop side.

    If you really want perfect speed per watt, buy an AMD Fusion 4-core+APU in 2011...

    no system can beat that in power consumption.

  5. #5
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    "1) Why is AMD K10 draining so much more in performance per watt than Core or even Core2Duo? Whats the reason behind so much difference?"

    Intel CPUs get no pre-testing selection, which means you can get a good one if you are lucky.

    AMD pre-tests every die; the good ones go into Opteron servers and notebooks, and the bad ones go into desktop systems.


    "2) Will there be any change with Bulldozer?"

    Not really... if you want a more powerful system per watt, buy an Opteron system or a notebook.

    The desktop customers just don't care.

    But you can also build a power-saving AMD system: with the money you save by buying an AMD CPU, you can buy a better ATX power supply for your PC, or an SSD; that saves more than the CPU does.

    The worst ATX power supplies are 80-85% efficient, the best are 93%.

    A 3.5" HDD consumes 15-20 watts, an SSD only 0.5 watts.

    A good TFT uses 45 watts and a bad one 100 watts...

    You really can get a good system without burning money on Intel CPUs.

    And another way is to just buy a 6-core and downclock it to 2 GHz.
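    To put those PSU efficiency numbers in perspective, a quick sketch with an assumed 150 W component load (the load figure is made up for illustration):

    Code:
    # Sketch: wall draw for the same DC component load through PSUs of different efficiency.
    # The 150 W load is an assumption; the efficiencies are the 80% / 93% figures above.
    dc_load = 150.0  # watts actually drawn by the components (assumed)

    for name, eff in (("worst ATX unit (80%)", 0.80), ("best ATX unit (93%)", 0.93)):
        print(f"{name}: about {dc_load / eff:.0f} W at the wall")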

  6. #6
    Join Date
    Nov 2006
    Location
    /home/$USER
    Posts
    113

    Default

    Quote Originally Posted by crazycheese View Post
    800 MHz with the Vcore reduced as well.

    My main questions are:

    1) Why is AMD's K10 so much worse in performance per watt than Core or even Core 2 Duo? What's the reason behind such a big difference?
    Looking at the actual figures, it looks like you have something other than the CPU consuming most of the power in the system, or you have a PSU that is horrendously inefficient. Your at-the-wall power draw before you undervolted was 145 W at full load and 100 W at idle. A modern CPU at idle takes somewhere between 5 and 20 watts according to people who've hooked ammeters to the CPU power cables (e.g. Lost Circuits) and tested them. The fact that the difference between idle and load power draw was only 30-45 watts, against a 90-100 W idle baseline, says there's something consuming power irrespective of the CPU (or its VRM circuitry). The biggest offenders are GPUs that fail to clock down fully when they idle, particularly if you have two monitors attached to the GPU. I know my GTS 250 clocks down to 300 core/100 memory at idle with one monitor attached but always runs at the full 738/1100 with both of my monitors attached. Other things that can suck power are a bunch of HDDs, some peripheral cards like higher-end SATA/SAS controllers, fans, lights, and inefficient chipsets. Also, a power supply that is of poor quality or is old will be far less efficient than newer models: modern PSUs are 80-90+% efficient, while old ones tend to be somewhere between 55% and 70% efficient. And if you have an enormous PSU like a 1000 W+ unit and you are only drawing 100 W or so from it, efficiency can also be poor even if the PSU is decent when run at a reasonable load for its size.
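    To make that concrete, here is a back-of-the-envelope sketch using the poster's wall readings; the 75% PSU efficiency is a guess for an old or lightly loaded unit, not a measurement:

    Code:
    # Sketch: split the wall readings into "CPU-attributable" and "baseline" power,
    # assuming a PSU efficiency of 75% (an assumption, not a measured value).
    psu_efficiency = 0.75
    wall_idle, wall_load = 100.0, 145.0   # the poster's at-the-wall readings in watts

    dc_idle = wall_idle * psu_efficiency  # power actually delivered to the components
    dc_load = wall_load * psu_efficiency
    cpu_delta = dc_load - dc_idle         # roughly what the CPU adds when loaded

    print(f"DC power at idle: ~{dc_idle:.0f} W, under load: ~{dc_load:.0f} W")
    print(f"CPU-attributable increase: ~{cpu_delta:.0f} W; "
          f"the ~{dc_idle:.0f} W idle baseline is mostly not the CPU")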

    2) Will there be any change with Bulldozer?
    I bet power management will be even better, but your system seems to have most of its power drawn by something other than the CPU, so I doubt changing to a Bulldozer-based CPU will help you much.

    3) Why is the Athlon II X4 spec'ed at 1.4 V when it runs just fine at 1.25 V (or even 1.20 V, if you search around)?
    It is specced that way so that AMD can make money selling Athlon II X4s for roughly a hundred bucks. The percentage of dies that will reliably run at a given speed decreases as you lower the voltage, so AMD set the voltage level fairly high to ensure very high yields. Higher yields mean AMD can sell the chips at a lower price and still make the money they need to. They could certainly lower the Vcore on most of the chips and be fine, but they'd end up with more chips that they'd have to turn into X3s or discard than they do now with the 1.40 V Vcore. Your $99 Athlon II X4 640 would end up costing more than $99, perhaps significantly more.

  7. #7
    Join Date
    Nov 2006
    Location
    /home/$USER
    Posts
    113

    Default

    Quote Originally Posted by droidhacker View Post
    Your numbers don't make sense. That is a 125 watt chip. It can't eat 145 unless you are overclocking/overvolting. If it is, then something else is really nuts.

    Idle power consumption of that chip should be around 10-15 watts, not 90-100. That's just crazy.
    The Athlon II series has TDPs of 45, 65, or 95 watts. Only some of the Phenoms carry a 125-watt TDP rating.

  8. #8
    Join Date
    Nov 2006
    Location
    /home/$USER
    Posts
    113

    Default

    Quote Originally Posted by crazycheese View Post
    Hi, thanks for the reply! Of course these are whole-system figures, both at idle and under CPU load. The thing is, Intel draws way less, especially at idle but also under load. If you take a look at the link I provided in the first post, you will see that the Core i5-750 draws less than the Athlon II X4 at full load and way less at idle. In fact, the Core i3-530 and Core i7-870 draw the same at idle, while the i7-9xx draws more due to the additional memory channel (if my reading is correct).
    You also have to look at the rest of the system configuration if you are measuring power draw at the outlet. The chipset on the LGA1156 models such as the i3 and the i7-870 is a single chip that is basically a southbridge and doesn't do much heavy I/O, so it draws little power. The chipsets for the i7-9xx series and for AMD CPUs have two chips, one of which (the northbridge) does a lot of heavy I/O and burns a fair bit of power. Also, the i7-9xx series must have a discrete GPU installed, and that adds quite a bit to the at-the-wall draw, whereas the i3 probably didn't have a discrete GPU installed since it is an IGP chip.

    No way can a chip be overvolted that much just for stability reasons. Mind you, the Athlon II X4 started out as a normal Phenom II with the cache disabled, but later they introduced new, smaller dies, not just old ones with the cache disabled: the newer dies were physically Phenom II but with the L3 cache physically absent. I think they either run this high drain on purpose, or they just forgot and don't care.
    The Athlon II X4 was intended to mainly use the L3-less "Propus" die with a handful of units being made from L3-containing Phenom II X4 "Deneb" dies that have a defective L3 but four working cores. I think AMD specs for a high voltage on their chips for yield reasons (see my posts above.)

    I think the reason for the lower Core i-series drain lies in the ability to shut off individual cores entirely instead of just driving them at lower settings.
    That may be part of it, but Intel also fabricates the i3s and i5-5xx/6xx units on the 32 nm process as compared to the 45 nm process of the i7s and AMD Athlon II/Phenom IIs. They also have a lot higher average selling price for their CPUs and can afford to bin a bit tighter for voltages than AMD can.

    It would be nice if another Bulldozer feature were a revamped idle and load power management scheme.
    I think there might be some of that in Bulldozer from what I've heard.

  9. #9
    Join Date
    Nov 2006
    Location
    /home/$USER
    Posts
    113

    Default

    Quote Originally Posted by Qaridarium View Post
    One part is die selection: the desktop users always get the worst/bad dies...

    The notebook and server customers get the best dies.

    For example, you can buy an 8-core Opteron with an 80-watt TDP,

    but a desktop dual-core burns more than 80 watts...

    and the Opteron is 3-4 times faster in "7zip".
    The 80-watt 8-core Opterons also cost about $500, compared to about $100 for the 95-watt Phenom II X2. The rest of the desktop dual-cores consume 45 or 65 watts.

    There are also quite a few benchmarks that those Opterons will be significantly slower at than the Phenom IIs and Athlon IIs. Those 80-watt Opterons run at only 1.8 and 2.0 GHz but the desktop dual-cores go well over 3 GHz, so anything not very well threaded will be a LOT faster on the desktop chips.

    Only the chipsets are sometimes better on the desktop side.
    They're pretty similar to tell the truth. The SR5690 northbridge used in the higher-end server boards is nearly identical to the desktop 890FX. The desktops get a little newer southbridge with the SB800 and its 6 Gbps SATA controller, while the servers use the SB7x0-based SP5100. The only real glaring difference is that none of the server chipsets have an IGP built into them like many of the desktop and most of the mobile chipsets do. Almost all servers have onboard graphics, but they're very low-power 2D-only units that typically hang off the PCI bus and are really designed for outputting a GUI for OS installation and management, not for workstation use.

  10. #10
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by MU_Engineer View Post
    The 80-watt 8-core Opterons also cost about $500, compared to about $100 for the 95-watt Phenom II X2. The rest of the desktop dual-cores consume 45 or 65 watts.

    There are also quite a few benchmarks that those Opterons will be significantly slower at than the Phenom IIs and Athlon IIs. Those 80-watt Opterons run at only 1.8 and 2.0 GHz but the desktop dual-cores go well over 3 GHz, so anything not very well threaded will be a LOT faster on the desktop chips.
    I'm just trying to explain how AMD sells hardware:

    they select the best dies for the Opteron CPUs, and the dies too bad for an Opteron are sold very cheaply for desktops.
    In Europe you can get an 8-core Opteron for 250, so not 500 dollars;
    maybe 300-400 dollars!

    Your thinking about the GHz is just wrong:

    the 8-core Opteron has nearly double the L3 cache per core compared to the desktop 6-core,
    and the 8-core Opteron has quad-channel RAM per socket while the desktop one has only 2 channels..
    which means that on modern, well-optimized code the Opteron beats the desktop one.



    Quote Originally Posted by MU_Engineer View Post
    They're pretty similar to tell the truth. The SR5690 northbridge used in the higher-end server boards is nearly identical to the desktop 890FX. The desktops get a little newer southbridge with the SB800 and its 6 Gbps SATA controller, while the servers use the SB7x0-based SP5100. The only real glaring difference is that none of the server chipsets have an IGP built into them like many of the desktop and most of the mobile chipsets do. Almost all servers have onboard graphics, but they're very low-power 2D-only units that typically hang off the PCI bus and are really designed for outputting a GUI for OS installation and management, not for workstation use.
    Similar? I call you a liar on that point,

    because my last Socket F Opteron with an nForce 3600 Pro chipset was terrible compared to the desktop chipsets.

    I sold it, and now I have a desktop board and fewer bugs...

    Now I can have boot partitions beyond the 128 GB limitation,

    and the NVIDIA MCP55/3600 chipset was really terrible with the Catalyst driver in the past.

    And you are wrong if you think the AMD chipsets on the Opteron side are better.

    No, because the Opteron chipsets have been fixed for 5 years now while the desktop chipsets roll out every year; this year the Opteron chipsets are nearly the same, next year the desktop chipsets are better again...

    Features like PCIe 3.0 or USB 3.0 or SATA 3:

    you cannot get an Opteron board with USB 3.0, PCIe 3.0 and SATA 3.

    Right now the Opteron boards are PCIe 2.0, USB 2.0 and SATA 2.0...

    which means only the CPUs are better on Opteron systems; the chipsets are NOT!
