Does CPU temperature affect power consumption?

PSUs: The source of DC power for all components in the PC & often a big noise source.

spcr2u
Posts: 9
Joined: Wed Jun 16, 2010 2:05 pm
Location: Winnipeg, Manitoba, Canada

Does CPU temperature affect power consumption?

Post by spcr2u » Wed Jun 16, 2010 2:07 pm

I'm wondering: if I get a better heatsink for my CPU or GPU so that it runs a lot cooler, will that help my power consumption?

b_rubenstein
Posts: 273
Joined: Tue Aug 04, 2009 7:03 am
Location: Brooklyn, NY

Post by b_rubenstein » Wed Jun 16, 2010 5:46 pm

Only if it transfers enough heat into your room in the winter to cut your heating bill. Otherwise the amount of power saved would never begin to pay for the heatsink.

BillyBuerger
Patron of SPCR
Posts: 857
Joined: Fri Dec 27, 2002 1:49 pm
Location: Somerset, WI - USA

Post by BillyBuerger » Thu Jun 17, 2010 2:50 am

I'm guessing he's talking about the fact that there is a measurable change in power consumption between hot and cool VRMs. SPCR did an article about this, I think, but I can't seem to find it right now. The question seems to be whether that extends to the CPU temp as well... My thought is probably not as much as the VRMs, and only if you have a really hot CPU under very heavy load. And even then it's probably not much. But that's just my guess.

frostedflakes
Posts: 1608
Joined: Tue Jan 04, 2005 4:02 pm
Location: United States

Post by frostedflakes » Thu Jun 17, 2010 6:37 am

Yes. The hotter it is, the higher the resistance and the more power it consumes. I think the difference is very tiny, though. You're talking about maybe a couple watts difference over the usual temperature range for a CPU or GPU. I'd swear I remember reading an Xbitlabs (I think) article where they actually measured this on a GPU, but of course I can't find it now.
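
As a rough idea of scale, here's a minimal sketch assuming the extra draw is simple I^2*R loss in copper (tempco about 0.39% per degree C) with an illustrative 5 W baseline loss; both figures are assumptions, not measurements:

# Minimal sketch: I^2*R loss at constant current, scaled with
# copper's resistance tempco. ASSUMED: 5 W baseline loss at 40 C.
COPPER_TEMPCO = 0.00393  # fractional resistance change per deg C

def resistive_loss(base_watts, base_temp_c, temp_c):
    return base_watts * (1 + COPPER_TEMPCO * (temp_c - base_temp_c))

print(resistive_loss(5.0, 40, 70))  # ~5.6 W, only ~0.6 W extra at +30 C

That direction and magnitude are consistent with "a couple of watts at most" over a typical temperature range.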

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada

Post by MikeC » Thu Jun 17, 2010 7:04 am

Power consumption is affected not only by the temp of the CPU but also by the temps of peripheral components such as the VRM and the northbridge chip. Just how much it varies depends on the particular parts and on how big the temperature change is. With the hottest CPUs running at full artificial load (prime95, etc.), the difference between the heatsink fan running at 12V vs 5V (assuming a Nexus 120 or similar) can be >10W at the AC outlet. However, if we're comparing two HSFs at full speed, one a bit better than the other, then the difference would be much smaller -- as others have said, not more than a few watts.
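
To translate that to the DC side: a minimal sketch, assuming roughly 80% PSU efficiency in this load range (an assumption; efficiency wasn't measured here):

# Convert an AC-side power delta to the implied DC-side delta.
def dc_delta(ac_delta_watts, psu_efficiency=0.80):  # efficiency assumed
    return ac_delta_watts * psu_efficiency

print(dc_delta(10.0))  # >10 W at the wall implies ~8 W inside the case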

Olle P
Posts: 711
Joined: Tue Nov 04, 2008 6:03 am
Location: Sweden

Post by Olle P » Fri Jun 18, 2010 3:21 am

Given that the temps of the VRM and NB stay the same, one can wonder how variations in CPU fan speed affect the combined power consumption of the CPU and the fan...
Any savings gained by a cooler CPU might be cancelled out by more power fed to the fan, and vice versa.
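
One way to frame the break-even, with made-up numbers purely for illustration:

# Toy break-even check: does extra fan power eat the CPU-side savings?
def net_change_watts(extra_fan_w, cpu_savings_w):
    # positive = the faster fan costs more than the cooler CPU saves
    return extra_fan_w - cpu_savings_w

# ASSUMED: fan draws 1.0 W more at 12 V than at 5 V; the cooler
# CPU/VRM draw 2.5 W less. Both figures are illustrative.
print(net_change_watts(1.0, 2.5))  # -1.5 -> still a net saving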

Cheers
Olle

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada

Post by MikeC » Fri Jun 18, 2010 6:50 am

Olle P wrote:Given that the temps of the VRM and NB stay the same, one can wonder how variations in CPU fan speed affect the combined power consumption of the CPU and the fan...
Any savings gained by a cooler CPU might be cancelled out by more power fed to the fan, and vice versa.

Cheers
Olle
I don't think such a scenario is realistic. When the CPU temp goes up, the VRM temp almost invariably goes up too -- my observation after nearly a decade of CPU HSF testing. Also, the power draw of a 1500rpm fan is usually under 2W at full tilt, and running it at half speed probably saves no more than a watt. I think this would be less than the difference in power draw with the CPU running hotter/cooler -- though that depends on how big the CPU temp difference is.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Sun Jun 20, 2010 10:45 am

This page from an article at techPowerUp shows that GPU power draw depends on the cooling quality, IMO due to VRM efficiency:
http://www.techpowerup.com/reviews/Zota ... on/27.html
For lower-powered cards the difference may be much smaller, though.

dancingsnails
Posts: 21
Joined: Thu Jan 07, 2010 12:05 am
Location: San Francisco

Post by dancingsnails » Sun Jun 20, 2010 12:09 pm

It's not obvious to me whether temperature would change CPU efficiency, as long as the CPU is running at a constant voltage. If it were just a matter of increased resistance, the power loss should go down as the temperature goes up (until the CPU stops working because the appropriate gate voltage levels aren't reached).
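
As a quick sanity check on the direction, with made-up numbers: at a fixed voltage, ohmic power is P = V^2 / R, so 1.2 V across 0.10 ohm dissipates 14.4 W, while raising R by 10% to 0.11 ohm drops that to about 13.1 W. Higher resistance alone means less power at constant voltage, not more.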

On the other hand, if using a better cooler allows you to run the CPU at a lower voltage, then clearly you'll get some savings.

wiliamsmith10
Posts: 3
Joined: Sun Jun 20, 2010 11:48 pm
Location: US

Post by wiliamsmith10 » Sun Jun 20, 2010 11:54 pm

I don't think it will help with power consumption. A heatsink is just there to keep your CPU cool, and the heatsink itself doesn't consume any power.

alecmg
Posts: 204
Joined: Thu Mar 13, 2008 5:56 am
Location: Estonia

Post by alecmg » Mon Jun 21, 2010 1:19 am

There could be a threshold you can cross, though: better cooling letting you lower the CPU voltage.

Say that under normal circumstances the CPU runs at 4 GHz at 1.35 V, but lowering the temp by 10°C makes 1.325 V enough. As a consequence, power consumption will drop.
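
To put a number on that, using the standard first-order scaling for CMOS switching power, P proportional to V^2 at a fixed clock: (1.325 / 1.35)^2 = 0.963, so switching power drops by about 3.7% -- roughly 4 W on a CPU switching 100 W at full load (the 100 W being an illustrative figure).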

MiKeLezZ
Posts: 110
Joined: Sun Feb 20, 2005 8:00 am
Location: ITALY

Post by MiKeLezZ » Wed Jul 14, 2010 1:40 pm

P=I²R

Where:
P = Power
I = Current
R = Resistance (and it increases with temperature)
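
For example, with illustrative numbers: 50 A through 1 milliohm dissipates P = 50^2 x 0.001 = 2.5 W; if heating raises that resistance by 10%, the loss rises to about 2.75 W at the same current.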

That's all folks
