Does CPU temperature affect power consumption?
I'm wondering: if I get a better heatsink for my CPU or GPU so it runs a lot cooler, will that lower my power consumption?
I'm guessing he's talking about the fact that there is a measurable change in power consumption between hot and cool VRMs. SPCR did an article about this, I think; can't seem to find it right now. The question seems to be whether that extends to CPU temperature as well... My thought is probably not as much as with the VRMs, and only if you have a really hot CPU under very heavy load. Even then it's probably not much. But that's just my guess.
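As a rough illustration of the VRM effect (my own sketch, not from any SPCR measurement): MOSFET on-resistance rises with die temperature, so the regulator's conduction loss grows as it heats up. The on-resistance, temperature coefficient, and load current below are made-up but typical-looking values:
[code]
# Hypothetical sketch: Rds(on) rises with temperature, so the VRM's
# I^2 * R conduction loss grows as it heats up. The 5 mohm Rds(on),
# 0.4 %/degC tempco, and 30 A phase current are assumptions.

def rds_on(temp_c, rds_25c=0.005, tempco=0.004):
    """On-resistance in ohms, scaled linearly from its 25 degC value."""
    return rds_25c * (1 + tempco * (temp_c - 25))

def conduction_loss(current_a, temp_c):
    """Per-phase conduction loss in watts: I^2 * R."""
    return current_a ** 2 * rds_on(temp_c)

for t in (40, 70, 100):
    print(f"{t} degC: {conduction_loss(30, t):.2f} W per phase")
# 40 degC: 4.77 W, 70 degC: 5.31 W, 100 degC: 5.85 W
# -> roughly a watt per phase between a cool and a hot VRM.
[/code]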
Yes. The hotter it is, the higher the resistance and the more power it consumes. I think the difference is very tiny, though. You're talking about maybe a couple watts difference over the usual temperature range for a CPU or GPU. I'd swear I remember reading an Xbitlabs (I think) article where they actually measured this on a GPU, but of course I can't find it now.
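A back-of-the-envelope sketch of that temperature effect. The mechanism usually blamed is leakage current growing with temperature rather than simple resistance; the doubling interval and baseline wattage here are assumptions chosen only to show the shape of the effect:
[code]
# Hypothetical numbers: assume static (leakage) power doubles every
# ~20 degC and starts at 3 W at 50 degC. Both figures are assumptions,
# not measurements.

def leakage_power(temp_c, base_w=3.0, base_temp_c=50.0, doubling_c=20.0):
    """Leakage power in watts, doubling every `doubling_c` degrees."""
    return base_w * 2 ** ((temp_c - base_temp_c) / doubling_c)

delta = leakage_power(65) - leakage_power(50)
print(f"50 -> 65 degC adds about {delta:.1f} W")  # ~2 W
[/code]
With these assumed numbers a typical 15 degC swing lands at about 2 W, which fits the "couple watts" estimate above.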
Power consumption is affected not only by the temp of the CPU but also by peripheral components such as the VRM and the northbridge chip. Just how much it varies depends on the particular parts and on how large the temperature change is. With the hottest CPUs running at full artificial load (Prime95, etc.), the difference between the heatsink fan running at 12V vs 5V (assuming a Nexus 120 or similar) can be >10W at the AC outlet. However, if we're comparing two HSFs at full speed, one a bit better than the other, then the difference would be much smaller -- as others have said, not more than a few watts.
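One caveat worth making explicit: a wall-side delta includes the PSU's conversion loss, so the change at the components themselves is somewhat smaller. A trivial sketch, assuming ~80% PSU efficiency (an assumed figure, not a measurement):
[code]
# A wall-side delta includes PSU conversion loss. With an assumed ~80 %
# PSU efficiency, a 10 W swing at the outlet corresponds to roughly 8 W
# of extra DC draw by the CPU/VRM/chipset.

psu_efficiency = 0.80
delta_ac_w = 10.0
delta_dc_w = delta_ac_w * psu_efficiency
print(f"~{delta_dc_w:.0f} W of the wall-side change is component-side draw")
[/code]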
Olle P wrote:
Given that the temps on the VRM and NB stay the same, one can wonder how variations in CPU fan speed affect the combined power consumption of the CPU and the fan... Any savings gained by a cooler CPU might be cancelled out by more power fed to the fan, and vice versa.
Cheers
Olle

I don't think such a scenario is realistic. When the CPU temp goes up, almost invariably, the VRM temp goes up -- my observation after nearly a decade of CPU HSF testing. Also, the power draw of a 1500rpm fan is usually under 2W at full tilt, and running it at half speed probably saves no more than a watt. I think this would be less than the difference in power draw with the CPU running hotter/cooler -- though that depends on what the CPU temp difference is.
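The fan half of that trade-off can be sanity-checked with the idealized fan affinity law, where electrical power scales roughly with the cube of fan speed. This is an order-of-magnitude sketch only; real motor and controller losses mean the actual saving is smaller than the ideal law suggests:
[code]
# Idealized fan affinity law: electrical power scales roughly with the
# cube of fan speed. Treat the result as an upper bound on the saving.

full_speed_w = 2.0                      # ~2 W at full tilt (from the post)
half_speed_w = full_speed_w * 0.5 ** 3  # cube law at half speed
print(f"half speed: {half_speed_w:.2f} W, "
      f"saving {full_speed_w - half_speed_w:.2f} W at most")
# Ideal saving ~1.75 W; with real-world losses, "no more than a watt"
# is a plausible figure.
[/code]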
This page from an article at techPowerUp shows that GPU power draw depends on the quality of the cooling, IMO due to VRM efficiency:
http://www.techpowerup.com/reviews/Zota ... on/27.html
For lower-powered cards the difference may be much smaller, though.
It's not obvious to me whether temperature would change CPU efficiency, as long as the CPU is running at a constant voltage. If it were just a matter of increased resistance, the power loss should go down as the temperature goes up (until the CPU stops working because the appropriate gate voltage levels aren't reached).
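To make that resistance argument concrete: for a fixed voltage across a purely resistive path, P = V^2 / R does fall as resistance rises with temperature. The voltage, resistance, and temperature coefficient below are arbitrary placeholders:
[code]
# The constant-voltage resistance argument: P = V^2 / R falls as R rises
# with temperature. All values here are arbitrary placeholders.

def power_at_temp(v, r_25c, tempco, temp_c):
    """P = V^2 / R, with R scaled linearly from its 25 degC value."""
    r = r_25c * (1 + tempco * (temp_c - 25))
    return v ** 2 / r

for t in (25, 75):
    print(f"{t} degC: {power_at_temp(1.2, 0.05, 0.004, t):.1f} W")
# 25 degC: 28.8 W, 75 degC: 24.0 W -- power drops slightly, which is the
# opposite of what is measured on real chips, where leakage current
# (growing with temperature) dominates the linear resistance effect.
[/code]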
On the other hand, if using a better cooler allows you to run the CPU at a lower voltage, then clearly, you'll get some savings.
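A quick sketch of why undervolting pays off: dynamic CMOS switching power scales roughly as C * V^2 * f, so even a small voltage drop gives a quadratic saving. The capacitance and clock figures below are arbitrary placeholders; only the V^2 scaling matters:
[code]
# Dynamic CMOS switching power: P ~ C * V^2 * f. Capacitance and clock
# values are arbitrary placeholders chosen to give CPU-like wattage.

def dynamic_power(c_farads, v_volts, f_hz):
    """Classic switching-power approximation P = C * V^2 * f."""
    return c_farads * v_volts ** 2 * f_hz

p_stock = dynamic_power(20e-9, 1.30, 3.0e9)      # stock 1.30 V
p_undervolt = dynamic_power(20e-9, 1.20, 3.0e9)  # undervolted, same clock
print(f"{p_stock:.1f} W -> {p_undervolt:.1f} W "
      f"({100 * (1 - p_undervolt / p_stock):.0f} % less)")
[/code]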