Reducing power consumption thru cooling?

Ecological issues around computing. This is an experimental forum.

Moderators: Ralf Hutter, Lawrence Lee

Post Reply
JonScaife
Posts: 28
Joined: Thu Jan 06, 2005 10:09 am
Location: Sheffield, UK
Contact:

Reducing power consumption thru cooling?

Post by JonScaife » Mon Dec 13, 2010 8:51 am

The following thought/question occurred to me earlier, and I haven't been able to find an adequate answer. It's the sort of thing I'd expect to find on SPCR, so I wondered if anyone has any thoughts...

A simple physics experiment shows that increasing the temperature of a wire increases its resistance, and therefore decreases the power drawn from the source*. The semiconductors in PC hardware work the opposite way: increasing their temperature decreases resistance, and therefore increases power draw from the source*. (A toy calculation at the end of this post illustrates the wire case.)

I've no problem with the theory, but it tells me nothing about quantitative results. So...

Does reducing the temperature of a semiconductor (e.g. CPU or GPU in our case) result in an appreciable lowering of power "consumption"?

Or, to put some practical idea behind it: if I replace a stock CPU heatsink with a huge great tower cooler, and thereby drop the temperature of the CPU, all else being the same, what happens to my power consumption?

As is often the case, I imagine any full explanation will not be simple...

* this assumes a low (relative to the device being powered) internal resistance of the source, which is normal.
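
To make the wire case concrete, here is a minimal Python sketch. The 12V source, 10 ohm load and copper's temperature coefficient (~0.0039 per °C) are illustrative assumptions, not measurements:

Code:

# Wire case: a stiff (low internal resistance) voltage source across a
# resistor whose resistance rises linearly with temperature.
V = 12.0        # source voltage in volts (assumed stiff, per the footnote)
R0 = 10.0       # resistance at the reference temperature, in ohms
ALPHA = 0.0039  # temperature coefficient of resistance for copper, per degC

def power_draw(delta_t):
    """Power in watts dissipated at delta_t degC above the reference temp."""
    r = R0 * (1 + ALPHA * delta_t)  # hotter wire -> higher resistance
    return V ** 2 / r               # fixed voltage -> P = V^2 / R

for dt in (0, 20, 40, 60):
    r = R0 * (1 + ALPHA * dt)
    print(f"+{dt:2d} degC: R = {r:5.2f} ohm, P = {power_draw(dt):5.2f} W")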

ntavlas
Posts: 811
Joined: Mon Jul 16, 2007 2:35 pm
Location: Greece
Contact:

Re: Reducing power consumption thru cooling?

Post by ntavlas » Mon Dec 13, 2010 9:56 am

JonScaife wrote: A simple physics experiment shows that increasing the temperature of a wire increases its resistance, and therefore decreases the power drawn from the source*. The semiconductors in PC hardware work the opposite way: increasing their temperature decreases resistance, and therefore increases power draw from the source*.
I'm not sure higher resistance would translate into lower power draw in practice; the power supply feeding the CPU would make sure it gets those 5/50/whatever amps the CPU needs. Higher resistance would result in higher power draw, because of the energy lost as heat and because the onboard power supply would have to work harder.

You could take a look at a recent GPU cooler review done at SPCR: http://www.silentpcreview.com/article1103-page5.html. Near the bottom of the page you can see how the power circuitry temperatures can affect power consumption.

Concerning your example of replacing the stock cooler with a giant tower heatsink: it might actually result in higher power use, because the tower cooler doesn't do much to cool the VRMs surrounding the CPU. Now, I don't know if lowering the temps of the chip itself would have an adverse effect (but I don't remember seeing any tests that point in that direction). Either way, I'll leave this question unanswered, as semiconductors are way out of my league.

[edit]: After looking again at the temperature chart I linked, I can see that the chip temperature also seems to affect power consumption.

JonScaife
Posts: 28
Joined: Thu Jan 06, 2005 10:09 am
Location: Sheffield, UK
Contact:

Re: Reducing power consumption thru cooling?

Post by JonScaife » Mon Dec 13, 2010 10:51 am

ntavlas wrote: I'm not sure higher resistance would translate into lower power draw in practice; the power supply feeding the CPU would make sure it gets those 5/50/whatever amps the CPU needs. Higher resistance would result in higher power draw, because of the energy lost as heat and because the onboard power supply would have to work harder.

You could take a look at a recent GPU cooler review done at SPCR: http://www.silentpcreview.com/article1103-page5.html. Near the bottom of the page you can see how the power circuitry temperatures can affect power consumption.

Concerning your example of replacing the stock cooler with a giant tower heatsink: it might actually result in higher power use, because the tower cooler doesn't do much to cool the VRMs surrounding the CPU. Now, I don't know if lowering the temps of the chip itself would have an adverse effect (but I don't remember seeing any tests that point in that direction). Either way, I'll leave this question unanswered, as semiconductors are way out of my league.

[edit]: After looking again at the temperature chart I linked, I can see that the chip temperature also seems to affect power consumption.
Higher resistance results in lower power draw in normal conductors. Hence shorting a battery with a wire (very low resistance) quite quickly results in a dead battery (or, quite possibly, a fire). It also explains why opening (switching off) a switch, and thereby creating ~infinite resistance, results in 0 power draw :)

I don't profess to understand the complexities of modern PC PSUs, but they're subject to the laws of physics :) Amps are a measure of current; power is measured in watts.

Good point with regard to the voltage regulators; in practice this may come into it. I did say all other things being the same though, so for the sake of argument let's assume the airflow over the regs doesn't change.

Lowering the temps of the chip won't have an adverse effect; it can only have a positive one. I am interested to know whether this positive effect is negligible or valuable :)

Basically, what I'm interested in is this... Look at http://www.silentpcreview.com/article1128-page6.html (for example), where there are two coolers, both tower style, with the same fan, and one cooler lowers the CPU temp by 15°C relative to the other. I would be interested in the (at-the-wall) power draw of the system in each configuration.

I imagine this hasn't been covered because SPCR is more focused on low power draw => low temp => low noise. I'm interested to know if low temp => low power draw => lower energy bills.

CA_Steve
Moderator
Posts: 7650
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Re: Reducing power consumption thru cooling?

Post by CA_Steve » Mon Dec 13, 2010 4:40 pm

In general, lowering the temperature can reduce the power consumption. Two specifics:
- Leakage current in CMOS circuitry increases with temperature. CPUs and GPUs have a billion transistors now, and leakage current is a first-order effect (see the toy model below).
- VRM circuitry becomes less efficient as temperature rises. Less efficiency means more heat generated, which leads to higher temps, which means even less efficiency, and so on.
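
To give a feel for the scale of the first point, here is a toy Python model of leakage power vs die temperature. The 15W baseline and the "leakage roughly doubles every ~10°C" figure are rule-of-thumb assumptions for illustration, not datasheet values:

Code:

# Toy model: CMOS leakage power growing exponentially with die temperature.
LEAK_AT_REF = 15.0  # leakage power in watts at the reference temp (assumed)
REF_TEMP = 50.0     # reference die temperature in degC
DOUBLING = 10.0     # degC per doubling of leakage (rule-of-thumb assumption)

def leakage_power(die_temp):
    """Estimated leakage power in watts at a given die temperature."""
    return LEAK_AT_REF * 2 ** ((die_temp - REF_TEMP) / DOUBLING)

for t in (40, 50, 60, 70):
    print(f"{t} degC: ~{leakage_power(t):4.1f} W leakage")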

Look for a review that compares two coolers, one that adequately cools the VRM circuitry and one that doesn't, but with similar CPU temps. Compare the power used.

scdr
Posts: 336
Joined: Thu Oct 21, 2004 4:49 pm
Location: Upper left hand corner, USA

Re: Reducing power consumption thru cooling?

Post by scdr » Tue Dec 14, 2010 12:25 am

Another area where you might look for data is power supply (PSU) reviews. In particular, the 80 PLUS certification has been criticized (both here and elsewhere) for using a fixed low ambient temperature for testing PSU efficiency.

See for example:
80 Plus expands podium for Bronze, Silver & Gold
http://www.silentpcreview.com/article814-page1.html

Can We Trust the 80 Plus Certification?
http://www.hardwaresecrets.com/article/ ... cation/856

One might look for reviews that compare PSU performance at realistic operating temperatures vs. open air. (It would be interesting to get a ballpark estimate of how much less efficient PSUs become as heat increases, i.e. how much of a deception this measuring practice is.)
JonScaife wrote: I imagine this hasn't been covered because SPCR is more focused on low power draw => low temp => low noise. I'm interested to know if low temp => low power draw => lower energy bills.
It seems unlikely that lower temperature could have an appreciable effect on the energy bill for a typical home user. Computers just aren't that big a portion of most people's energy use.
If you look at servers or offices (where you are considering large numbers of machines), then the sequence is still probably low power draw => lower temp (well, less heat really) => lower energy bills. Most such installations use some form of active cooling for their computers, and in many climates cooling takes more energy than heating, so reducing cooling needs reduces energy use.
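
As a back-of-the-envelope sketch of the server case (the 0.5W of cooling per watt of IT load below is an assumed overhead for illustration, not a measured figure):

Code:

# Every watt the IT equipment draws must also be pumped out by the cooling
# plant, so a watt saved at the chip saves more than a watt at the meter.
IT_LOAD_W = 10_000.0    # total IT equipment draw in watts (illustrative)
COOLING_OVERHEAD = 0.5  # watts of cooling per watt of IT load (assumed)

def facility_draw(it_watts):
    """Total facility draw: IT load plus proportional cooling energy."""
    return it_watts * (1 + COOLING_OVERHEAD)

saved_at_chip = 100.0   # watts saved by better-cooled / more efficient gear
delta = facility_draw(IT_LOAD_W) - facility_draw(IT_LOAD_W - saved_at_chip)
print(f"Saving {saved_at_chip:.0f} W at the chip saves {delta:.0f} W at the meter")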

JonScaife
Posts: 28
Joined: Thu Jan 06, 2005 10:09 am
Location: Sheffield, UK
Contact:

Re: Reducing power consumption thru cooling?

Post by JonScaife » Thu Dec 16, 2010 3:13 am

OK, we've got a range of interesting opinions, but still no hard numbers being offered...

On the heatsink testing methodology page it says that power consumption is measured (to give an idea of the efficiency of the VRMs), but I don't see any numbers in any of the reviews! Which is a shame.

Even if it turns out that cooling the CPU doesn't appreciably reduce power draw, it looks like cooling VRMs might. Which prompts the question: is it worth getting some little memory heatsinks from eBay to stick on the VRMs (especially when using a tower cooler) to reduce energy use...

CA_Steve
Moderator
Posts: 7650
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Re: Reducing power consumption thru cooling?

Post by CA_Steve » Thu Dec 16, 2010 8:59 am

JonScaife wrote:OK, we've got a range of interesting opinions, but still no hard numbers being offered...
Take a look at the System Measurements chart for the SFF Gaming System Lawrence built. See how the power usage changes with airflow/temps?

Regarding VRM heatsinks: I think it's a your-mileage-may-vary situation depending on your particular build. It can't hurt to add them, and they may well pay back their cost in saved power. Do a before and after experiment and let us know the results :D

JonScaife
Posts: 28
Joined: Thu Jan 06, 2005 10:09 am
Location: Sheffield, UK
Contact:

Re: Reducing power consumption thru cooling?

Post by JonScaife » Thu Dec 16, 2010 9:51 am

CA_Steve wrote:
JonScaife wrote:OK, we've got a range of interesting opinions, but still no hard numbers being offered...
Take a look at the System Measurements chart for the SFF Gaming System Lawrence built. See how the power usage changes with airflow/temps?

Regarding VRM heatsinks: I think it's a your-mileage-may-vary situation depending on your particular build. It can't hurt to add them, and they may well pay back their cost in saved power. Do a before and after experiment and let us know the results :D
Good spot! Cheers

So there was a total system decrease of 4W from an approximate drop in GPU & GPU-VRM temps of 10°C.

There is also the 2V increase in voltage to the fans to take into account in the example given. Using P = IV we have
P = 0.23A * 2V = 0.46W for the CPU fan
and
P = 0.25A (guesstimated) * 2V * 2 (two fans) = 1W for the GPU fans.

So we're probably looking at a 5.5W decrease in power draw as a result of superior cooling in this scenario, unless I'm missing something (which is quite possible)

If we assume that an additional 4W can be saved at the CPU by reducing the CPU and CPU-VRM temps by 10°C, then we're looking at almost a 10W saving - pretty good. About £9 a year for a 24/7 system at present UK electricity prices, lol
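
Sanity-checking the arithmetic in Python (the 0.25A GPU-fan current is still my guesstimate, and the ~10p/kWh rate is only a rough figure for current UK prices):

Code:

# Fan power from P = I * V, plus the annual cost of a steady saving.
cpu_fan_extra = 0.23 * 2       # CPU fan: 0.23 A x 2 V extra = 0.46 W
gpu_fans_extra = 0.25 * 2 * 2  # two GPU fans at ~0.25 A x 2 V = 1.0 W
measured_wall_drop = 4.0       # watts, read from the linked chart

# The wall reading already includes the extra fan power, so the components
# themselves must have saved roughly:
component_saving = measured_wall_drop + cpu_fan_extra + gpu_fans_extra
print(f"Component-level saving: ~{component_saving:.1f} W")  # ~5.5 W

# Annual cost of a steady 10 W on a 24/7 system at ~10p/kWh:
kwh_per_year = 10.0 * 24 * 365 / 1000                        # ~87.6 kWh
print(f"Annual saving: ~GBP {kwh_per_year * 0.10:.2f}")      # ~GBP 8.76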

The next question (assuming the above holds) is does this saving continue to scale as temps continue to go down?...

Looks like I'll have to get a reasonably accurate power meter and do some experiments over the xmas period :)

correction---

I slightly misread the table linked to. In fact there was an 8W reduction from reducing the GPU and GPU-VRM temps by 13°C and 9°C respectively, despite the approx 1W extra used by the 2 GPU fans. So a 9W saving was had just from the GPU (albeit at full load). Cranking the CPU fan up (which should only use 0.5W) made little difference to the temps, but increased overall power draw by 4W thanks to increasing the GPU temp.

There is only 1 thing to do I guess...

scdr
Posts: 336
Joined: Thu Oct 21, 2004 4:49 pm
Location: Upper left hand corner, USA

Re: Reducing power consumption thru cooling?

Post by scdr » Thu Dec 16, 2010 9:57 pm

JonScaife wrote:
If we assume that an additional 4W can be saved at the CPU by reducing the CPU and CPU-VRM temps by 10°C, then we're looking at almost a 10W saving - pretty good. About £9 a year for a 24/7 system at present UK electricity prices, lol

The next question (assuming the above holds) is does this saving continue to scale as temps continue to go down?...

Looks like I'll have to get a reasonably accurate power meter and do some experiments over the xmas period :)

correction---

I slightly misread the table linked to. In fact there was an 8W reduction from reducing the GPU and GPU-VRM temps by 13°C and 9°C respectively, despite the approx 1W extra used by the 2 GPU fans. So a 9W saving was had just from the GPU (albeit at full load). Cranking the CPU fan up (which should only use 0.5W) made little difference to the temps, but increased overall power draw by 4W thanks to increasing the GPU temp.

There is only 1 thing to do I guess...
Too bad they didn't measure motherboard VRM temperatures and PSU temperatures, so we can't tell whether changes in those temperatures may have been responsible for some of the changes in draw. (Obviously the PSU would be putting out fewer watts, so its temperature would drop some, but as tight as the space appears to be, it might be that changing the CPU airflow affected the PSU airflow.)

Of course, the savings here were at heavy load (which is not a usual situation for most computers). I would expect temperatures and power use at idle (where most systems spend most of their time and energy) to be much lower, so there is less scope for temperature reduction, and the benefits of extra cooling may be harder to recoup.

It will be interesting to see what further research/experiments show.

CA_Steve
Moderator
Posts: 7650
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Re: Reducing power consumption thru cooling?

Post by CA_Steve » Fri Dec 17, 2010 7:38 am

Another thing to think about: how well is the motherboard VRM circuitry designed, and how much extra crap is on the mobo that I might not need? There is a lot of variability in idle/load use between motherboard designs. Here are two reviews from Anandtech comparing 3-5 motherboards: the first is based on H55 Mini-ITX, the second on X58. There is a 3W idle and 13W load difference amongst the H55 mobos, and a 44W idle/45W load difference on the X58 platforms reviewed.

scdr
Posts: 336
Joined: Thu Oct 21, 2004 4:49 pm
Location: Upper left hand corner, USA

Re: Reducing power consumption thru cooling?

Post by scdr » Fri Dec 17, 2010 1:43 pm

I was thinking a bit more about this - it seems like a simple test would be to take a whole computer and run it in a room with a low ambient temperature (e.g. an unheated room at night in winter). Measure the power draw a few times.

Run the same task inside a heated room (maybe run the temperature up a bit). Measure the power draw a few times.

It should be easy to get a 20+ degree F temperature difference. As long as you don't have thermally controlled fans, that should keep other factors equal and give a rough idea of how much draw depends on temperature. (Then, if you find much of an effect, you can try to track it down further in terms of which components' cooling gives the most effect.)
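
Analysing the readings would then be trivial; something like the Python sketch below, where the numbers are placeholders for readings taken by hand from a plug-in power meter:

Code:

# Compare mean wall draw between the cold-room and warm-room runs.
# These values are placeholders, not real measurements.
cold_room_watts = [142.1, 141.8, 142.5, 142.0]  # unheated room at night
warm_room_watts = [146.3, 146.9, 146.1, 146.6]  # heated room, same workload

def mean(xs):
    return sum(xs) / len(xs)

delta = mean(warm_room_watts) - mean(cold_room_watts)
print(f"Cold room: {mean(cold_room_watts):.1f} W")
print(f"Warm room: {mean(warm_room_watts):.1f} W")
print(f"Extra draw when warm: {delta:.1f} W "
      f"({100 * delta / mean(cold_room_watts):.1f}%)")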

Post Reply