SPCR's Updated Heatsink Test Bed and Methodology

Want to talk about one of the articles in SPCR? Here's the forum for you.
ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Wed Jan 31, 2007 10:11 am

Please keep in mind one relevant difference in how °C/W is now calculated: Previously, we used a calculated value to estimate how much power was consumed by the CPU. This time, we've measured the power it consumes; that 78W figure is not TDP — a theoretical design number. It's the actual amount of power consumed by the CPU & VRMs through the AUX12V connector.
So, you're calculating the thermal resistance from the power consumption of the CPU & VRMs? Isn't that going to give us results that don't really make any sense? The heatsink is removing heat from the CPU, not the VRMs.

I'm still very confused. In the Ultra 120 review, that cooler cooled the chip slightly better than, or on par with, the Ninja, yet in this situation it does not. What was different in that test bed? If every variable was the same when testing the Ninja and the Ultra 120, the temperature measurements should still be valid, regardless of how you were calculating thermal resistance.

Also, what are the reviewer's thoughts on thermal paste and mounting pressure? I know some of the aftermarket coolers use screw-type mounting systems, and others use clips. Even a small change in thermal paste thickness can affect the thermal resistance of the cooling system.

Edit: Also, I wonder how Thermalright feels now that the title of the Ultra 120 review (Thermalright Gets Back on Top with the Ultra-120) is somewhat misleading.

Devonavar
SPCR Reviewer
Posts: 1850
Joined: Sun Sep 21, 2003 11:23 am
Location: Vancouver, BC, Canada

Post by Devonavar » Wed Jan 31, 2007 1:07 pm

So, you're calculating the thermal resistance from the power consumption of the CPU & VRMs? Isn't that going to give us results that don't really make any sense? The heatsink is removing heat from the CPU, not the VRMs.
Yes, we are calculating based on the combined power of the CPU & VRMs. And, yes, I suppose this method will inevitably overestimate a bit. I'm not claiming it's a perfect method, but it's better than the alternative. Would you rather I used the 130W TDP when the whole system is only drawing 120W? Using the measured number is a clear improvement.

Besides, I don't think things are as cut and dried as you make it seem. It's not as simple as "the heatsink cools the CPU and nothing else". VRM heat has to be dissipated somehow, and the heatsink plays a role in doing so. On the other hand, no heatsink will remove all heat from the CPU; some heat will always be dissipated through other avenues, such as through the motherboard. Motherboards are designed to conduct heat for just this reason.

Because of this, getting a completely accurate number for the amount of heat in the system is unfeasible. And, to be honest, it doesn't really matter that much, for the same reasons that the absolute CPU temperature doesn't matter that much. We're not looking for 100% accurate numbers, because we've accepted that we can't obtain them. However, we can get numbers that will be valid for comparisons made within our system.

Switching to the measured heat may not be perfect, but there's no question that it's more accurate than the method we used previously. It's also more repeatable, because it eliminates a previously unknown variable: The amount of "error" between the rated TDP and the actual power dissipation of our chip. It will now be easier for people to replicate our work because that variable will no longer be a source of error.
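As a rough sketch of the arithmetic behind the new method: the C/W figure is simply the measured temperature rise over ambient divided by the measured power draw. The numbers below are illustrative, not from any review:

```python
# Hedged sketch of the C/W calculation: thermal resistance is the
# temperature rise over ambient divided by the measured power draw
# (not the rated TDP). All figures here are made-up examples.

def thermal_resistance(cpu_temp_c, ambient_temp_c, measured_power_w):
    """Return thermal resistance in C per watt."""
    return (cpu_temp_c - ambient_temp_c) / measured_power_w

# Example: 52 C CPU, 22 C ambient, 78 W measured through AUX12V.
print(round(thermal_resistance(52.0, 22.0, 78.0), 3))  # 0.385
```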

Measuring the CPU heat has a further benefit: It allows us to tell when the power consumed is abnormally high, as can happen when the system is beginning to overheat. An increase in consumed power could potentially skew the results near that point, and measuring the power eliminates this source of error.
I'm still very confused. In the Ultra 120 review, that cooler cooled the chip slightly better than, or on par with, the Ninja, yet in this situation it does not. What was different in that test bed? If every variable was the same when testing the Ninja and the Ultra 120, the temperature measurements should still be valid, regardless of how you were calculating thermal resistance.
I'm quite aware of how the test should have turned out, but if I could report that, there would be no point in re-testing. All I can do is report how the test did turn out. I'm in the same boat as you: I can't explain it, but neither can I ignore it, no matter how much more consistent it would make my work seem. If you can help me put my finger on what went wrong, I'd be happy to listen. However, unless you can explain it, all I can do is report what I measure, and write off unexpected variation as random variation.

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Wed Jan 31, 2007 3:40 pm

I didn't mean any offense in my post; I respect the work you do, as it's always more in-depth than that of many other hardware review sites. There will always be uncontrollable variables, or at least ones whose error is impossible to fully eliminate, such as thermal paste application and the amount of torque applied when installing the heatsink.

I guess the thermal resistance numbers will never be perfect unless you could directly measure the temperatures at different points in the heat-exchange system. Say you were able to measure the temperature at the surface of the heat spreader and the temperature at the base of the heatsink. Assuming one-dimensional heat transfer, we could determine the actual heat output (well, approximate it) from the CPU, since the thermal conductivity of thermal paste is known.
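That one-dimensional estimate can be sketched with Fourier's law; the paste conductivity, contact area, and layer thickness below are assumed illustrative values, not measurements from any test:

```python
# Fourier's law for 1-D conduction through the paste layer:
# Q = k * A * dT / L. With both surface temperatures known, this
# approximates the heat actually flowing into the heatsink.

def heat_flow_w(k_w_per_mk, area_m2, thickness_m, delta_t_c):
    """Heat flow in watts through a flat layer."""
    return k_w_per_mk * area_m2 * delta_t_c / thickness_m

# Assumed: k = 4 W/(m*K) paste, 30 mm x 30 mm contact area,
# 0.05 mm layer, 1.0 C drop across the paste.
print(round(heat_flow_w(4.0, 0.030 * 0.030, 0.05e-3, 1.0), 1))  # 72.0
```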

Another alternative is simply assuming the IHS is at one uniform temperature, i.e. the temperature of the core as monitored by the CPU's internal diode. If you knew that temperature, you could just measure the temperature of the base of the heatsink and find the heat generated that way... I guess that's more than is really necessary, though. Just thinking out loud. As far as I'm concerned, temperature data is enough for me, as it's hard to quantify the thermal resistance of the complex finned surfaces heatsinks have these days. Still, your estimate is an effort I applaud and respect, and you're right, it does serve as an adequate means for comparison.

J. Sparrow
Posts: 414
Joined: Wed Jan 17, 2007 7:55 am
Location: EU

Post by J. Sparrow » Fri Feb 02, 2007 5:26 pm

Devonavar wrote:If you can help me put my finger on what went wrong, I'd be happy to listen. However, unless you can explain it, all I can do is report what I measure, and write off unexpected variation as random variation.
Frankly, I find it unlikely (to say the least) that someone sitting in front of a screen on the other side of the planet could tell you what went wrong in your test :) The suggestion about atmospheric pressure sounds good to me; why not pick it up?

Now just pretend for a moment to be a reader: you can only wonder which test is the random one.

If you decide to go for another round of lab testing, I'd put the Noctua NH-U12 in too (leave the XP-120 behind if there are too many); it should be no bad performer, halfway between the Ultra-120 and the Ninja in terms of fin spacing, but a previous test here on SPCR showed it performing vastly worse (7 °C) than the other two. With all due respect, it seems possible that something could have gone random in that review, too.

As things are at the moment, I admire your (SPCR-wide) attempts to make good, objective tests, but I'm still really really confused about the real performance of these coolers, which should be the very point of any published reviews, anyway.

daredeshouka
Posts: 2
Joined: Fri Dec 29, 2006 8:19 pm

Post by daredeshouka » Fri Sep 07, 2007 10:45 am

I was wondering what sort of effect the positioning of the fan on the Scythe Ninja would have on the results. I've never seen one of these, so forgive me if I'm making a huge mistake, but on an Intel platform, can the fan be mounted on any of the four sides? That is my assumption. My next assumption is that it can also be mounted in any direction on an LGA775 platform. So could the fan have been mounted differently between the old test platform and the new one?

To put it in simpler terms, I notice that there's a small heatsink on the base of the main heatsink... in one orientation, the side of this heatsink would block airflow, but in the 90-degree offset position, air would pass freely over it and add additional cooling. Could that explain the differences between test results?

I'm curious because I can't decide which heatsink I should get, the Ninja or the U120E. It seems that in all the tests done here, the Ninja always shows a lower differential from 12V to 5V than the U120. I've noticed the U120 usually has an 11 to 13°C differential, whereas most of the other coolers show only 6 or 7°C, even older TR coolers. So I find that strange. Maybe it's because the U120's fins are tightly packed and restrict airflow. I know both are great, but the Ninja is cheaper: I can get the Ninja and an S-Flex 12E for less than the U120E alone.

Olle P
Posts: 711
Joined: Tue Nov 04, 2008 6:03 am
Location: Sweden

Post by Olle P » Fri Dec 12, 2008 5:06 am

A couple of issues regarding your test methods...

1. Noise (I put this first since this site is dedicated to noise related issues.)
On the very first page of the test description you write (emphasis added by me.):
"What we want to know is how well a heatsink performs at a given (quiet) noise level, and using the same fan for all our tests gives us a way of reproducing the same noise level every time we do a test."

The problem is that you don't. What you do is have the fan run at the same set of speeds for all tests. That way you know what noise the fan produces in free air.
When you attach it to a heatsink, it no longer moves free air. The noise level increases and the tonal pitch is altered, differently for different heatsink designs.
It would therefore be interesting to know what the sound actually is with the fan mounted and running at the given speed.
Another option is to say that you want to know how well the heatsink performs at for example 16dB(A), and then measure the noise while you adjust the fan speed until the actual noise level is 16dB(A).
__________________________________

2. Your mobo is horizontal during measurements.
Most (read: all) heatsinks come with heat pipes. These perform differently based on orientation, and although it seems natural that they'd perform best while vertical, as they usually are in your tests, the opposite is in fact true. Tests have shown that different heatsinks show a spread in efficiency based on orientation; the temperature difference between a horizontal and a vertical mobo orientation might be up to nearly 5°C, all other factors being constant.
Since the vast majority of computers have their motherboard in a vertical position, that's the way to do most measurements. Adding one or two measurements with the mobo horizontal is good for reference and to see how much the performance differs for the tested heatsink.
__________________________________

3. The other environmental factors.
- As previously noted in this thread, air pressure plays a part. The reason is that heat has to be transferred from the fins to the "air" (mostly nitrogen) molecules: the more molecules, the better the heat transfer. Fan data usually gives airflow as volume per unit of time, whereas for cooling calculations it makes better sense to work with mass per unit of time. At room temperature and 1 bar of pressure, 1 cfm of air equals about 32 grams/minute. The temperature differences we normally handle don't change this value by more than the first decimal. Differences in air pressure, depending on height above sea level and current weather, make for differences in the 10% region and should be noted. (Notice that barometers meant for meteorology are supposed to be calibrated to show the corresponding pressure at sea level, not the actual pressure at the spot where they're located. Make sure the barometer used is calibrated to show the actual pressure on the spot.)

- Another factor is moisture. As we know, water is good at absorbing heat, which is a major reason why water cooling is so efficient. Likewise, adding water molecules to the air will make that air more efficient at cooling a heatsink. The exact relationship is unknown to me, but it's easy to at least measure the relative humidity during the tests and make a note of it.
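Olle's volume-to-mass conversion in the first point can be sketched with the ideal gas law. The constants below (molar gas constant, dry-air molar mass, 1 bar, 300 K) are standard assumed values, not figures from the post, and the result lands near the ~32 g/min per cfm quoted:

```python
# Sketch of converting a fan's volumetric flow (cfm) to mass flow
# using the ideal gas law. Constants are standard assumed values.

R = 8.314                      # J/(mol*K), molar gas constant
M_AIR = 0.02897                # kg/mol, molar mass of dry air
CFM_TO_M3_PER_MIN = 0.0283168  # 1 cfm in cubic metres per minute

def mass_flow_g_per_min(cfm, pressure_pa, temp_k):
    """Mass flow of air in grams per minute."""
    density = pressure_pa * M_AIR / (R * temp_k)  # kg/m^3
    return cfm * CFM_TO_M3_PER_MIN * density * 1000.0

# 1 cfm at 1 bar and 300 K (about 27 C room temperature).
print(round(mass_flow_g_per_min(1.0, 100_000.0, 300.0), 1))  # 32.9
```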

Cheers
Olle

juamez
Posts: 45
Joined: Wed Feb 06, 2008 2:17 pm
Location: Belgium

Post by juamez » Fri Aug 21, 2009 4:30 am

Another problem with this methodology is that the heatsinks are tested in open air. Since that's different from how most people will use their heatsink (in a case, mounted vertically), it creates unrealistic conditions, which can alter the results in ways that are irrelevant to most.

So my question: can't SPCR use a couple of widely used cases as a reference and repeat all the heatsink tests inside them, with the case standing up and the side panels closed? In other words: the way normal people would use their computers.

That would be very helpful. Right now I look at the heatsink roundups and reviews, and from those articles I learn which heatsink is best when the mobo is lying horizontally in open air. But that's not how I'm going to use it, and since I can't rule out that results might shift when the environment changes (from open air and horizontal to inside a closed case, vertical), I cannot use your results to make a good decision on which heatsinks would perform best under my conditions.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Fri Aug 21, 2009 7:33 am

juamez wrote:So my question: can't SPCR use a couple of widely used case as a reference and repeat all the heatsink tests inside of them, with the case standing up and with the sidepanels closed? In other words: the way normal people would use their computer.
No. Why? In a word, time.

In many words: the case would introduce a host of variables and complexities that would favor some HSFs over others. Use our temperature rise over ambient and apply it to your setup. If you choose the right components and set up a good airflow path, the in-case ambient temp will be no higher than ~10°C above our open-air ambient.
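The "apply it to your setup" step can be sketched like this; the 10°C in-case penalty and the sample numbers are illustrative assumptions, not SPCR data:

```python
# Estimate an in-case CPU temperature from SPCR's open-air numbers:
# room temperature, plus an assumed in-case ambient penalty (up to
# ~10 C), plus the published rise over ambient for the heatsink.

def estimated_cpu_temp(room_temp_c, published_rise_c, case_penalty_c=10.0):
    """Add the published rise to a pessimistic in-case ambient."""
    return room_temp_c + case_penalty_c + published_rise_c

# Example: 23 C room, 18 C published rise for a heatsink/fan combo.
print(estimated_cpu_temp(23.0, 18.0))  # 51.0
```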

I disagree that the results aren't useful. The heatsinks' relative cooling capabilities are not going to change unless you choose the wrong ones to use in the wrong way -- i.e. a top-down cooler squeezed too tightly into a narrow case with no room above the fan.

Go ahead and round up some HSFs we've tested, try them your way, and see how much your results vary. The onus is on you to prove our results aren't useful; they've helped many thousands of people for nearly 8 years.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Fri Aug 21, 2009 7:54 am

Olle P wrote:A couple of issues regarding your test methods...

1. Noise (I put this first since this site is dedicated to noise related issues.)
On the very first page of the test description you write (emphasis added by me.):
"What we want to know is how well a heatsink performs at a given (quiet) noise level, and using the same fan for all our tests gives us a way of reproducing the same noise level every time we do a test."

The problem is that you don't. What you do is have the fan run at the same set of speeds for all tests. That way you know what noise the fan produces in free air.
When you attach it to a heatsink, it no longer moves free air. The noise level increases and the tonal pitch is altered, differently for different heatsink designs.
It would therefore be interesting to know what the sound actually is with the fan mounted and running at the given speed.
Another option is to say that you want to know how well the heatsink performs at for example 16dB(A), and then measure the noise while you adjust the fan speed until the actual noise level is 16dB(A).
__________________________________

2. Your mobo is horizontal during measurements.
Most (read: all) heatsinks come with heat pipes. These perform differently based on orientation, and although it seems natural that they'd perform best while vertical, as they usually are in your tests, the opposite is in fact true. Tests have shown that different heatsinks show a spread in efficiency based on orientation; the temperature difference between a horizontal and a vertical mobo orientation might be up to nearly 5°C, all other factors being constant.
Since the vast majority of computers have their motherboard in a vertical position, that's the way to do most measurements. Adding one or two measurements with the mobo horizontal is good for reference and to see how much the performance differs for the tested heatsink.
__________________________________

3. The other environmental factors.
- As previously noted in this thread, air pressure plays a part. The reason is that heat has to be transferred from the fins to the "air" (mostly nitrogen) molecules: the more molecules, the better the heat transfer. Fan data usually gives airflow as volume per unit of time, whereas for cooling calculations it makes better sense to work with mass per unit of time. At room temperature and 1 bar of pressure, 1 cfm of air equals about 32 grams/minute. The temperature differences we normally handle don't change this value by more than the first decimal. Differences in air pressure, depending on height above sea level and current weather, make for differences in the 10% region and should be noted. (Notice that barometers meant for meteorology are supposed to be calibrated to show the corresponding pressure at sea level, not the actual pressure at the spot where they're located. Make sure the barometer used is calibrated to show the actual pressure on the spot.)

- Another factor is moisture. As we know, water is good at absorbing heat, which is a major reason why water cooling is so efficient. Likewise, adding water molecules to the air will make that air more efficient at cooling a heatsink. The exact relationship is unknown to me, but it's easy to at least measure the relative humidity during the tests and make a note of it.

Cheers
Olle
A very late response; I didn't see this before.

1. The audible difference when a reference fan is mounted on one heatsink (when it can be) vs. another is trivial. Even between free air and on a heatsink, it's quite small. If/when these things are not trivial, we mention it. Remember that the reference fans measure under 20 dBA/1m at full speed! Also, whenever the stock fan is tested/measured, it's mounted on the heatsink. Some 30~40% of heatsinks cannot be used with our reference fans. Finally, whenever we compare heatsinks, we set up tables referenced to SPL -- i.e., we don't always follow the HS test methodology slavishly. It probably needs to be updated to reflect current practice anyway.

2. Most heatsink mfgs state that there's NO performance difference between horizontal & vertical orientation, and we've checked this a few times in the past -- and confirmed the claim. Since then we have not bothered. We could check again, maybe do as you suggest, add a single test in that orientation.

3. Humidity, eh? I don't think it varies that much around here.... but I suppose we could measure and note it.

juamez
Posts: 45
Joined: Wed Feb 06, 2008 2:17 pm
Location: Belgium

Post by juamez » Tue Sep 01, 2009 9:09 am

MikeC wrote:I disagree that the results aren't useful. The heatsinks' relative cooling capabilities are not going to change unless you choose the wrong ones to use in the wrong way -- ie a topdown squeezed too tightly in a narrow case with no room above the fan.
My motivation for my previous post came from a guy who told me that a double-fanned Scythe Mugen 2 isn't significantly better than a Mugen 2 equipped with only one fan, because the rear case fan acts as the second fan anyway.

This got me thinking, and I came to the strictly scientific conclusion that if you test your heatsinks outside a case, you cannot say with 100% certainty which heatsink will be the best inside a case. You cannot extrapolate the performance of the heatsinks in your setup to expected performance inside a case, because there are too many unknown variables. I'm not saying the outcome will be completely different; I'm just saying that a winning heatsink (or heatsink/fan setup) outside a case wouldn't necessarily be a winner inside a case. Do you catch my drift?

So the general consensus (of which heatsink is the coolest and the quietest) may still apply to different ambient conditions, but your keen measurements are rendered useless for people looking for the best heatsink to be used inside a case, which is rather a shame.

I understand that redoing all the tests requires a massive amount of time, so I won't ask you to do them again, but I just want your opinion on my statement.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Tue Sep 01, 2009 10:03 am

juamez wrote:So the general consensus (of which heatsink is the coolest and the quietest) may still apply to different ambient conditions, but your keen measurements are rendered useless for people looking for the best heatsink to be used inside a case, which is rather a shame.
Simple answer: You're wrong.

Cistron
Posts: 618
Joined: Fri Mar 14, 2008 5:18 am
Location: London, UK

Post by Cistron » Tue Sep 01, 2009 11:55 am

juamez wrote:Do you catch my drift?
No.

Why would the proportionate difference in performance between heatsinks be any different inside a case, which basically stays constant?

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Tue Sep 01, 2009 12:49 pm

juamez wrote:but your keen measurements are rendered useless for people looking for the best heatsink to be used inside a case
SPCR tests for a particular application -- low airflow cooling. What SPCR ranks as the best heatsink is the one that can dissipate the most heat at a level of airflow that can be achieved with an inaudible fan. This tells you nothing of how your fans will perform in a given case, but it doesn't need to. No matter how easy or hard it is to get a certain amount of airflow in a given setup, the heatsink that requires the least airflow will always be the best choice for silent cooling.

I find that, for people who claim that SPCR's heatsink methodology is improper or not universally applicable, their real issue is that they want heatsinks ranked on their absolute heat dissipating potential. For such testing, case configuration and fan selection would all be very important as the goal would be to have a setup squeeze every last bit of performance out of the heatsink. Again, that is not what SPCR is testing for! They are testing for best low-airflow performance and as such fan/case optimization is irrelevant to the relative rankings.

Olle P
Posts: 711
Joined: Tue Nov 04, 2008 6:03 am
Location: Sweden

Post by Olle P » Wed Sep 02, 2009 1:46 am

MikeC wrote:
Olle P wrote:3. The other environmental factors.
- As been previously noted in this thread, the air pressure plays a part. ... Differences in air pressure depending on height above sea level and current weather makes for differences in the 10% region, and should be noted.
3. Humidity, eh? I don't think it varies that much around here.... but I suppose we could measure and note it.
As pointed out: (absolute) air pressure has a significant and predictable influence on the result. The deltaT for a given cooler should (theoretically) be proportional to T/p, where p is the absolute pressure and T is the absolute intake/room temperature (in kelvin): denser air carries heat away better, so higher pressure means a lower deltaT.

T can be expected to stay in the 290-310 K range (17-37°C), a spread of only about +-3%, while p may vary more than +-20% from your local "normal pressure".

What does this mean in practice?
Assume you test two coolers on different days. When you test cooler #1 it's a sunny day with high air pressure, but when you test cooler #2 heavy clouds and thunderstorms are expected, and the air pressure is very low.
Both tests result in the same deltaT, and you conclude that they're equally efficient.
Readers object to the test results, and for some reason you decide to test the coolers again, this time side by side. Now the weather is a bit unstable, with average air pressure. This time deltaT for #1 (no longer flattered by the dense air) rises 20% from the previous test, while deltaT for #2 drops by a similar amount. They're obviously not equally efficient!
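As a hedged sketch of this weather argument: assuming denser air carries heat away better, a measured deltaT can be scaled to reference conditions before comparing coolers. This is a first-order idealization of my own, not SPCR's procedure:

```python
# First-order normalization of a measured temperature rise to
# reference conditions. Assumes deltaT scales with T/p (denser air
# cools better); an idealized model, not a validated correction.

def normalize_delta_t(delta_t_c, p_mbar, t_k, p_ref_mbar=1013.0, t_ref_k=293.0):
    """Scale a measured deltaT to reference pressure and temperature."""
    return delta_t_c * (t_ref_k / t_k) * (p_mbar / p_ref_mbar)

# A cooler tested on a 1040 mbar day at 293 K was flattered by the
# dense air; its normalized deltaT comes out higher than measured.
print(round(normalize_delta_t(20.0, 1040.0, 293.0), 2))  # 20.53
```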

Cheers
Olle

Cistron
Posts: 618
Joined: Fri Mar 14, 2008 5:18 am
Location: London, UK

Post by Cistron » Wed Sep 02, 2009 5:16 am

Olle P wrote:while p may vary more than +-20% from your local "normal pressure".
Let us assume 'normal' pressure to be roughly 1 bar (1013 mbar is part of the 'standard conditions' for many physical chemistry calculations, if I remember correctly). From my childhood I remember that on rainy days the pressure (in good old Austria) dropped to ~980-990 mbar, whilst on sunny days we had 1030-1040 mbar. That leads to a difference of ~50 mbar between the extremes: 50/1000 = 0.05, aka 5%, or ±2.5%.

juamez
Posts: 45
Joined: Wed Feb 06, 2008 2:17 pm
Location: Belgium

Post by juamez » Wed Sep 02, 2009 5:43 am

I'm sorry to see that my (quite bold) statements are countered with a sense of hostility.

Maybe I'm not putting my thoughts into words decently.

I have the idea that all heatsinks are tested with:
1) no ambient (relative to the cpu-heatsink) airflow impedance
2) 20°C ambient temperature
3) horizontal placement of the motherboard

So any results gathered from a testbed in those conditions cannot be blindly extrapolated to other situations, such as:
1) in a cramped case
2) that is warmer than 20°C
3) and has a different influence of gravity on it

Since we cannot do it blindly, we may as well use logic:
1) Fans do not like airflow impedance; they generate more turbulence, and thus more noise, when the impedance rises. Since heatsinks are not exactly alike, maybe one type copes with such turbulence better than another. I'm not sure; I'm not stating facts, just raising the "what if".
2) Since most coolers (if not all of them) use heatpipes, behavior may change when the ambient temperature changes. Again, I do not know if that is the case; I'm just preparing for the worst (heatpipe dynamics gone mad, or something like that).
3) I've read somewhere that old heatpipes relied on gravity to work, but recent heatpipes perform best when placed horizontally. While this is again something "I've heard of", it may alter the results.

I'm not bickering about your reviews, MikeC. I fully respect your work, and I very much like your reviews, and you infected me with the silent-computering bug.

These posts are just something I was pondering on, from a purely scientific and strict point of view.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Wed Sep 02, 2009 6:54 am

juamez wrote:So any results gathered from a testbed in those conditions cannot be blindly extrapolated to other situations, such as:
And you expect to be able to do this? Such a science lover wanting "blind extrapolation"?! :roll:
1) in a cramped case
2) that is warmer than 20°C
3) and has a different influence of gravity on it.
1) and what testbed would give good predictions for such conditions? How useful would that be for most users? Not!
2) ambient temp has virtually no impact on temp rise or C/W. Very high ambient can make other components on the test board less stable, such as NB or VRMs, but this is outside the range of any CPU heatsink testing.
3) already mentioned: heatpipe orientation does not matter.

jessekopelman hit the bullseye on what SPCR heatsink reviews are about:
SPCR tests for a particular application -- low airflow cooling. What SPCR ranks as the best heatsink is the one that can dissipate the most heat at a level of airflow that can be achieved with an inaudible fan. This tells you nothing of how your fans will perform in a given case, but it doesn't need to. No matter how easy or hard it is to get a certain amount of airflow in a given setup, the heatsink that requires the least airflow will always be the best choice for silent cooling.

I find that, for people who claim that SPCR's heatsink methodology is improper or not universally applicable, their real issue is that they want heatsinks ranked on their absolute heat dissipating potential. For such testing, case configuration and fan selection would all be very important as the goal would be to have a setup squeeze every last bit of performance out of the heatsink. Again, that is not what SPCR is testing for! They are testing for best low-airflow performance and as such fan/case optimization is irrelevant to the relative rankings.
If you want something else, don't look here.

juamez
Posts: 45
Joined: Wed Feb 06, 2008 2:17 pm
Location: Belgium

Post by juamez » Thu Sep 03, 2009 8:57 am

MikeC wrote:
juamez wrote:So any results gathered from a testbed in those conditions cannot be blindly extrapolated to other situations, such as:
And you expect to be able to do this? Such a science lover wanting "blind extrapolation"?! :roll:
Oh come on, I don't like the big "rolleyes" attitude. :(

And no, I don't want to do "blind extrapolation", but since you test your heatsinks under conditions most of us don't use, you leave us no choice but to do just that, IF we lack the knowledge of what you just explained further on in your post.
MikeC wrote:
1) in a cramped case
2) that is warmer than 20°C
3) and has a different influence of gravity on it.
1) and what testbed would give good predictions for such conditions? How useful would that be for most users? Not!
2) ambient temp has virtually no impact on temp rise or C/W. Very high ambient can make other components on the test board less stable, such as NB or VRMs, but this is outside the range of any CPU heatsink testing.
3) already mentioned: heatpipe orientation does not matter.
1) this would imply less fresh air that is able to reach the fan's intake stream quietly, and I don't know what the impact might be on the tested fans
Since you answer my questions in the other two parts, I reckon your testing conditions really do represent the conditions under which most of us will use our hardware. Thank you, that is what I wanted to know. :)
jessekopelman hit the bullseye on what SPCR heatsink reviews are about:
SPCR tests for a particular application -- low airflow cooling. What SPCR ranks as the best heatsink is the one that can dissipate the most heat at a level of airflow that can be achieved with an inaudible fan. This tells you nothing of how your fans will perform in a given case, but it doesn't need to. No matter how easy or hard it is to get a certain amount of airflow in a given setup, the heatsink that requires the least airflow will always be the best choice for silent cooling.

I find that, for people who claim that SPCR's heatsink methodology is improper or not universally applicable, their real issue is that they want heatsinks ranked on their absolute heat dissipating potential. For such testing, case configuration and fan selection would all be very important as the goal would be to have a setup squeeze every last bit of performance out of the heatsink. Again, that is not what SPCR is testing for! They are testing for best low-airflow performance and as such fan/case optimization is irrelevant to the relative rankings.
If you want something else, don't look here.
That's not really what I meant.

To be honest, I made some bold and hypothetical statements that wouldn't have caused such a stir if I had posted them as a couple of questions, which, after all, is all they are. I'm sorry to have upset you; I meant no offense. I'm just not always able to fully express my train of thought in understandable English.

Olle P
Posts: 711
Joined: Tue Nov 04, 2008 6:03 am
Location: Sweden

Post by Olle P » Thu Sep 10, 2009 12:22 am

Cistron wrote:(1013mbar is part of the 'normal conditions' for many physical chemical calculations, if I remember correctly).
That's supposed to be the global average air pressure at sea level, yes.
Once you get above the sea level the pressure drops a bit, like 0.1mbar/m IIRC.
Cistron wrote:... on rainy days the pressure ... dropped to ~980-990mbar, whilst during the sunny days we had 1030-1040mbar. Which leads to a difference of ~50mbar between the extremes. 50/1000 = 0.05, aka 5% or ±2.5%.
In other locations there's more variation between the extremes. Still, my initial estimate (based on vague recollections) was over the top and should be reduced to +-5%.

Cheers
Olle

Jordan
Posts: 557
Joined: Wed Apr 28, 2004 8:21 pm
Location: Scotland, UK

Post by Jordan » Thu May 20, 2010 3:14 pm

I could have sworn the heatsink testing took place inside an ATX case, or was that something that was stopped a long time ago? Actually, I think it's the PSU test I'm thinking of... :lol:

Something I always missed (while I believed this was how testing was done) was, in addition to the 12V/7V/5V tests, a test with no fan directly strapped to the heatsink and only the rear case fan in the test system spinning. I've been running like this for years now, as I'm sure many others have. There's got to be even less airflow over the heatsink with a 5V rear fan than with a 5V fan directly attached.

I sometimes wonder, with certain heatsinks, especially those with tightly packed fins, whether performance will drop off drastically without that direct airflow (even if only at 5V).

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Thu May 20, 2010 4:01 pm

Hi Jordan,

I suspect you got here by way of an old link in the recommended HS article that should have been updated. This is the discussion for the previous HS test system; the latest HS test system is here: http://www.silentpcreview.com/2010_CPU_ ... t_Platform

Jordan
Posts: 557
Joined: Wed Apr 28, 2004 8:21 pm
Location: Scotland, UK

Post by Jordan » Fri May 21, 2010 1:55 am

Ah, I did indeed. Thank you Mike!

Post Reply