Updated VGA Card/Cooler Test Platform

Want to talk about one of the articles in SPCR? Here's the forum for you.
Post Reply
MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Updated VGA Card/Cooler Test Platform

Post by MikeC » Thu Dec 20, 2007 1:30 am


Bluefront
*Lifetime Patron*
Posts: 5316
Joined: Sat Jan 18, 2003 2:19 pm
Location: St Louis (county) Missouri USA

Post by Bluefront » Thu Dec 20, 2007 3:20 am

I'll admit to not having much experience with hot video cards. But please explain how the CPU in your new system, with its fan at a fixed voltage, can remain at the same temperature with the ATI cooler blowing heat directly out of the case through its own duct, as opposed to both Zalman coolers, which just blow much of the heat around inside the case?

You would think that while the ATI doesn't cool the video card as well, the CPU temperature would benefit. Am I missing something here?

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Thu Dec 20, 2007 7:29 am

I found one reference to a 1,200 rpm Slipstream and two to an 800 rpm one; I mean the fan swapped into the S12-600.
Which one is it? BTW, I wouldn't dare to use an 800 rpm Slipstream at 5V in my S12E+650.
But I am really considering a fan swap with the 1,200 rpm one, and having it temperature controlled.
Does the Slipstream fan start at 3.8V, which IIRC is the starting voltage of the S12E+650?

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Thu Dec 20, 2007 8:05 am

Bluefront wrote:I'll admit to not having much experience with hot video cards. But please explain how the CPU in your new system, with its fan at a fixed voltage, can remain at the same temperature with the ATI cooler blowing heat directly out of the case through its own duct, as opposed to both Zalman coolers, which just blow much of the heat around inside the case?

You would think that while the ATI doesn't cool the video card as well, the CPU temperature would benefit. Am I missing something here?
We're not misreporting the results, in case that's what you're thinking. The simple answer: your reasoning that the CPU must get cooler with the ATI cooler blowing air out is not supported by the results. If we must explain them, I'd point to the fact that the CPU cooler benefits from a push-pull setup with 120mm fans on either side. They pull cool air in from the many open intake paths and direct it out through the single exhaust. Those factors appear to be enough to overcome whatever heat the Zalman fans may blow around in the case.

Note that even in the old VGA test bed, when only ATITool was applied to the X1950XTX with the Zalman 1000, the CPU temp stayed steady at 43C regardless of the cooler's fan speed or GPU temp.

Another point: When the CPU and GPU are stressed simultaneously, as they were here, although the total power drawn by the system goes up by almost 10% (compared to when just the GPU is stressed), the GPU appears to be getting a bit less stressed, and the CPU a bit more stressed. I.e., this suggests the contribution of the GPU to the heat in the system is smaller; hence, its hot air "wash" is less significant to the CPU.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Thu Dec 20, 2007 8:32 am

Tzupy wrote:I found one reference to a 1,200 rpm Slipstream and two to an 800 rpm one; I mean the fan swapped into the S12-600.
Which one is it? BTW, I wouldn't dare to use an 800 rpm Slipstream at 5V in my S12E+650.
But I am really considering a fan swap with the 1,200 rpm one, and having it temperature controlled.
Does the Slipstream fan start at 3.8V, which IIRC is the starting voltage of the S12E+650?
Sorry, fixed the error. It's an 800, running at 5V -- hardwired. Barely starts & spins at just 400rpm. As reported in the article, the PSU exhaust air feels only a touch warm even at the end of long test sessions, but it's possible we'll end up injuring/killing the PSU like we did the NeoHE. We'll see.

As for your fan swap, you're safe with a 1200 -- our samples start at ~2.8V -- at around 500rpm. By ~3.8V, the speed is at ~600rpm.
Last edited by MikeC on Thu Dec 20, 2007 10:01 am, edited 1 time in total.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Thu Dec 20, 2007 9:47 am

Good to know that the Slipstream starts so low, thanks MikeC!
They are still not available in my country. :evil: <- at the Scythe importer!

Filias Cupio
Posts: 81
Joined: Sun Oct 09, 2005 7:53 pm

Post by Filias Cupio » Thu Dec 20, 2007 7:24 pm

I hadn't met those Slip Stream fans before. They look really good according to the manufacturer's site. How are they in reality? Do I want to go out and replace my Nexus?

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Thu Dec 20, 2007 10:18 pm

Filias Cupio wrote:I hadn't met those Slip Stream fans before. They look really good according to the manufacturer's site. How are they in reality? Do I want to go out and replace my Nexus?
If you're happy with your Nexus fans, don't worry; the difference is not night & day. But if you're looking for new or quieter fans, the Slip Streams are the ones to try. Almost no tonality.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Thu Dec 20, 2007 11:45 pm

I have to say that your video card choice is 2 years old.

8800GT 512
3870 512

Or, 8800GTS 640 if you want to go sort-of older but new generation.

x1900 is lying on a table near me dismantled.

Even an 8800GTX 768 would be an interesting standard. It would be like your inefficient and not-so-silent Pentium D chip.

The x1900 series cards are not even produced anymore (to my knowledge).

Side note about the Slipstreams: I see 4 speeds listed. Any of them that people should stay away from? I remember Papst had various 120mm's that were noisy/high powered and 2 that you could easily quiet by undervolting.

Lawrence Lee
SPCR Reviewer
Posts: 1115
Joined: Fri Mar 04, 2005 9:07 pm
Location: Vancouver

Post by Lawrence Lee » Fri Dec 21, 2007 12:06 am

~El~Jefe~ wrote:I have to say that your video card choice is 2 years old.
8800GT 512
3870 512

Or, 8800GTS 640 if you want to go sort-of older but new generation.

x1900 is lying on a table near me dismantled.

Even an 8800GTX 768 would be an interesting standard. It would be like your inefficient and not-so-silent Pentium D chip.
That's sort of the point. The X1950XTX is an old, extremely inefficient card - from the power consumption figures I've read, the only cards that are more power hungry are the HD2900XT, 8800GTX/Ultra, and 7950 GX2. None of those cards is more compatible with aftermarket coolers than the X1950XTX, though. In this case old technology is the best choice.

Power inefficiency is the same reason a Pentium D sits on our CPU heatsink test platform. We could use a Core 2 Quad, but none of them have a higher TDP, so there's really no point.

Also, new hardware is expensive... if anyone wants to donate some $400+ hardware, please speak up. :lol:

Bluefront
*Lifetime Patron*
Posts: 5316
Joined: Sat Jan 18, 2003 2:19 pm
Location: St Louis (county) Missouri USA

Post by Bluefront » Fri Dec 21, 2007 2:48 am

MikeC......thanks for the explanation of the CPU/GPU heat thing. I suppose this is another of those "you gotta be there" things. On paper one would suspect that removing GPU heat before it has a chance to affect anything else would be superior.

But if temperatures of everything else remain constant, the choice of a GPU cooler matters only for the GPU itself, because nothing else is measurably affected. At least in your particular setup.....and this makes your new test rig ideal for its purpose.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Fri Dec 21, 2007 3:12 am

MikeC wrote: We're not misreporting the results, in case that's what you're thinking. The simple answer: your reasoning that the CPU must get cooler with the ATI cooler blowing air out is not supported by the results. If we must explain them, I'd point to the fact that the CPU cooler benefits from a push-pull setup with 120mm fans on either side. They pull cool air in from the many open intake paths and direct it out through the single exhaust. Those factors appear to be enough to overcome whatever heat the Zalman fans may blow around in the case.
I think that's the key for you: the push-pull setup. In my own experience with an HR-03, the CPU temperature did go up. Another thing to keep in mind is that ATITool uses some CPU power, reducing the effectiveness of CPUBurn. Since my system is quad core, that effect was lessened compared to your single core.

However, as you say, the effect appears to be fairly minimal if set up right.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Fri Dec 21, 2007 9:17 am

Yes, it is true that it is the most inefficient.

I am concerned, though, about the SAFETY of the cards in terms of VRM and RAM heat dissipation. An x1900xtx running passively with one setup wouldn't mean it's safe with modern setups.

I know it's a torture test sort of thing, but I am not sure that it represents current hardware enough.

djkest
Posts: 766
Joined: Mon Sep 10, 2007 1:05 pm
Location: Colorado, USA

Post by djkest » Fri Dec 21, 2007 11:29 am

I look forward to the S-1 and Slipstream reviews that are coming soon! This is good stuff. I agree, keeping the Pentium D and 1950 are good moves; they really test the thermals. Set the bar high, and when things come in under the bar, so much the better.

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Fri Dec 21, 2007 5:08 pm

~El~Jefe~ wrote:I am concerned, though, about the SAFETY of the cards in terms of VRM and RAM heat dissipation. An x1900xtx running passively with one setup wouldn't mean it's safe with modern setups.
That reminds me, I discovered that a lot of the heat produced by the 8800 Ultra is from the RAM. In fact, at idle in 2D mode it accounts for the majority of the heat. Underclocking the core makes only 1C difference to temps, underclocking RAM makes a 10-12C difference.

That got me thinking about various coolers. It's a shame there is no way to really measure RAM temperature, other than seeing how far you can overclock. I think the reason a lot of coolers are not compatible with the 8800 is insufficient RAM cooling.

Hopefully we will see a move to more efficient GDDR4 RAM and much more dynamic underclocking in 2D mode. In fact, for 2D mode there is no reason why most of it can't be switched off - even on Vista, 128MB is overkill, and 8800s often come with 768MB.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Fri Dec 21, 2007 9:28 pm

That's really crappy. I'm glad, then, that I have GDDR4 memory on my 3870.

That's a lot of heat you are talking about. That can't be good for longevity.

chrispycrunch
Posts: 5
Joined: Thu Aug 16, 2007 5:57 pm

Post by chrispycrunch » Thu Jan 03, 2008 9:13 am

MoJo wrote:
That reminds me, I discovered that a lot of the heat produced by the 8800 Ultra is from the RAM. In fact, at idle in 2D mode it accounts for the majority of the heat. Underclocking the core makes only 1C difference to temps, underclocking RAM makes a 10-12C difference.

That got me thinking about various coolers. It's a shame there is no way to really measure RAM temperature, other than seeing how far you can overclock. I think the reason a lot of coolers are not compatible with the 8800 is insufficient RAM cooling.
On the Palit 8800GT, temps are 55C idle, 90C load (games) w/ fan on auto at 29%. The fan is very quiet.

I use RivaTuner to adjust fan speed as core temps go up.

I, too, tried to underclock the GPU to test for temp changes. It only made a 1C difference. I could also hear some kind of processing 'whine' - very faint, too. That subtle noise wasn't worth the 1C savings.

So, does underclocking a GPU have any benefit whatsoever?
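The ramp-with-temperature behaviour described above (quiet floor at idle, fan speeding up as the core heats) can be sketched as a simple linear fan curve. This is a hypothetical illustration, not RivaTuner's actual algorithm; the thresholds reuse the 55C idle / 90C load / 29% duty figures reported in the post, and a real tool would push the result to the card through the driver rather than just return it.

```python
# Hypothetical fan curve: map GPU core temperature (C) to a fan duty
# cycle (%), holding a quiet floor below the idle temperature and
# ramping linearly up to full speed at the load temperature.
# All threshold values are illustrative, taken from the post above.

IDLE_DUTY = 29    # % -- quiet floor (the "auto - 29%" reported above)
MAX_DUTY = 100    # %
RAMP_START = 55   # C -- idle temp reported above
RAMP_END = 90     # C -- load temp reported above

def fan_duty(temp_c: float) -> int:
    """Return the fan duty cycle (%) for a given core temperature."""
    if temp_c <= RAMP_START:
        return IDLE_DUTY
    if temp_c >= RAMP_END:
        return MAX_DUTY
    # Linear interpolation between the two thresholds.
    frac = (temp_c - RAMP_START) / (RAMP_END - RAMP_START)
    return round(IDLE_DUTY + frac * (MAX_DUTY - IDLE_DUTY))
```

For example, at 62C this curve would ask for about 43% duty, and anything at or above 90C gets full speed.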

MoJo
Posts: 773
Joined: Mon Jan 13, 2003 9:20 am
Location: UK

Post by MoJo » Thu Jan 03, 2008 10:59 am

Underclocking the GPU is basically pointless on 8800 cards, as the power savings are tiny, and I have noticed that it does tend to introduce errors. For example, when browsing web pages and scrolling a lot, occasionally a few pixels of text will be missing with the GPU underclocked. Minor, but annoying, especially if you like Photoshop :)

What does help is underclocking the RAM. Sadly, the built-in nVidia overclocking feature does not let you do this in a sensible way. You can set separate 2D/3D GPU clocks, but the memory clock is always the same for both 2D and 3D. ATITool can do it, but its detection of 3D mode simply does not work for me; I have to give it the executable of every app I want detected. Plus, you get the same occasional pixel errors, and forgetting to put your RAM back up to normal speed before running a game usually crashes it.

However, underclocking the RAM to 200MHz (any lower and it freezes) saved me 10-12C at least. A shame nVidia does not support it properly.

In the end you just have to accept that your 8800 is destroying the planet and your wallet :(
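The per-application 3D detection MoJo describes having to configure by hand in ATITool amounts to a simple lookup: check the running process names against a user-maintained list and pick a memory clock accordingly. The sketch below is hypothetical (the process names and the 3D clock value are made up for illustration, and it is not ATITool's code); actually applying the chosen clock would go through a vendor tool or driver, not this function.

```python
# Hypothetical sketch of per-application 2D/3D memory clock selection:
# if any known 3D app is running, use the full 3D memory clock,
# otherwise drop to the underclocked 2D floor.

KNOWN_3D_APPS = {"game.exe", "benchmark.exe"}  # user-maintained list (illustrative)
CLOCK_2D_MHZ = 200     # underclocked 2D floor reported above
CLOCK_3D_MHZ = 1080    # stock 3D memory clock (illustrative value)

def pick_mem_clock(running_processes):
    """Return the memory clock (MHz) to apply, given running process names."""
    if KNOWN_3D_APPS & set(running_processes):
        return CLOCK_3D_MHZ
    return CLOCK_2D_MHZ
```

This also shows why the approach is fragile, as MoJo notes: any 3D app missing from the list is misdetected as 2D and runs with the RAM underclocked, which typically crashes the game.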

Post Reply