
All times are UTC - 8 hours




[ 18 posts ]
 Post subject: Updated VGA Card/Cooler Test Platform
PostPosted: Thu Dec 20, 2007 1:30 am 
Offline
Site Admin

Joined: Sun Aug 11, 2002 3:26 pm
Posts: 11874
Location: Vancouver, BC, Canada
Updated VGA Card/Cooler Test Platform

_________________
Mike Chin,
Editor/Publisher, SPCR
Support SPCR with your donations!


 Post subject:
PostPosted: Thu Dec 20, 2007 3:20 am 
Offline
*Lifetime Patron*

Joined: Sat Jan 18, 2003 2:19 pm
Posts: 5316
Location: St Louis (county) Missouri USA
I'll admit to not having much experience with hot video cards. But please explain how the CPU in your new system, with its fan at a fixed voltage, can remain at the same temperature with the ATI cooler blowing heat directly out of the case through its own duct, as opposed to both Zalman coolers, which just blow much of the heat around inside the case?

You would think while the ATI doesn't cool the video card as well, the CPU temperature would benefit. Am I missing something here?

_________________
"At the core of liberalism is the spoiled child - miserable, as all spoiled children are, unsatisfied, demanding, ill disciplined, despotic, and useless. Liberalism is the philosophy of sniveling brats." - P.J. O'Rourke


 Post subject:
PostPosted: Thu Dec 20, 2007 7:29 am 
Offline
*Lifetime Patron*

Joined: Wed Jan 12, 2005 10:47 am
Posts: 1507
Location: Bucharest, Romania
I found one reference to a 1,200 rpm Slipstream and two to an 800 rpm one -- I mean the fan swapped into the S12-600.
Which one is it? BTW, I wouldn't dare to use an 800 rpm Slipstream at 5V in my S12E+650.
But I am really considering a fan swap with the 1,200 rpm one, and having it temperature controlled.
Does the Slipstream fan start at 3.8V, which IIRC is the starting voltage of the S12E+650?


 Post subject:
PostPosted: Thu Dec 20, 2007 8:05 am 
Offline
Site Admin

Joined: Sun Aug 11, 2002 3:26 pm
Posts: 11874
Location: Vancouver, BC, Canada
Bluefront wrote:
I'll admit to not having much experience with hot video cards. But please explain how the CPU in your new system, with its fan at a fixed voltage, can remain at the same temperature with the ATI cooler blowing heat directly out of the case through its own duct, as opposed to both Zalman coolers, which just blow much of the heat around inside the case?

You would think while the ATI doesn't cool the video card as well, the CPU temperature would benefit. Am I missing something here?

We're not misreporting the results, in case that's what you're thinking. The simple answer: your reasoning that the CPU must get cooler with the ATI cooler blowing air out is not supported by the results. If we must explain them, I'd point to the fact that the CPU cooler benefits from a push-pull setup with 120mm fans on either side. They pull air toward the one exhaust from the many open paths for cool intake airflow. Those factors appear to be enough to overcome whatever heat the Zalman fans may blow around in the case.

Note that even in the old VGA test bed, when ATITool only was applied to the X1950XTX with the Zalman 1000, the CPU temp stayed steady at 43C regardless of the cooler's fan speed or GPU temp.

Another point: when the CPU and GPU are stressed simultaneously, as they were here, the total power drawn by the system goes up by almost 10% (compared to when just the GPU is stressed), yet the GPU appears to be a bit less stressed and the CPU a bit more. I.e., this suggests the GPU's contribution to the heat in the system is smaller, hence its hot-air "wash" is less significant to the CPU.
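A back-of-envelope sketch of that share argument, with made-up wattages (the per-component split above isn't published, so both figures here are illustrative assumptions, not SPCR's measurements):

```python
# Hypothetical wattages, for illustration only: if adding CPU stress
# raises the total system draw by ~10% while the GPU's own draw stays
# roughly constant, the GPU's *share* of the system heat shrinks.
gpu_only_total = 200.0               # assumed system draw, GPU stress only (W)
both_total = gpu_only_total * 1.10   # ~10% higher with CPU stressed too

gpu_heat = 120.0                     # assumed GPU contribution (W)
share_gpu_only = gpu_heat / gpu_only_total
share_both = gpu_heat / both_total

print(f"GPU share of heat, GPU-only stress: {share_gpu_only:.0%}")  # 60%
print(f"GPU share of heat, both stressed:   {share_both:.0%}")      # 55%
```

So even with the GPU dissipating the same heat, its hot-air "wash" becomes a smaller fraction of what the case has to move.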

_________________
Mike Chin,
Editor/Publisher, SPCR
Support SPCR with your donations!


 Post subject:
PostPosted: Thu Dec 20, 2007 8:32 am 
Offline
Site Admin

Joined: Sun Aug 11, 2002 3:26 pm
Posts: 11874
Location: Vancouver, BC, Canada
Tzupy wrote:
I found one reference to a 1,200 rpm Slipstream and two to an 800 rpm one -- I mean the fan swapped into the S12-600.
Which one is it? BTW, I wouldn't dare to use an 800 rpm Slipstream at 5V in my S12E+650.
But I am really considering a fan swap with the 1,200 rpm one, and having it temperature controlled.
Does the Slipstream fan start at 3.8V, which IIRC is the starting voltage of the S12E+650?

Sorry, fixed the error. It's an 800, running at 5V -- hardwired. It barely starts, spinning at just 400 rpm. As reported in the article, the PSU exhaust air feels only a touch warm even at the end of long test sessions, but it's possible we'll end up injuring or killing the PSU as we did the NeoHE. We'll see.

As for your fan swap, you're safe with a 1200: our samples start at ~2.8V, at around 500 rpm. By ~3.8V, the speed is ~600 rpm.
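A rough interpolation between those two measured points (~500 rpm at ~2.8V, ~600 rpm at ~3.8V), assuming rpm rises roughly linearly with voltage over that range -- a simplification, since real fan curves aren't perfectly linear:

```python
def est_rpm(volts, v1=2.8, r1=500.0, v2=3.8, r2=600.0):
    """Estimate fan speed by linear interpolation between two
    (voltage, rpm) measurements."""
    return r1 + (volts - v1) * (r2 - r1) / (v2 - v1)

print(est_rpm(3.3))  # midway between the two points: 550.0
```

Handy for guessing what a temperature-controlled circuit will deliver at intermediate voltages, though a tachometer reading is the only real answer.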

_________________
Mike Chin,
Editor/Publisher, SPCR
Support SPCR with your donations!


Last edited by MikeC on Thu Dec 20, 2007 10:01 am, edited 1 time in total.

 Post subject:
PostPosted: Thu Dec 20, 2007 9:47 am 
Offline
*Lifetime Patron*

Joined: Wed Jan 12, 2005 10:47 am
Posts: 1507
Location: Bucharest, Romania
Good to know that the Slipstream starts so low, thanks MikeC!
They are still not available in my country. :evil: <- at the Scythe importer!


 Post subject:
PostPosted: Thu Dec 20, 2007 7:24 pm 
Offline

Joined: Sun Oct 09, 2005 7:53 pm
Posts: 81
I hadn't come across those Slip Stream fans before. They look really good according to the manufacturer's site. How are they in reality? Do I want to go out and replace my Nexus?


 Post subject:
PostPosted: Thu Dec 20, 2007 10:18 pm 
Offline
Site Admin

Joined: Sun Aug 11, 2002 3:26 pm
Posts: 11874
Location: Vancouver, BC, Canada
Filias Cupio wrote:
I hadn't come across those Slip Stream fans before. They look really good according to the manufacturer's site. How are they in reality? Do I want to go out and replace my Nexus?

If you're happy with your Nexus fans, don't worry; the difference is not night and day. But if you're looking for new or quieter fans, the Slip Streams are the ones to try. Almost no tonality.

_________________
Mike Chin,
Editor/Publisher, SPCR
Support SPCR with your donations!


 Post subject:
PostPosted: Thu Dec 20, 2007 11:45 pm 
Offline

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2764
Location: NEW YORK WORD AND STUFF YEAH OK
I have to say that your video card choice is two years old.

8800GT 512
3870 512

Or, 8800GTS 640 if you want to go sort-of older but new generation.

An X1900 is lying dismantled on a table near me.

Even an 8800GTX 768 would be an interesting standard. It would be like your inefficient and not-so-silent Pentium D chip.

The X1900 series cards are not even produced anymore (to my knowledge).

Side note about the Slip Streams: I see four speeds listed. Are there any that people should stay away from? I remember Papst had various 120mm models that were noisy/high-powered and two that you could easily quiet by undervolting.


 Post subject:
PostPosted: Fri Dec 21, 2007 12:06 am 
Offline
SPCR Reviewer

Joined: Fri Mar 04, 2005 9:07 pm
Posts: 1029
Location: Vancouver
~El~Jefe~ wrote:
I have to say that your video card choice is two years old.
8800GT 512
3870 512

Or, 8800GTS 640 if you want to go sort-of older but new generation.

An X1900 is lying dismantled on a table near me.

Even an 8800GTX 768 would be an interesting standard. It would be like your inefficient and not-so-silent Pentium D chip.


That's sort of the point. The X1950XTX is an old, extremely inefficient card - from the power consumption figures I've read, the only cards that are more power hungry are the HD2900XT, 8800GTX/Ultra, and 7950 GX2. None of these cards are more compatible with aftermarket coolers than the X1950XTX though. In this case old technology is the best choice.

Power inefficiency is the same reason a Pentium D sits on our CPU heatsink test platform. We could use a Core 2 Quad, but none of them have a higher TDP, so there's really no point.

Also, new hardware is expensive... if anyone wants to donate some $400+ hardware, please speak up. :lol:


 Post subject:
PostPosted: Fri Dec 21, 2007 2:48 am 
Offline
*Lifetime Patron*

Joined: Sat Jan 18, 2003 2:19 pm
Posts: 5316
Location: St Louis (county) Missouri USA
MikeC......thanks for the explanation of the CPU/GPU heat thing. I suppose this is another of those "you gotta be there" things. On paper one would suspect removing GPU heat before it has a chance to affect anything else....would be superior.

But if temperatures of everything else remain constant.....the choice of a GPU cooler matters only for the GPU itself, because nothing else is measurably affected. At least in your particular setup.....and this makes your new test rig ideal for its purpose.

_________________
"At the core of liberalism is the spoiled child - miserable, as all spoiled children are, unsatisfied, demanding, ill disciplined, despotic, and useless. Liberalism is the philosophy of sniveling brats." - P.J. O'Rourke


 Post subject:
PostPosted: Fri Dec 21, 2007 3:12 am 
Offline

Joined: Mon Jan 13, 2003 9:20 am
Posts: 747
Location: UK
MikeC wrote:
We're not misreporting the results, in case that's what you're thinking. The simple answer: your reasoning that the CPU must get cooler with the ATI cooler blowing air out is not supported by the results. If we must explain them, I'd point to the fact that the CPU cooler benefits from a push-pull setup with 120mm fans on either side. They pull air toward the one exhaust from the many open paths for cool intake airflow. Those factors appear to be enough to overcome whatever heat the Zalman fans may blow around in the case.


I think that's the key for you: the push-pull setup. In my own experience with an HR-03, the CPU temperature did go up. Another thing to keep in mind is that ATITool uses some CPU power, reducing the effectiveness of CPUBurn. Since my system is quad core, that effect was lessened compared to your single core.

However, as you say, the effect appears to be fairly minimal if set up right.

_________________
http://world3.net


 Post subject:
PostPosted: Fri Dec 21, 2007 9:17 am 
Offline

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2764
Location: NEW YORK WORD AND STUFF YEAH OK
Yes, it is true that it is the most inefficient.

I am concerned, though, about the SAFETY of the cards in terms of VRM and RAM heat dissipation. An X1900XTX running passively in one setup wouldn't mean it's safe with modern setups.

I know it's a torture-test sort of thing, but I am not sure it represents current hardware well enough.


 Post subject:
PostPosted: Fri Dec 21, 2007 11:29 am 
Offline

Joined: Mon Sep 10, 2007 1:05 pm
Posts: 759
Location: Colorado, USA
I look forward to the S-1 and Slip Stream reviews that are coming soon! This is good stuff. I agree, keeping the Pentium D and 1950 is a good move; they really test the thermals. Set the bar high, and when things come in under the bar, so much the better.

_________________
Gaming HTPC: Antec NSK-2480/ Antec EW430 Bronze/ i5-2400/ MSI H67/ Ninja-Mini/ 4GB DDR3/ 500GB WD Sata 3.0/ XFX HD6850/ Windows 7 x64/ Toshiba 46" 1080p LED/LCD TV


 Post subject:
PostPosted: Fri Dec 21, 2007 5:08 pm 
Offline

Joined: Mon Jan 13, 2003 9:20 am
Posts: 747
Location: UK
~El~Jefe~ wrote:
I am concerned, though, about the SAFETY of the cards in terms of VRM and RAM heat dissipation. An X1900XTX running passively in one setup wouldn't mean it's safe with modern setups.


That reminds me, I discovered that a lot of the heat produced by the 8800 Ultra is from the RAM. In fact, at idle in 2D mode it accounts for the majority of the heat. Underclocking the core makes only a 1C difference to temps; underclocking the RAM makes a 10-12C difference.

That got me thinking about various coolers. It's a shame there is no way to really measure RAM temperature, other than seeing how far you can overclock. I think the reason a lot of coolers are not compatible with the 8800 is insufficient RAM cooling.

Hopefully we will see a move to more efficient GDDR4 RAM and much more dynamic underclocking in 2D mode. In fact, in 2D mode there is no reason why most of it can't be switched off: even on Vista, 128MB is overkill, and 8800s often come with 768MB.

_________________
http://world3.net


 Post subject:
PostPosted: Fri Dec 21, 2007 9:28 pm 
Offline

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2764
Location: NEW YORK WORD AND STUFF YEAH OK
That's really crappy. I'm glad, then, that I have GDDR4 memory on my 3870.

That's a lot of heat you are talking about. That can't be good for longevity.


 Post subject:
PostPosted: Thu Jan 03, 2008 9:13 am 
Offline

Joined: Thu Aug 16, 2007 5:57 pm
Posts: 5
MoJo wrote:

That reminds me, I discovered that a lot of the heat produced by the 8800 Ultra is from the RAM. In fact, at idle in 2D mode it accounts for the majority of the heat. Underclocking the core makes only a 1C difference to temps; underclocking the RAM makes a 10-12C difference.

That got me thinking about various coolers. It's a shame there is no way to really measure RAM temperature, other than seeing how far you can overclock. I think the reason a lot of coolers are not compatible with the 8800 is insufficient RAM cooling.



On the Palit 8800GT, temps are 55C idle, 90C load (games) w/ fan on auto - 29%. Fan is very quiet.

I use riva tuner to adjust fan speed as core temps go up.

I, too, tried to underclock the GPU to test for temp changes. It only made a 1C difference. I could also hear some kind of processing 'whine', very faint. That subtle noise wasn't worth the 1C savings.

So, does underclocking a GPU have any benefit whatsoever?

_________________
Intel E6750 | Gigabyte DS3R F9, Antec Sonata 3 500W | OCZ XTC Rev 2 2GB | Palit 8800GT 512MB (O/C to 675/1000), 226BW

AMD Athlon 64 3000+ |Asus K8N-E Deluxe | OCZ 1GB | Antec Sonata 1 | 930B


 Post subject:
PostPosted: Thu Jan 03, 2008 10:59 am 
Offline

Joined: Mon Jan 13, 2003 9:20 am
Posts: 747
Location: UK
Underclocking the GPU is basically pointless on 8800 cards, as the power savings are tiny and I have noticed that it does tend to introduce errors. For example, when browsing web pages and scrolling a lot, occasionally a few pixels of text will be missing with the GPU underclocked. Minor, but annoying, especially if you like Photoshop :)

What does help is underclocking the RAM. Sadly, the built-in nVidia overclocking feature does not let you do this in a sensible way: you can set separate 2D/3D GPU clocks, but the memory clock is always the same for both. ATITool can do it, but its detection of 3D mode simply does not work for me; I have to give it the executable of every app I want detected. Plus, you get the same occasional pixel errors, and forgetting to put your RAM back up to normal speed before running a game usually crashes it.

However, underclocking the RAM to 200MHz (lower and it freezes) saved me 10-12C at least. Shame nVidia does not support it properly.
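As a rough sanity check on why such a large clock drop saves so much heat: dynamic power in memory scales roughly linearly with clock, all else being equal. The nominal clock below is an assumed round figure for illustration, not a measured spec:

```python
# Back-of-envelope: if dynamic power scales ~linearly with clock,
# dropping the memory from an assumed nominal 1000 MHz to 200 MHz
# removes about 80% of its dynamic power. Illustrative figures only;
# static/leakage power and voltage effects are ignored.
nominal_mhz = 1000.0
underclocked_mhz = 200.0
saving = 1.0 - underclocked_mhz / nominal_mhz
print(f"approximate dynamic-power saving: {saving:.0%}")
```

An ~80% cut in the memory's dynamic power is consistent with a double-digit temperature drop when the RAM is the dominant idle heat source.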

In the end you just have to accept that your 8800 is destroying the planet and your wallet :(

_________________
http://world3.net


Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group