GF 6600GT AGP - My cooling adventures

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

SometimesWarrior
Patron of SPCR
Posts: 700
Joined: Thu Mar 13, 2003 2:38 pm
Location: California, US

Post by SometimesWarrior » Tue Jan 18, 2005 1:36 am

Edward Ng wrote:Er I don't have any measurements on 6800; never even touched or used one before. I've got a 6800GT and a 6600GT, and the 6800GT is water cooled.

Wow, that 11nm is seriously leaky. :(
Wouldn't that extra current draw simply be manifested as heat? I mean, if a processor draws 50 watts, it's gonna generate 50 watts of heat. It's not doing any heavy lifting; those electrons are pretty easy to carry back and forth across the processor. :)

Xbitlabs says the 6800 core runs at 1.22V, whereas the 6600GT runs at ~1.48V. Also, the 6800 runs at 350MHz, the 6600GT at 500MHz. To balance that, the 6600GT has 8 pixel pipes and 3 vertex, versus the 6800's 12 and 6. So, let's do some algebra:

(1.48^2 / 1.22^2) * (500 / 350) * ( [8+3] / [12+6] ) = 1.28 times more power for 6600GT.

Now, we compare the load power of the 6600GT to the 6800, according to Xbitlabs' figures: 48.9W / 38.9W = 1.23. The closeness could be a coincidence, because I made a lot of nonsense assumptions in my calculation. But the two numbers are close!
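For anyone who wants to fiddle with the numbers, here's a quick Python sketch of that back-of-envelope estimate. It assumes dynamic power scales as V² × clock × active units (with pipe counts standing in for "units"), which ignores leakage and all the real differences between the two chips:

```python
# Back-of-envelope dynamic power scaling: P ~ V^2 * f * (active units).
# Voltages/clocks are the Xbitlabs figures from the post above.
def power_ratio(v_a, f_a, units_a, v_b, f_b, units_b):
    """Estimated ratio of card A's power to card B's,
    assuming power scales with V^2 * clock * unit count."""
    return (v_a / v_b) ** 2 * (f_a / f_b) * (units_a / units_b)

# 6600GT: 1.48 V, 500 MHz, 8 pixel + 3 vertex pipes
# 6800:   1.22 V, 350 MHz, 12 pixel + 6 vertex pipes
ratio = power_ratio(1.48, 500, 8 + 3, 1.22, 350, 12 + 6)
print(round(ratio, 2))  # -> 1.28
```

Which matches the 1.28 figure above; whether that's meaningful or a coincidence is another matter.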

So, faced with all this evidence that the 50W figure is accurate, how can Ed's fanless Aerocool VM-101 work fine for the GPU, when it's unlikely that a CPU anywhere near 50W would function with the same cooling setup? (...or would it?) The lower-speed Athlon 64's pull under 50W with CPUBurn, but I doubt a fanless videocard cooler would work on an A64, even if you could figure out how to mount it. Midrange graphics cards pulling 40-50W often come with something that more closely resembles a northbridge cooler than a CPU cooler. Maybe it's simply because GPU's are okay with higher temperatures than CPU's... or maybe there's some kind of "conspiracy of science" going on here! :shock: :lol:
Last edited by SometimesWarrior on Tue Jan 18, 2005 10:22 am, edited 1 time in total.

meglamaniac
Posts: 380
Joined: Thu Jul 15, 2004 12:44 pm
Location: UK

Post by meglamaniac » Tue Jan 18, 2005 1:59 am

I agree with your power draw sums, however:
SometimesWarrior wrote:Wouldn't that extra current draw simply be manifested as heat? I mean, if a processor draws 50 watts, it's gonna generate 50 watts of heat.
This isn't true.
Heat is usually generated by two processes in microchips.
Firstly, there is good old fashioned friction. Those transistor gates opening and shutting so fast build up a fair bit of heat - they may be small, but there's a lot of them.
Secondly, electron leakage. To some extent this is caused by the first, as it gets worse the hotter the chip is. Electrons "jump" circuits to where they should not be, and one thing you learn in physics is that any electron jump is always from a high potential plane to a lower one - which means a net loss of energy. Where does that energy go? Yep, you guessed it.
This is also the reason for processor instability at high temperatures. There is a temperature range between stable operation and permanent damage to the structure of the silicon where enough electron leakage occurs to disrupt the calculations the processor makes, so the processor starts making errors and programs crash.

Anyway, if a processor is drawing 50W of power and giving off 50W of heat then you've not got a processor, you've got the world's most efficient electric heater!
Even electric heaters don't have 100% efficiency at converting electricity to heat, so if a processor was doing it...!

It's easy to get power draw and heat dissipation figures confused, I guess. I would bet a week's wages the 6600GT doesn't give off anywhere near 50W of heat energy.

Edward Ng
SPCR Reviewer
Posts: 2696
Joined: Thu Dec 11, 2003 9:53 pm
Location: Scarsdale, NY

Post by Edward Ng » Tue Jan 18, 2005 3:33 am

It's unfortunate that we can't get a precise measurement of how much of the consumed power is actually wasted by each of those chips in the form of heat. Well, it's a shame we can't get a truly accurate measure of this figure for any chip, for that matter. Our only choice is to go with what the manufacturers give us and then try to guesstimate accurately... :roll:

-Ed

ChrisH
Friend of SPCR
Posts: 65
Joined: Wed Oct 30, 2002 10:44 am
Location: Charlotte, NC USA - Go Panthers!

Post by ChrisH » Tue Jan 18, 2005 9:40 am

Edward Ng wrote:Wow, that 11nm is seriously leaky. :(
I think you meant to say 110nm. I wasn't going to say anything, but you've used the wrong measurement several times.

SometimesWarrior
Patron of SPCR
Posts: 700
Joined: Thu Mar 13, 2003 2:38 pm
Location: California, US

Post by SometimesWarrior » Tue Jan 18, 2005 11:00 am

meglamaniac wrote:Electrons "jump" circuits to where they should not be, and one thing you learn in physics is that any electron jump is always from a high potential plane to a lower one - which means a net loss of energy. Where does that energy go? Yep, you guessed it.
Thank you for correcting me before I managed to misinform too many people. :)

But when you say "Yep, you guessed it"... I still haven't guessed right. Where does that energy go? Light? Radiation? I'm assuming that whatever power space heaters don't turn into heat gets converted to light. I'll fire off a question to my professor, too.

meglamaniac
Posts: 380
Joined: Thu Jul 15, 2004 12:44 pm
Location: UK

Post by meglamaniac » Tue Jan 18, 2005 11:05 am

Short answer: heat.

Long answer: It's radiated. What it's radiated as depends on the energy difference between the new plane and the old plane, and a few other factors that determine the wavelength of the radiation.
Usually it'll be heat or low-spectrum (i.e. red) light.

[edit]
Just thought I'd point out that in other cases (obviously not electronics!) the same process produces rather more dangerous results - such as x-rays or gamma radiation (but not alpha or beta - those are particle radiation, whereas gamma is EM radiation).
Last edited by meglamaniac on Tue Jan 18, 2005 2:54 pm, edited 2 times in total.

nbac
Patron of SPCR
Posts: 142
Joined: Sat Jul 05, 2003 11:27 am
Location: Sweden

Post by nbac » Tue Jan 18, 2005 1:56 pm

The 50W power budget also includes the RAM - probably 5 to 10W, depending on the number of RAM chips and the clock rate.

Ian Brumby
Posts: 12
Joined: Wed Jan 05, 2005 10:12 pm
Location: Canberra, Australia

Post by Ian Brumby » Tue Jan 18, 2005 4:22 pm

Interesting calculations.

The 6800 runs at 325 MHz, not 350 MHz. Also, would it be better to compare transistor counts rather than pixel pipes?

6600 (NV43) - 143 or 146 million
6800 (NV40/NV41) - 220 or 222 million

Plugging these transistor counts (and the corrected 325 MHz clock) into the earlier formula, the 6600GT should require 47% more power!
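The same back-of-envelope scaling as earlier in the thread (V² × clock × unit count), just swapping in the corrected clock and transistor counts in place of pipe counts - a rough Python sketch, not a real power model:

```python
# Same naive scaling as before: P ~ V^2 * f * (unit count),
# but with transistor counts as the "units" and the corrected 325 MHz clock.
def power_ratio(v_a, f_a, n_a, v_b, f_b, n_b):
    """Estimated ratio of card A's power to card B's."""
    return (v_a / v_b) ** 2 * (f_a / f_b) * (n_a / n_b)

# 6600GT: 1.48 V, 500 MHz, ~143 million transistors
# 6800:   1.22 V, 325 MHz, ~220 million transistors
ratio = power_ratio(1.48, 500, 143, 1.22, 325, 220)
print(round(ratio, 2))  # -> 1.47, i.e. ~47% more power for the 6600GT
```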

Also, an AGP slot provides 42W, and the AGP 6600 GT requires an extra molex connection, which indicates it needs more than 42W. PCI Express can provide 75W, and the PCI Express 6600 GT does not have an extra connector, which indicates it needs less than 75W.

EDIT: I notice some 6800's have an extra power connector, and some don't. I guess this indicates it's borderline on 42W.
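That connector logic boils down to a simple threshold check. A toy sketch, using the 42W/75W slot budgets quoted above and Xbitlabs' ~49W load figure for the 6600GT:

```python
# If a card draws more than the slot can deliver, it needs an aux
# power connector. Slot budgets are the figures quoted in this thread.
SLOT_BUDGET_W = {"AGP": 42, "PCIe": 75}

def needs_aux_connector(draw_watts, slot):
    """True if the card's draw exceeds the slot's power budget."""
    return draw_watts > SLOT_BUDGET_W[slot]

print(needs_aux_connector(49, "AGP"))   # -> True: AGP 6600GT ships with a molex
print(needs_aux_connector(49, "PCIe"))  # -> False: the PCIe version has none
```

A card hovering right around its slot's budget (like the 6800 on AGP) is exactly the borderline case where some boards get a connector and some don't.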

Also on Tom's today they talk about the new record power requirements of Intel's upcoming dual core processors and mention the "switch from a 90 nm to a 65 nm processor manufacturing process brings an increase in leakage current". Why is this? I thought it would improve power consumption stats.
Last edited by Ian Brumby on Tue Jan 18, 2005 4:52 pm, edited 1 time in total.

Edward Ng
SPCR Reviewer
Posts: 2696
Joined: Thu Dec 11, 2003 9:53 pm
Location: Scarsdale, NY

Post by Edward Ng » Tue Jan 18, 2005 4:36 pm

ChrisH wrote:
Edward Ng wrote:Wow, that 11nm is seriously leaky. :(
I think you meant to say 110nm. I wasn't going to say anything, but you've used the wrong measurement several times.
You're right. I'm actually more used to 0.11 micron, but I can't type the mu symbol directly on my diNovo, and didn't know the right conversion factor to nanometers.

meglamaniac
Posts: 380
Joined: Thu Jul 15, 2004 12:44 pm
Location: UK

Post by meglamaniac » Wed Jan 19, 2005 12:26 am

Ian Brumby wrote:Also on Tom's today they talk about the new record power requirements of Intel's upcoming dual core processors and mention the "switch from a 90 nm to a 65 nm processor manufacturing process brings an increase in leakage current". Why is this? I thought it would improve power consumption stats.
There was a lot said about electron leakage when the move to 90nm was first publicised. This goes back to what I was saying earlier. The basic physics of it is that the closer together you put the circuits, the less energy is required for an electron to "jump", so the more leakage there is. The ways to combat this include keeping the processor cooler (haha, yeah right Intel!), using a lower voltage, or coming up with a technology to make it harder for electrons to leak.

Technically, shrinking the processor should help keep the temperatures down, but this is becoming harder and harder the smaller you make them, as leakage becomes exponentially worse. Instead, IBM developed SOI fabrication, which falls under the third option; AMD have since licensed it, and it is used in Athlon 64s, and may be in later Athlon XPs (aka Sempron). I'm not sure what Intel's status on this is, although last I heard they had plans to use it.

spacey
Posts: 81
Joined: Mon Aug 11, 2003 10:31 am
Location: Ontario, Canada

Post by spacey » Wed Jan 19, 2005 2:17 pm

Edward Ng wrote:OMFG IS THAT VIDEO CARD SITTING RIGHT ON THE CARPET!?@#$%

NOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!@#$%$!@#$%@#$%
Hrm, I've never thought about that. Does it really hold a lot of charge? I've never shocked myself touching the carpet. But I will avoid putting parts on the ground in the future - it's just dirty.

Ian Brumby
Posts: 12
Joined: Wed Jan 05, 2005 10:12 pm
Location: Canberra, Australia

Post by Ian Brumby » Thu Jan 20, 2005 12:03 am

I've just found this link.

Zalman stating the ZM-80 series does not work with the 6600 GT AGP.
Zalman stating that the VF700-CU does work with the 6600 GT AGP (at least on some cards)

They show a modded Albatron card.

http://www.zalman.co.kr/mboard/mboard/m ... er_da=desc

Larsp
Posts: 11
Joined: Wed Sep 22, 2004 2:33 pm

Post by Larsp » Tue Jan 25, 2005 4:33 am

meglamaniac wrote:
SometimesWarrior wrote:Wouldn't that extra current draw simply be manifested as heat? I mean, if a processor draws 50 watts, it's gonna generate 50 watts of heat.
This isn't true.
Are you sure about this, meglamaniac? According to the fundamental law of energy conservation, the energy that goes in must come out again.

If the card takes 50 W and spends a negligible amount of power communicating with the monitor and the motherboard, then the 50 W has nowhere else to go than heat/radiation. I don't know how much radio-frequency radiation the card makes, but if it were much above 1 W, I doubt the card would pass regulations (though I don't know). So nearly all 50 W would be dissipated as heat, no matter what. Right?
meglamaniac wrote:Even electric heaters don't have 100% efficiency at converting electricity to heat, so if a processor was doing it...!
Hmm... What do you mean? How could an electric heater have below 100% efficiency at converting electricity to heat? What other kind of energy would come out of the heater? Remember that radiation ultimately converts to heat, too.

meglamaniac
Posts: 380
Joined: Thu Jul 15, 2004 12:44 pm
Location: UK

Post by meglamaniac » Tue Jan 25, 2005 5:08 am

As I've already said in another thread where the same point was made, I don't know what I was thinking, heh.
Ignore that post, it was a brain freeze day.

:)

Larsp
Posts: 11
Joined: Wed Sep 22, 2004 2:33 pm

Post by Larsp » Tue Jan 25, 2005 2:08 pm

meglamaniac wrote:Ignore that post, it was a brain freeze day
No problem! :D
