power consumption: 2600 Pro/XT and 8600 GT/GTS

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

Juventas
Posts: 30
Joined: Wed Mar 07, 2007 10:25 pm
Location: Canada

power consumption: 2600 Pro/XT and 8600 GT/GTS

Post by Juventas » Sun Jul 15, 2007 9:35 pm

These cards are obviously marketed as direct competitors. They benchmark in the same ballpark for DX10, but I've been unable to find a direct comparison of power consumption. The best article I've found so far compares the 2600 XT to seemingly every card except its direct competitors. Go figure.

http://www.techpowerup.com/reviews/ATI/HD_2600_XT/17

Update: Found another article with everything but the 2600 Pro. Generally it seems they're close enough that there's probably a larger difference between card manufacturers and RAM types than there is between these GPUs. kater's Hardspell article would seem to concur.

http://www.behardware.com/articles/675- ... -2400.html
Last edited by Juventas on Mon Jul 16, 2007 12:15 am, edited 1 time in total.

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska
Contact:

Post by Aris » Sun Jul 15, 2007 10:05 pm

Yeah, and it did it with total system power draw, so it's completely useless data if you find an 8600 somewhere else with that information. I hate when reviewers do that.


I'm not happy with the 8600s at all. They're basically just DX10 7600s that draw 10 more watts. What a bunch of junk. Hopefully Nvidia turns it around and does it right on the next-gen parts.

kater
Posts: 891
Joined: Thu Sep 07, 2006 11:20 pm
Location: Poland

Post by kater » Sun Jul 15, 2007 11:48 pm

According to Xbit Labs, the 8600 GTS goes up to 47W under load and idles at 21W - not bad at all. Techpowerup says a system with a 2600 XT needs just as many watts as a system with an X1950 XTX, and the X1950 XTX needs, according to Xbit, 33W in idle mode. Techpowerup also says a system with a 2600 XT takes 7W less than a system with an X1800 GTO, and the X1800 GTO needs 48W at full load.

So either my calculations don't add up (I wouldn't be surprised given my school record, especially with regard to algebra), or the 2600 XT is one strange card, idling at 33W and topping out at some 41W.

Here's where I take the Xbit data from - linky. Also, check out this article on Hardspell - linky - for some reason the site opens once in 10 times and shows pics very unwillingly. Maybe you'll have more luck ;) I was only able to open the standby power consumption graph.
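To make that reasoning easier to follow, here it is as a little Python sketch. The wattages are the ones quoted above; the variable names are mine, and the whole thing rests on the (big!) assumption that the Xbit and Techpowerup testbeds are comparable enough to subtract across:

Code: Select all

# Card-only figures from Xbit Labs (as quoted above)
x1950xtx_idle = 33   # W, X1950 XTX at idle
x1800gto_load = 48   # W, X1800 GTO at full load

# Techpowerup: a system with the 2600 XT idles at the same total
# draw as a system with the X1950 XTX, so same card-only idle draw.
hd2600xt_idle = x1950xtx_idle        # ~33 W

# Techpowerup: under load the 2600 XT system draws 7 W less than
# the X1800 GTO system, so knock 7 W off the GTO's card figure.
hd2600xt_load = x1800gto_load - 7    # ~41 W

print(hd2600xt_idle, hd2600xt_load)  # 33 41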

Juventas
Posts: 30
Joined: Wed Mar 07, 2007 10:25 pm
Location: Canada

Post by Juventas » Mon Jul 16, 2007 12:06 am

Aris wrote:Yeah, and it did it with total system power draw, so it's completely useless data if you find an 8600 somewhere else with that information. I hate when reviewers do that.
As long as the testbed is the same, it's fine with me. It's actually more accurate to compare total system power, since the video cards may offload processing to the CPU differently, or use system RAM in different ways.
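To illustrate with made-up numbers (nothing from any review), here's a quick sketch of why subtracting an assumed baseline can mislead when a driver shifts work onto the CPU:

Code: Select all

# Hypothetical numbers, purely for illustration
baseline      = 120  # W, assumed testbed draw without a card
system_card_a = 190  # W, total draw under load with card A
system_card_b = 190  # W, total draw under load with card B

# Naive "card-only" figures come out identical...
print(system_card_a - baseline)  # 70
print(system_card_b - baseline)  # 70

# ...but if card B's driver offloads ~10 W of work to the CPU, its
# GPU really draws only ~60 W. The total is what you actually pay
# for, which is why comparing totals on a fixed testbed is fair.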
Aris wrote:I'm not happy with the 8600s at all. They're basically just DX10 7600s that draw 10 more watts. What a bunch of junk. Hopefully Nvidia turns it around and does it right on the next-gen parts.
It's true: if you're not interested in DX10 features or HD video, there's little reason to bother with these cards. You get better bang per watt (and per buck) with most of the previous series.

Juventas
Posts: 30
Joined: Wed Mar 07, 2007 10:25 pm
Location: Canada

Post by Juventas » Mon Jul 16, 2007 12:22 am

kater wrote:or the 2600 XT is one strange card, idling at 33W and topping out at some 41W.
I have noticed that the 2600s differ little between idle and load compared to other cards. Unfortunately, the idle figures aren't very low for its class.
kater wrote: - for some reason the site opens once in 10 times and shows pics very unwillingly. Maybe you'll have more luck ;) I was only able to open the standby power consumption graph.
Same here. But it is something. Also, the X-bit link doesn't go to a specific article for me.

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Mon Jul 16, 2007 12:30 am

Here you are:
http://www.computerbase.de/artikel/hard ... gsaufnahme

Whole-system figures; (3) = GDDR3, (4) = GDDR4. Voltage/frequency reduction is defective on the GDDR4 version.

kater
Posts: 891
Joined: Thu Sep 07, 2006 11:20 pm
Location: Poland

Post by kater » Mon Jul 16, 2007 12:33 am

Juventas wrote:Also, the X-bit link doesn't go to a specific article for me.
Yeah, it's because I used several of them - each gives you power consumption figures for various cards. Just click away and look for the sections marked "power consumption" etc.

So it seems the 2600 XT is indeed a strange one in terms of power. Hmmm.

Juventas
Posts: 30
Joined: Wed Mar 07, 2007 10:25 pm
Location: Canada

Post by Juventas » Mon Jul 16, 2007 1:03 am

kater wrote:
Juventas wrote:Also, the X-bit link doesn't go to a specific article for me.
Yeah, it's because I used several of them - each gives you power consumption figures for various cards. Just click away and look for the sections marked "power consumption" etc.
Call me dense, but I'm still not finding them. :p I even used their search box; the phrase "power consumption" only came up in video articles up to 2006.

kater
Posts: 891
Joined: Thu Sep 07, 2006 11:20 pm
Location: Poland

Post by kater » Mon Jul 16, 2007 1:13 am

Link 1

Link 2

Link 3

When I search their site I usually put in the name of the gfx card plus "power consumption" and get some hits, then narrow it down to "Video articles".

Just to clarify - I didn't mean Xbit provided any power numbers for the 2600 XT. I just wanted to show how I arrived at those figures: by taking the power consumption of other cards like the X1800 GTO, and comparing the card-only numbers measured by Xbit against the total system numbers measured by Techpowerup. Sorry if I've confused anyone. Happens to me every so often... Ah, us geniuses have it hard - nobody understands us ;)

Juventas
Posts: 30
Joined: Wed Mar 07, 2007 10:25 pm
Location: Canada

Post by Juventas » Mon Jul 16, 2007 1:19 am

jojo4u wrote:Here you are:
http://www.computerbase.de/artikel/hard ... gsaufnahme

Whole-system figures; (3) = GDDR3, (4) = GDDR4. Voltage/frequency reduction is defective on the GDDR4 version.
Thanks for the link. I assume "Last" means some kind of heavy load? It made me realize that the first article in this thread is actually using a GDDR4-based 2600 XT, which has the odd power characteristic of little increase between idle and load; both articles support this. The GDDR3 model doesn't share this characteristic. Even more interestingly, despite the GDDR4 model consuming far more power overall, it barely manages to perform any better than the GDDR3 model!

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Mon Jul 16, 2007 1:27 am

Juventas wrote: I assume "Last" means some kind of heavy load?
Indeed - "Last" is German for "load", and the load test is 3DMark06.

david25
Posts: 89
Joined: Sun Nov 06, 2005 5:02 am

Post by david25 » Thu Jul 19, 2007 1:02 pm

This makes it even more difficult:

http://www.hwupgrade.com/articles/video ... up_15.html

What is going on with the manufacturers?

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Thu Jul 19, 2007 1:30 pm

Quite a bit of variance ... the 8600 GTS is sometimes lower than the GT at idle. The variance is reduced when the CPU is put under load but not the GPU. In that case, the power draw is also only loosely coupled to the idle power draw. Strange.

pputer
Posts: 155
Joined: Wed May 02, 2007 5:05 am

Post by pputer » Fri Jul 27, 2007 3:24 am

It sounds like Nvidia's 8600 GTS and ATI's HD 2600 series are similar in power consumption, but Nvidia uses an 80nm process while ATI's 2600 is 65nm.

I was wondering if you can use a DVI to HDMI adapter for Nvidia and get the HDMI audio or if only the ATI HD 2600 cards offer that feature.

I am comparing these cards for a new system that will be used for video encoding and general all-purpose tasks, but mostly video work. I thought the Nvidia card would be good for extra options, including running Linux, but the HDMI features the ATI card offers are interesting. Can I use the 8600 GTS for the same HTPC features, and is its video encoding just as good as ATI's, or even better?

Edit: I just read this article:
http://www.tomshardware.com/2007/06/08/ ... page6.html

Hmmm... Can anyone tell me how a cable connects the video card to the sound card (or to the motherboard, if it has an HDMI interface port) for sound over HDMI?

I will probably buy a sound card anyway, so the Nvidia card sounds like a decent option. But I'm not sure how you hook up the cable.

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Fri Jul 27, 2007 7:13 am

pputer wrote: Hmmm... Can anyone tell me how a cable connects from the video card to the sound card (or connected to the motherboard if it has an HDMI interface port) for sound over HDMI?
I was under the impression the ATI cards have an audio chip that lets them process audio and output it through HDMI. The Nvidia cards don't have this.

Silentpipe
Posts: 4
Joined: Fri Jul 27, 2007 2:30 pm

HDMI Audio

Post by Silentpipe » Fri Jul 27, 2007 3:07 pm

This is a pretty good overview....
The entire HD 2000 series of cards offers HDMI connectivity with the help of a DVI adapter, and all cards support HDCP. Unlike current HDMI implementations on PCIe graphics cards, the HD 2400, 2600 and 2900 integrate (secondary) audio functionality into the GPU. Instead of passing a PCM or Dolby Digital signal over from onboard audio or a sound card, RV610- and RV630-based graphics cards can output audio directly, removing the need to feed a separate sound source to your HDMI connector. The card outputs that sound over HDMI as 16-bit PCM stereo or as compressed AC3 5.1 multi-channel streams (Dolby Digital and DTS). A pretty sexy feature, especially for those who use their PC as an HTPC and connect it to an HDMI receiver.

By the way, don't be mistaken: the system's S/PDIF output is not tied up by routing it to the graphics card, so your add-in board (your X-Fi or whatever) keeps working. It's a completely secondary process, so you retain full functionality on your primary sound card.

So with the 2000 series you'll receive a DVI-to-HDMI adapter (a board partner option, though) which, make no mistake, will carry sound over HDMI. That's unlike current DVI-HDMI adapters and cables, which do not carry sound. Fantastic if you are watching a Blu-ray movie: simply connect HDMI to your HDTV for PCM sound, or connect it through a Dolby TrueHD-capable receiver and get that sound lovin' going through that receiver of yours. All with one simple cable.


Hope that's interesting. :wink:

SP

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Tue Aug 07, 2007 3:04 am

By the way, the GDDR4 version of the 2600 XT still has defective idle power management. So don't buy it.

source: http://www.computerbase.de/news/hardwar ... d_2600_xt/

EDIT: http://www.behardware.com/articles/675- ... -2400.html
