power consumption: 2600 Pro/XT and 8600 GT/GTS
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
These cards are obviously marketed as direct competitors. They benchmark in the same ballpark for DX10, but I've been unable to find a direct comparison for power consumption. The best article I've found so far compares the 2600 XT to seemingly every card except its direct competitors, go figure.
http://www.techpowerup.com/reviews/ATI/HD_2600_XT/17
Update: Found another article with everything but the 2600 Pro. Generally it seems they're close enough that there's probably a larger difference between card manufacturers and RAM types than there is between these GPUs. kater's Hardspell article would seem to concur.
http://www.behardware.com/articles/675- ... -2400.html
Last edited by Juventas on Mon Jul 16, 2007 12:15 am, edited 1 time in total.
Yeah, and it did it with total system power draw, so it's completely useless data if you find an 8600 somewhere else with that information. I hate when reviewers do that.
I'm not happy with the 8600s at all. They're basically just DX10 7600s that draw 10 more watts. What a bunch of junk. Hopefully Nvidia turns it around and does it right on the next-gen parts.
According to Xbit Labs, the 8600 GTS goes up to 47W under load and idles at 21W - not bad at all. Techpowerup says a system with a 2600 XT needs just as many watts as a system with an X1950XTX, and the X1950XTX needs, according to Xbit, 33W in idle mode. Techpowerup also says a system with a 2600 XT takes 7W less than a system with an X1800GTO, and the X1800GTO needs 48W at full load. So either my calculations don't add up (I wouldn't be surprised given my school record, especially with regard to algebra), or the 2600 XT is one strange card, idling at 33W and topping out at some 41W. Here's where I take the Xbit data from - linky. Also, check out this article on Hardspell - linky - for some reason the site opens once in 10 times and shows pics very unwillingly. Maybe you'll have more luck ;) I was only able to open the standby power consumption graph.
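kater's back-of-envelope arithmetic above can be written out explicitly. This is only a sketch: the wattages are the ones quoted from Xbit Labs and Techpowerup in the post, and attributing the whole system-level difference to the graphics card is the rough assumption being made.

```python
# Rough estimate of the 2600 XT's card-only power draw, derived from
# the figures quoted above. The only thing added here is the
# assumption that a difference in total system draw on the same
# testbed can be credited entirely to the graphics card.

X1950XTX_IDLE_W = 33   # Xbit Labs, card-only idle measurement
X1800GTO_LOAD_W = 48   # Xbit Labs, card-only full-load measurement
SYSTEM_DELTA_W = 7     # Techpowerup: 2600 XT system draws 7 W less than X1800GTO system

# Techpowerup: a 2600 XT system idles at the same wattage as an
# X1950XTX system, so the cards themselves should be roughly equal.
hd2600xt_idle = X1950XTX_IDLE_W

# Same testbed, so the 7 W system-level difference goes to the card.
hd2600xt_load = X1800GTO_LOAD_W - SYSTEM_DELTA_W

print(hd2600xt_idle, hd2600xt_load)  # prints: 33 41
```

Which reproduces the strange result in the post: an idle figure of 33 W against a load figure of only about 41 W.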
As long as the testbed is the same, it's fine with me. It's actually more accurate to compare total system power, since the video cards may be offloading processing to the CPU differently, or using system RAM in different ways.Aris wrote:yeah, and it did it with total system power draw, so its completely useless data if you find an 8600 someplace else with that information. i hate when reviewers do that.
It's true that if you're not interested in DX10 features or HD video, there's little reason to bother with these cards. You get better bang per watt (and your buck) with most of the previous series.Aris wrote:im not happy with the 8600's at all. their basically just DX10 7600's that draw 10 more watts. what a bunch of junk. hopefully nvidia turns it around and does it right on the next gen parts.
I have noticed that the 2600s differ little between idle and load compared to other cards. Unfortunately, the idle figures aren't very low for its class.kater wrote:or 2600XT is one strange card, idling at 33W and topping at some 41W.
Same here. But it is something. Also, the X-bit link doesn't go to a specific article for me.kater wrote: - for some reason the site opens once in 10 times and shows pics v unwillingly. Maybe you'll have more luck ;) I was only able to open the standby pwr cons graph.
Here you are:
http://www.computerbase.de/artikel/hard ... gsaufnahme
Whole system, (3) = GDDR3, (4) = GDDR4. Voltage/frequency reduction is defective on the GDDR4 version.
Yeah, it's because I used several of them - each gives you power consumption figures for various cards. Just click away and look for sections marked "power consumption" etc.Juventas wrote:Also, the X-bit link doesn't go to a specific article for me.
So it seems 2600XT is indeed a strange one in terms of pwr. Hmmm.
Call me dense, but I'm still not finding them. :p Even using their search box, the phrase "power consumption" only came up in video articles up to 2006.kater wrote:Yeah, it's because I used several of them - each gives you pwr consumption figures for various cards. Just click away and look for sections marked as "power consumption" etc.Juventas wrote:Also, the X-bit link doesn't go to a specific article for me.
Link 1
Link 2
Link 3
When I search their site I usually put the name of the gfx card and "power consumption" and I get some hits, then I narrow it down to "Video articles".
Just to clarify - I didn't mean Xbit provided any power numbers for the 2600 XT. I just wanted to explain how I arrived at those figures: I took the power consumption of other cards like the X1800GTO, and compared the card-only measurements from Xbit against the total system power measured by Techpowerup. Sorry if I've confused anyone. Happens to me every so often... Ah, us geniuses have it hard - nobody understands us
Thanks for the link. I assume "Last" means some kind of heavy load? It made me realize that the first article in this thread is actually using a GDDR4-based 2600 XT, which has the odd power characteristic of little increase between idle and load; both articles support this. The GDDR3 model doesn't show this behaviour. Even more interesting: despite the GDDR4 model consuming far more power overall, it barely manages to perform any better than the GDDR3 model!jojo4u wrote:Here you are:
http://www.computerbase.de/artikel/hard ... gsaufnahme
Whole system, (3) = GDDR3, (4) = GDDR4. Voltage/frequency reduction is defective on the GDDR4 version.
This makes it even more difficult:
http://www.hwupgrade.com/articles/video ... up_15.html
What is going on with the manufacturers?
Quite a bit of variance ... the 8600 GTS is sometimes lower than the GT at idle. The variance is reduced when the CPU is put under load but not the GPU. In that case, the power draw is also only loosely coupled to the idle power draw. Strange.david25 wrote: http://www.hwupgrade.com/articles/video ... up_15.html
It sounds like Nvidia's 8600 GTS and ATI's HD 2600 series are similar in power consumption, but Nvidia uses an 80nm process while ATI's 2600 is 65nm.
I was wondering if you can use a DVI to HDMI adapter for Nvidia and get the HDMI audio or if only the ATI HD 2600 cards offer that feature.
I am comparing these cards for a new system used for video encoding and general-purpose tasks, but mostly video work. I thought the Nvidia card would be good for extra choices, including installing Linux, but the HDMI features the ATI card offers are interesting. But can I use the 8600 GTS for the same HTPC features, and is the video encoding just as good or even better than ATI's?
Edit: I just read this article:
http://www.tomshardware.com/2007/06/08/ ... page6.html
Hmmm... Can anyone tell me how a cable connects from the video card to the sound card (or connected to the motherboard if it has an HDMI interface port) for sound over HDMI?
I will probably buy a sound card anyway and therefore, the Nvidia card sounds like a decent option. But, I'm not sure how you hook up the cable.
I was under the impression the Ati cards had an audio chip that let them process audio and output it through the HDMI. The nvidia cards don't have this.pputer wrote: Hmmm... Can anyone tell me how a cable connects from the video card to the sound card (or connected to the motherboard if it has an HDMI interface port) for sound over HDMI?
HDMI Audio
This is a pretty good overview....
The entire HD 2000 series of cards offers HDMI connectivity with the help of a DVI adapter, and all cards support HDCP. Unlike current HDMI implementations on PCIe graphics cards, the HD 2400, 2600 and 2900 integrate (secondary) audio functionality into the GPU. Instead of passing a PCM or Dolby Digital signal from onboard audio or a sound card, RV610 and RV630-based graphics cards can output audio directly, removing the need to route a separate sound card through your HDMI connector. The card sends the audio straight to the HDMI connector, where it outputs 16-bit PCM stereo, or compressed AC3 5.1 multi-channel streams such as Dolby Digital and DTS. A pretty sexy feature, especially for those who use their PC as an HTPC and connect it over HDMI to an HDMI receiver.
By the way, make no mistake: the system S/PDIF output of your add-in board (your X-Fi or whatever) is not tied up by routing audio to the graphics card. It's a completely separate, secondary process, so you keep full functionality on your primary sound card.
So with the Series 2000 you'll receive a DVI-to-HDMI adapter (a board partner option, though) which, make no mistake here, will carry sound over HDMI. That's unlike current DVI-HDMI adapters and cables, which do not carry sound. Fantastic if you're watching a Blu-ray movie: simply connect HDMI to your HDTV for PCM sound, or connect through a TrueHD/Dolby HD receiver and get that sound lovin' going on through that receiver of yours. All with one simple cable.
Hope that's interesting.
SP
Btw, the GDDR4 version of the 2600 XT still has defective idle management. So don't buy it.
source: http://www.computerbase.de/news/hardwar ... d_2600_xt/
EDIT: http://www.behardware.com/articles/675- ... -2400.html