X-bit labs: Power consumption of modern graphic cards

They make noise, too.


Slaugh
Posts: 774
Joined: Thu Dec 02, 2004 2:27 am
Location: Quebec, Canada

X-bit labs: Power consumption of modern graphic cards

Post by Slaugh » Sun Feb 05, 2006 11:07 pm

Here's another X-bit labs article about power consumption of current video cards:

The Grand Clash for Watts: Power Consumption of Modern Graphics Cards


And the results are... [Images: power consumption charts from the article; images not preserved]


Even though X-bit labs states that the noise level is an important factor, their standards are far from SPCR's. Here's what they say about the premium video cards (this includes the X1900 XTX):
The noise level created by the graphics cards we used is approximately similar: all the graphics boards can barely be heard, especially the GeForce 7800 GTX 512. If these boards are installed into a high-quality computer case, then there will be no problems with noise at all.

EDIT: Added the "mainstream power consumption" chart. Thanks rpsgc! :wink:
Last edited by Slaugh on Tue Feb 14, 2006 1:47 am, edited 1 time in total.

El Doug
Posts: 103
Joined: Sun Jan 01, 2006 6:32 am

Post by El Doug » Sun Feb 05, 2006 11:47 pm

silent, eh? :lol:

on a side note, how is my gtx now not a premium card? geez, new stuff is coming out fast

anaqer
Posts: 44
Joined: Sun Jun 26, 2005 11:31 pm

Post by anaqer » Mon Feb 06, 2006 3:54 am

Not directly related to the test itself, but would somebody happen to know how many amps the universal AGP port can draw from the +12V rail? I've got a Solano-based system that I wish to upgrade with a GF 6200A, but the power supply is rated for a measly 4.2A. The DVD-ROM and the hard drive can draw up to 1.9A, plus there's the load from the motherboard itself and the rest of the system (SB Live24bit, two Realtek GbE NICs and two very slow 80mm fans) so I wonder if the new card would still fit the power budget... :?:

Tephras
Posts: 1140
Joined: Tue Sep 07, 2004 11:03 am
Location: Europe

Post by Tephras » Mon Feb 06, 2006 9:45 am

Max current for the 12V line in the AGP 3.0 specification is 1A.
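With that 1A ceiling, anaqer's budget question can be checked with quick arithmetic. A minimal sketch, using the figures quoted in the two posts above; the 0.5A for the motherboard, NICs and fans is a pure guess, not a measured number:

```python
# Back-of-the-envelope +12V budget check. Only the 4.2A PSU rating,
# the 1.9A drive draw and the 1A AGP 3.0 limit come from the thread;
# the misc draw is an assumed placeholder.
rail_capacity_a = 4.2   # PSU +12V rating
drives_a = 1.9          # DVD-ROM + hard drive, worst case
agp_slot_a = 1.0        # AGP 3.0 spec maximum on +12V
misc_a = 0.5            # motherboard, NICs, two slow 80mm fans (guess)

headroom_a = rail_capacity_a - (drives_a + agp_slot_a + misc_a)
print(f"+12V headroom: {headroom_a:.1f} A")
```

On those assumptions the rail still has headroom, so a low-power card like the 6200A looks feasible; the guessed 0.5A misc figure is the weak link in the estimate.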

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska
Contact:

Post by Aris » Mon Feb 13, 2006 5:11 am

why isn't the x1600xt on there? i always have the worst time trying to find power consumption charts on x-bit's website. personally i think they should have a section just for power consumption of components.

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Mon Feb 13, 2006 5:19 am

Aris wrote:why isn't the x1600xt on there? i always have the worst time trying to find power consumption charts on x-bit's website. personally i think they should have a section just for power consumption of components.
It is. Slaugh here forgot to post the Mainstream chart:

[Image: mainstream power consumption chart]

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska
Contact:

Post by Aris » Mon Feb 13, 2006 5:44 am

i looked up some framerate comparisons with the x800xl, x800gt, x800gto, 6600gt, and x1600xt, all of which are short enough to fit in smaller enclosures, which matters to me. out of all of them, the x800xl seems to come out on top.

now i just need to see if the upcoming 7600 from nvidia can beat the x800xl without consuming more than 50 watts. let's cross our fingers, i'd personally prefer to stick with nvidia.

EDIT: oh yeah, and be competitive price-wise with the x800xl, which can be had for under 200 bucks now.

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Mon Feb 13, 2006 6:43 am

Rumour

The 7600GS is 10-15% faster than the 6600GT.
The 7600GT scores equivalent to the X1600XT in 3DMark2005. Expected to be better in games.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Mon Feb 13, 2006 7:24 am

Even if it comes with just 3+8 (enhanced) pipes and 128-bit memory, if the clocks are 700 / 1400, the 7600GT should be ~50% faster than the 6600GT. The vanilla 7600 could be just 10% faster than the 6600GT, and easy to cool passively.

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Mon Feb 13, 2006 8:21 am

Tzupy wrote:Even if it comes with just 3+8 (enhanced) pipes and 128-bit memory, if the clocks are 700 / 1400, the 7600GT should be ~50% faster than the 6600GT. The vanilla 7600 could be just 10% faster than the 6600GT, and easy to cool passively.
7600GS: 8 pipelines + 3VS; 500/1GHz
7600GT: 12 pipelines + 5VS; 500/1GHz

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Mon Feb 13, 2006 8:23 am

x800XL does normally win the cost to performance to wattage contest.

from there it is the 7800GT.

an x800XL uses less power than a 9800 Pro, which can almost run passive, but not quite (i know, i have the 9800 pro)

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska
Contact:

Post by Aris » Mon Feb 13, 2006 8:29 am

yeah, i had a 9800pro before my 6600gt, and i could run it passive in 2d all the time, but after about 30min of 3d games the system would lock without some sort of active cooling on it.

i may end up going with the x800xl just because of cost, unless the 7600gt totally blows it out of the water in power consumption and framerates, because i highly doubt it will debut at under 200 bucks.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Mon Feb 13, 2006 10:30 am

7600GS: 8 pipelines + 3VS; 500/1GHz
7600GT: 12 pipelines + 5VS; 500/1GHz

I have seen those specs before, but I have doubts about them. The 90 nm low-k process is supposed to easily reach 700 MHz for the core. And since the memory stays 128-bit, just 1 GHz for the 7600GT would be crippling. I find it hard to believe that nVidia would make two designs, 5+12 and 3+8; if the 7600GS had defective/disabled pipes, that would mean a poor process implementation by nVidia (and the foundry).
On the other hand, 5+12 pipes @ 500 MHz could be achieved with low voltage, so the 7600GT would be easy to cool passively.
Well, in one month all this speculation will be pointless. Come on nVidia, hurry up your lazy a**es, we are waiting for the new cards!

Beyonder
Posts: 757
Joined: Wed Sep 11, 2002 11:56 pm
Location: EARTH.

Post by Beyonder » Mon Feb 13, 2006 2:08 pm

It'd be nice if they'd normalize those values and show them in a single graph, rather than having varying x-axis values.

It'd be even nicer if they could then divide some generalized performance metric (for example, a composite score of a group of benchmarks) by wattage to show which card gives the most "bang" per watt. But I suppose beggars can't be choosers. Were I not so lazy, I'd do it myself in Excel.

narrasuj
Posts: 51
Joined: Wed Aug 10, 2005 7:57 pm
Location: United States
Contact:

Good overall performance indicator?

Post by narrasuj » Mon Feb 13, 2006 3:03 pm

I was thinking of how best to quantify performance, cost and heat output all in one number, just for fun. So to make sure I'm making sense, I thought I'd post it up on here and maybe try to find some numbers if I've got a chance to: 3DMark score (whichever version I can find for the diff cards I'll be looking at) over (cost multiplied by heat production in watts). So it'd be something like: score/($ x W). If that works, what cards would be interesting to see? If there's too many suggestions, I'll just take the first 5 or so.
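narrasuj's score/($ x W) figure of merit is trivial to compute. A minimal Python sketch; the example score, price and wattage are made-up numbers for illustration only:

```python
def merit(score: float, price: float, watts: float) -> float:
    """narrasuj's figure of merit: benchmark score / (price * wattage)."""
    if price <= 0 or watts <= 0:
        raise ValueError("price and wattage must be positive")
    return score / (price * watts)

# hypothetical example numbers, for illustration only
print(merit(score=5000, price=200.0, watts=50.0))  # 5000 / (200 * 50) = 0.5
```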

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska
Contact:

Re: Good overall performance indicator?

Post by Aris » Mon Feb 13, 2006 4:35 pm

narrasuj wrote:I was thinking of how best to quantify performance, cost and heat output all in one number, just for fun. So to make sure I'm making sense, I thought I'd post it up on here and maybe try to find some numbers if I've got a chance to: 3DMark score (whichever version I can find for the diff cards I'll be looking at) over (cost multiplied by heat production in watts). So it'd be something like: score/($ x W). If that works, what cards would be interesting to see? If there's too many suggestions, I'll just take the first 5 or so.
you would need a way to "weight" each value depending on the individual. some people want the best framerate per watt and will pay anything to get it, and to those people cost wouldn't mean as much in the equation.

psiu
Posts: 1201
Joined: Tue Aug 23, 2005 1:53 pm
Location: SE MI

Post by psiu » Mon Feb 13, 2006 5:09 pm

Obviously my X800XT Platinum Edition (ooooohhhaaaahhhh) is the best card out there.

Got it on clearance from ATI with 3 year warranty...and it's the last one my wife will let me buy for 5 years :lol: :cry: :oops: :? :D

Beyonder
Posts: 757
Joined: Wed Sep 11, 2002 11:56 pm
Location: EARTH.

Re: Good overall performance indicator?

Post by Beyonder » Mon Feb 13, 2006 5:38 pm

Aris wrote:
narrasuj wrote:I was thinking of how best to quantify performance, cost and heat output all in one number, just for fun. So to make sure I'm making sense, I thought I'd post it up on here and maybe try to find some numbers if I've got a chance to: 3DMark score (whichever version I can find for the diff cards I'll be looking at) over (cost multiplied by heat production in watts). So it'd be something like: score/($ x W). If that works, what cards would be interesting to see? If there's too many suggestions, I'll just take the first 5 or so.
you would need a way to "weight" each value depending on the individual. some people want the best framerate per watt and will pay anything to get it, and to those people cost wouldn't mean as much in the equation.
something like this?

Code:

// dividing by zero is bad
if (CostFactor == 0 || WattFactor == 0)
    return;

// the two weights' magnitudes must add up to one
// (compare with a tolerance, since these are floats)
if (fabs(fabs(CostFactor) + fabs(WattFactor) - 1.0) > 1e-9)
    return;

OverallScore = BenchmarkCompositeScore
             / ((fabs(CostFactor) * cost) * (fabs(WattFactor) * wattage));
...and I guess the units would be Performance/Watt-Dollars? P/WD? :D Or if you had someone who didn't care about cost/heat, it'd be P/W or P/D. I only care about P/W, really.

Beyonder
Posts: 757
Joined: Wed Sep 11, 2002 11:56 pm
Location: EARTH.

Post by Beyonder » Mon Feb 13, 2006 5:40 pm

pardon the semicolons, force of habit

narrasuj
Posts: 51
Joined: Wed Aug 10, 2005 7:57 pm
Location: United States
Contact:

Post by narrasuj » Mon Feb 13, 2006 7:02 pm

From what I can understand of that code, it's a sort of percentage multiplier for wattage and cost? If so, why not have a percentage for all three factors, so a person could rank which of the three are important to them and assign percentage values accordingly. If something like this were implemented in some reasonable fashion, I think it could be applied to processors as well, no?
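One way to realize the per-factor percentages narrasuj describes is to use the weights as exponents on price and wattage; this is just one possible interpretation, not anything agreed on in the thread, and the numbers in the example are placeholders:

```python
def weighted_merit(score: float, price: float, watts: float,
                   w_price: float = 0.5, w_watts: float = 0.5) -> float:
    """Score divided by price and wattage, each raised to a user-chosen
    weight in [0, 1]. A weight of 0 drops that factor entirely
    (x ** 0 == 1); a weight of 1 counts it fully."""
    return score / (price ** w_price * watts ** w_watts)

# someone who only cares about performance per watt:
pw_only = weighted_merit(5000, price=200.0, watts=50.0, w_price=0.0, w_watts=1.0)
assert pw_only == 5000 / 50.0  # reduces to plain score per watt
```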

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska
Contact:

Post by Aris » Tue Feb 14, 2006 8:05 am

another thing to maybe add to it is a way to set cutoffs, like if someone wants all cards between $100 and $200, or all cards under 50 watts, and then have it list them within those parameters weighted by their preferences. and then have the output be sortable by the different values so you can easily look at it from several angles.

prices may be difficult as they always change. so rather than real-world prices you could just put in the initial MSRP from either ati or nvidia for that card.

also, give the option to search by only nvidia cards, only ati cards, or both.

then throw it all up on a clean, easy-to-read website so we can plug in the numbers and see what comes out :)
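The cutoffs, vendor filter and sortable output described above fit in a few lines. A sketch with placeholder card data; the names, MSRPs, wattages and scores below are invented, not real measurements:

```python
# Sketch of the filter-and-sort tool described above.
# The card entries are placeholders, not real MSRPs, wattages, or scores.
cards = [
    {"name": "card A", "vendor": "nvidia", "msrp": 199, "watts": 48, "score": 3500},
    {"name": "card B", "vendor": "ati",    "msrp": 249, "watts": 55, "score": 4200},
    {"name": "card C", "vendor": "nvidia", "msrp": 149, "watts": 35, "score": 2800},
]

def pick(cards, vendor=None, min_price=None, max_price=None, max_watts=None,
         sort_key="score"):
    """Apply the optional cutoffs, then sort best-first on sort_key."""
    hits = [c for c in cards
            if (vendor is None or c["vendor"] == vendor)
            and (min_price is None or c["msrp"] >= min_price)
            and (max_price is None or c["msrp"] <= max_price)
            and (max_watts is None or c["watts"] <= max_watts)]
    return sorted(hits, key=lambda c: c[sort_key], reverse=True)

# e.g. all cards between $100 and $200 that stay under 50 W:
for c in pick(cards, min_price=100, max_price=200, max_watts=50):
    print(c["name"], c["score"])
```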

Le_Gritche
Posts: 140
Joined: Wed Jan 18, 2006 4:57 am
Location: France, Lyon

Post by Le_Gritche » Tue Feb 14, 2006 10:58 am

You have to put in the graph all the info needed by the reader. If price/power/performance are all of use, then you have to include all three.
3D graphs are quite hard to read, especially when you deal with discrete points rather than continuous areas (like altitude on a 3D geographical map that you can draw with different colours).
On a 2D graph, if you keep the 3 parameters you have to weight them. The example 'score/($ x W)' suggested gives the same importance to price and power. Some people favour price, others power, so that's not a solution.
Narrasuj suggested a dynamic graph where the reader can weight the 3 parameters himself. That's significantly harder to implement on the web than a simple jpeg graph.
A solution would be to put only 2 parameters in the graph, and keep the third out of it.
Aris wrote:prices may be difficult as they always change. so rather than real-world prices you could just put in the initial MSRP from either ati or nvidia for that card.
Good point, I remember seeing a graph of processors rated according to their price and performance. The graph was one year old and was sadly totally useless, as the Athlon, Duron, Pentium and Celeron all got different price cuts during that time.
So putting prices in the graph would require regular updates, or it will become inaccurate after 6 months or so.

Performance and power consumption being the only 2 numbers that stay constant over the life of a graphics card, they are probably the easiest ones to put in the graph. You can then list under the graph the lowest price of each card, hotlinked from a price-grabbing website.

But I keep writing and realize I have absolutely no will to do it myself, has anyone volunteered yet or do we have to choose a volunteer? :lol:

Beyonder
Posts: 757
Joined: Wed Sep 11, 2002 11:56 pm
Location: EARTH.

Post by Beyonder » Wed Feb 15, 2006 8:14 am

Le_Gritche wrote:You have to put in the graph all the info needed by the reader. If price/power/performance are all of use, then you have to include all three.
No, you only need the final output. Those three are all inputs; the only thing you'd be graphing is the composite score. So a regular bar chart would work fine.
Le_Gritche wrote:Good point, I remember seeing a graph of processors rated according to their price and performance. The graph was one year old and was sadly totally useless, as the Athlon, Duron, Pentium and Celeron all got different price cuts during that time.
This is also a good point: price isn't a constant, but wattage definitely is, and performance (for the most part) doesn't change much.
Le_Gritche wrote:But I keep writing and realize I have absolutely no will to do it myself, has anyone volunteered yet or do we have to choose a volunteer? :lol:
the buck does stop there, doesn't it? heh. :D

Hyphe
Posts: 62
Joined: Tue Jun 21, 2005 12:45 pm
Location: Sweden

Post by Hyphe » Wed Feb 15, 2006 8:36 am

Here's an overview chart of most of the numbers from X-bit labs (Both from the latest article and some cards from the earlier articles, sources can be found in the Excel-file):

[Image: power consumption overview chart]

Here's the Excel file if you want to play with the figures:

http://www.wnd.se/htpc/Power-consumptio ... -cards.xls

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska
Contact:

Post by Aris » Wed Feb 15, 2006 10:23 am

just realized the fastest cards i can fit in my system are a 6600gt or x1600xt. the x800xl is about 1/10" too long.

i just find it so hard to believe that the exact same card i bought over a year ago is still my best option from nvidia now. nvidia needs to hurry up and release the 7600gt. the 6600gt is a good card, but there's no way i'm going to build my next "upgrade" system and put the exact same video card in it.

Beyonder
Posts: 757
Joined: Wed Sep 11, 2002 11:56 pm
Location: EARTH.

Post by Beyonder » Wed Feb 15, 2006 2:13 pm

[Image: performance-per-watt chart]

Basically, I went through xbitlabs' video card reviews and yanked out the 3dmark05 scores. They all happened to be done on the exact same machine (some athlon64 4000+ rig), so it's pretty much apples to apples, assuming xbitlabs is consistent (they seem pretty "on the ball", so I'm assuming this is the case). The xls file has hyperlinks to all the reviews that were referenced. I wouldn't mind some doublechecking; it was rather confusing to find and reference all of the values.

There were two video cards I couldn't find scores for.

The clear winner is the S3 card, which pains me a bit because I despise S3. Their driver and compatibility issues of days past are enough to leave a bad taste in anyone's mouth, but perhaps it's time to let it go and give it another try. Two of the ATI cards also had very good performance to wattage ratios (x1800 xl and x1600 xt), so those might be good candidates for those of us who aren't S3 fans. :P

For NVidia, the highest P/W cards were the 7800 GT and the 6600 DDR2, although neither had a very impressive showing next to the S3/ATI cards.

This is all with a synthetic benchmark, so it may very well be that real gaming performance yields quite different results. Maybe it should have FarCry/Doom3 numbers as well...

Modified excel file
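The ranking behind a chart like this is just score divided by wattage, sorted. A sketch with invented numbers (placeholders, not X-bit labs measurements):

```python
# Performance-per-watt ranking like the chart in this post.
# The scores and wattages below are made-up placeholders.
results = {
    "card X": (7000, 65.0),   # (3DMark05 score, watts)
    "card Y": (5200, 40.0),
    "card Z": (8100, 95.0),
}

ranked = sorted(((score / watts, name) for name, (score, watts) in results.items()),
                reverse=True)
for ppw, name in ranked:
    print(f"{name}: {ppw:.1f} 3DMark05 points per watt")
```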

Hyphe
Posts: 62
Joined: Tue Jun 21, 2005 12:45 pm
Location: Sweden

Post by Hyphe » Wed Feb 15, 2006 2:24 pm

Updated with some performance-per-watt figures and a chart. Sources can be found in the Excel file, with comments on figures that deviate from the power consumption tests and benchmarks. I'm not sure if I calculated everything correctly, so give me some input if you find anything wrong:

[Image: updated performance-per-watt chart]

The Excel file is updated, feel free to develop it, check the earlier link.

Beyonder
Posts: 757
Joined: Wed Sep 11, 2002 11:56 pm
Location: EARTH.

Post by Beyonder » Wed Feb 15, 2006 2:28 pm

wow, surprising difference between 05 and 06. Thanks

mattthemuppet
*Lifetime Patron*
Posts: 618
Joined: Mon May 23, 2005 7:05 am
Location: State College, PA

Post by mattthemuppet » Wed Feb 15, 2006 3:15 pm

impressive work guys - I was idly considering doing something similar so you've saved me the effort! Looks like a 6800 AGP is the best card for me (Skt A board) - it should comfortably whip the pants off my existing FX5200 :)

frostedflakes
Posts: 1608
Joined: Tue Jan 04, 2005 4:02 pm
Location: United States

Post by frostedflakes » Wed Feb 15, 2006 5:03 pm

Two very informative charts, nice work guys! You should definitely post these in the VGA card power dissipation sticky.
