
All times are UTC - 8 hours




Post new topic Reply to topic  [ 72 posts ]  Go to page 1, 2, 3  Next
Author Message
 Post subject: X-bit labs: Power consumption of modern graphic cards
PostPosted: Sun Feb 05, 2006 11:07 pm 
Offline

Joined: Thu Dec 02, 2004 2:27 am
Posts: 774
Location: Quebec, Canada
Here's another X-bit labs article about power consumption of current video cards:

The Grand Clash for Watts: Power Consumption of Modern Graphics Cards


And the results are... Image


Image Image
Image Image
Image


Even though X-bit labs states that noise level is an important factor, their standards are far from SPCR's. Here's what they say about the premium video cards (this includes the X1900 XTX):

Quote:
The noise level created by the graphics cards we used is approximately similar: all the graphics boards can barely be heard, especially the GeForce 7800 GTX 512. If these boards are installed into a high-quality computer case, then there will be no problems with noise at all.



EDIT: Added the "mainstream power consumption" chart. Thanks rpsgc! :wink:

_________________
Zalman LQ1000 liquid cooled case + dampening material + PVC tubes + MCP350 with EK top (acetal) and decoupling kit + Nexus 120mm + Xigmatek 200mm + HDD cage removed | Asus Maximus III Formula (ROG) | Intel Core I7 870 + EK Supreme (acetal) | 8GB Kingston | AMD Radeon HD 5850 1GB O/C + EK full cover block (acetal) | Western Digital Black 1TB + NoVibes III + Nexus 120mm (5V) | Seasonic X-850


Last edited by Slaugh on Tue Feb 14, 2006 1:47 am, edited 1 time in total.

 Post subject:
PostPosted: Sun Feb 05, 2006 11:47 pm 
Offline

Joined: Sun Jan 01, 2006 6:32 am
Posts: 103
silent, eh? :lol:

on a side note, how is my GTX now not a premium card? geez, new stuff is coming out fast


 Post subject:
PostPosted: Mon Feb 06, 2006 3:54 am 
Offline

Joined: Sun Jun 26, 2005 11:31 pm
Posts: 44
Not directly related to the test itself, but would somebody happen to know how many amps the universal AGP port can draw from the +12V rail? I've got a Solano-based system that I wish to upgrade with a GF 6200A, but the power supply is rated for a measly 4.2A. The DVD-ROM and the hard drive can draw up to 1.9A, plus there's the load from the motherboard itself and the rest of the system (SB Live24bit, two Realtek GbE NICs and two very slow 80mm fans) so I wonder if the new card would still fit the power budget... :?:


 Post subject:
PostPosted: Mon Feb 06, 2006 9:45 am 
Offline

Joined: Tue Sep 07, 2004 11:03 am
Posts: 1140
Location: Europe
Max current for the 12V line in the AGP 3.0 specification is 1A.
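For the budgeting question above, a quick back-of-the-envelope check. The 4.2A rail rating and 1.9A drive draw come from the post, and the 1A AGP limit from the answer; the 0.5A figure for the board, NICs and fans is a guessed placeholder, not a measurement:

```python
# Rough +12V budget check for the Solano system described above.
# All current figures in amps; misc_estimate_a is an assumed value.
rail_capacity_a = 4.2   # PSU +12V rating (from the post)
drives_a        = 1.9   # DVD-ROM + hard drive, worst case (from the post)
agp_card_a      = 1.0   # AGP 3.0 spec maximum on +12V (from the reply)
misc_estimate_a = 0.5   # fans, NICs, board -- guessed, not measured

headroom_a = rail_capacity_a - (drives_a + agp_card_a + misc_estimate_a)
print(f"+12V headroom: {headroom_a:.1f} A")
```

With the guessed 0.5A for the rest of the system, the budget just about closes, but it leaves little margin if the drives spin up at the same time.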


 Post subject:
PostPosted: Mon Feb 13, 2006 5:11 am 
Offline

Joined: Mon Dec 15, 2003 10:29 am
Posts: 2299
Location: Bellevue, Nebraska
why isn't the x1600xt on there? i always have the worst time trying to find power consumption charts on X-bit labs' website. personally i think they should have a section just for power consumption of components.


 Post subject:
PostPosted: Mon Feb 13, 2006 5:19 am 
Offline
Friend of SPCR

Joined: Tue Oct 05, 2004 1:59 am
Posts: 1630
Location: Portugal
Aris wrote:
why isn't the x1600xt on there? i always have the worst time trying to find power consumption charts on X-bit labs' website. personally i think they should have a section just for power consumption of components.


It is. Slaugh here forgot to post the Mainstream chart:

Image

_________________
Fractal Define R4 | Corsair AX750 | MSI Z97 Gaming 5 | Intel Core i7 4770K w/TRUE 120 Rev. C | 16GB G.Skill Sniper DDR3-1866 | Sapphire Nitro+ RX 480 8GB OC | Crucial m4 128GB + Crucial MX100 512GB + WD Blue 1TB + WD Red 4TB | JVC HA-RX900 | Dell U2312HM + BenQ G2400WD | Asus Echelon Mechanical Keyboard | Logitech G400s


 Post subject:
PostPosted: Mon Feb 13, 2006 5:44 am 
Offline

Joined: Mon Dec 15, 2003 10:29 am
Posts: 2299
Location: Bellevue, Nebraska
i looked up some framerate comparisons with the x800xl, x800gt, x800gto, 6600gt, and x1600xt, all of which are short enough to fit in smaller enclosures, which matters to me. out of all of them, the x800xl seems to come out on top.

now i just need to see if the upcoming 7600 from nvidia can beat the x800xl without consuming more than 50 watts. let's cross our fingers, i'd personally prefer to stick with nvidia.

EDIT: oh yeah, and be competitive price-wise with the x800xl. which can be had for under 200 bucks now.


 Post subject:
PostPosted: Mon Feb 13, 2006 6:43 am 
Offline
Friend of SPCR

Joined: Tue Oct 05, 2004 1:59 am
Posts: 1630
Location: Portugal
Rumour

The 7600GS is 10-15% faster than the 6600GT.
The 7600GT scores equivalent to the X1600XT in 3DMark2005. Expected to be better in games.



 Post subject:
PostPosted: Mon Feb 13, 2006 7:24 am 
Offline
*Lifetime Patron*

Joined: Wed Jan 12, 2005 10:47 am
Posts: 1554
Location: Bucharest, Romania
Even if it comes with just 3+8 (enhanced) pipes and 128-bit memory, if the clocks are 700 / 1400, the 7600GT should be ~50% faster than the 6600GT. The vanilla 7600 could be just 10% faster than the 6600GT, and easy to cool passively.


 Post subject:
PostPosted: Mon Feb 13, 2006 8:21 am 
Offline
Friend of SPCR

Joined: Tue Oct 05, 2004 1:59 am
Posts: 1630
Location: Portugal
Tzupy wrote:
Even if it comes with just 3+8 (enhanced) pipes and 128-bit memory, if the clocks are 700 / 1400, the 7600GT should be ~50% faster than the 6600GT. The vanilla 7600 could be just 10% faster than the 6600GT, and easy to cool passively.


7600GS: 8 pipelines + 3VS; 500/1GHz
7600GT: 12 pipelines + 5VS; 500/1GHz



 Post subject:
PostPosted: Mon Feb 13, 2006 8:23 am 
Offline
Friend of SPCR

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2887
Location: New York City zzzz
the x800XL does normally win the cost-to-performance-to-wattage contest.

from there it is the 7800GT.

an x800XL uses less power than a 9800 Pro, which can almost run passive, but not quite. (i know, I have the 9800 Pro)


 Post subject:
PostPosted: Mon Feb 13, 2006 8:29 am 
Offline

Joined: Mon Dec 15, 2003 10:29 am
Posts: 2299
Location: Bellevue, Nebraska
yeah, i had a 9800pro before my 6600gt, and i could run it passive in 2d all the time, but after about 30 min of 3d games the system would lock without some sort of active cooling on it.

i may end up going with the x800xl just because of cost, unless the 7600gt totally blows it out of the water in power consumption and framerates. because i highly doubt it will debut at under 200 bucks


 Post subject:
PostPosted: Mon Feb 13, 2006 10:30 am 
Offline
*Lifetime Patron*

Joined: Wed Jan 12, 2005 10:47 am
Posts: 1554
Location: Bucharest, Romania
7600GS: 8 pipelines + 3VS; 500/1GHz
7600GT: 12 pipelines + 5VS; 500/1GHz

I have seen those specs before, but have doubts about them. The 90 nm low-k process is supposed to easily reach 700 MHz for the core. And since the memory stays 128-bit, just 1 GHz for the 7600GT would be crippling. I find it hard to believe that nVidia would make two designs, 5+12 and 3+8. If the 7600GS has defective / disabled pipes, that would mean a poor process implementation by nVidia (and the foundry).
On the other hand, 5+12 pipes @ 500 MHz could be achieved with low voltage, so the 7600GT would be easy to cool passively.
Well, in one month all this speculation will be pointless. Come on nVidia, hurry up your lazy a**es, we are waiting for the new cards!


 Post subject:
PostPosted: Mon Feb 13, 2006 2:08 pm 
Offline

Joined: Wed Sep 11, 2002 11:56 pm
Posts: 757
Location: EARTH.
It'd be nice if they'd normalize those values and show them in a single graph, rather than having varying x-axis scales.

It'd be even nicer if they could then divide some generalized performance metric (for example, a composite score from a group of benchmarks) by wattage, to show which card gives the most "bang" per watt. But I suppose beggars can't be choosers. Were I not so lazy, I'd do it myself in Excel.

_________________
Fun: GOLDFISHY
Work: Video Surveillance


 Post subject: Good overall performance indicator?
PostPosted: Mon Feb 13, 2006 3:03 pm 
Offline

Joined: Wed Aug 10, 2005 7:57 pm
Posts: 51
Location: United States
I was thinking of how best to quantify performance, cost and heat output all in one number, just for fun. So to make sure I'm making sense, I thought I'd post it up on here and maybe try to find some numbers if I've got a chance to: 3DMark score (whichever version I can find for the diff cards I'll be looking at) over (cost multiplied by heat production in watts). So it'd be something like: score/($ x W). If that works, what cards would be interesting to see? If there's too many suggestions, I'll just take the first 5 or so.
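The proposed score/($ x W) metric is easy to sketch. The card names, scores, prices and wattages below are made-up placeholders, not real benchmark or pricing data:

```python
# Sketch of the proposed metric: 3DMark score / (cost * watts).
# All figures are invented placeholders for illustration only.
cards = {
    "card_a": {"score": 5000, "cost": 200.0, "watts": 50.0},
    "card_b": {"score": 7000, "cost": 400.0, "watts": 100.0},
}

def overall(card):
    # higher is better: more benchmark points per dollar-watt
    return card["score"] / (card["cost"] * card["watts"])

# rank cards best-first by the composite metric
for name, c in sorted(cards.items(), key=lambda kv: overall(kv[1]), reverse=True):
    print(name, round(overall(c), 3))
```

Note how the cheaper, cooler card wins here even with a lower raw score, which is exactly the behaviour the metric is after.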


 Post subject: Re: Good overall performance indicator?
PostPosted: Mon Feb 13, 2006 4:35 pm 
Offline

Joined: Mon Dec 15, 2003 10:29 am
Posts: 2299
Location: Bellevue, Nebraska
narrasuj wrote:
I was thinking of how best to quantify performance, cost and heat output all in one number, just for fun. So to make sure I'm making sense, I thought I'd post it up on here and maybe try to find some numbers if I've got a chance to: 3DMark score (whichever version I can find for the diff cards I'll be looking at) over (cost multiplied by heat production in watts). So it'd be something like: score/($ x W). If that works, what cards would be interesting to see? If there's too many suggestions, I'll just take the first 5 or so.


you would need a way to "weight" each value depending on the individual. some people want the best framerate per watt and will pay anything to get it, and to those people the cost wouldn't mean so much in the equation.


 Post subject:
PostPosted: Mon Feb 13, 2006 5:09 pm 
Offline

Joined: Tue Aug 23, 2005 1:53 pm
Posts: 1201
Location: Plymouth, MI
Obviously my X800XT Platinum Edition (ooooohhhaaaahhhh) is the best card out there.

Got it on clearance from ATI with 3 year warranty...and it's the last one my wife will let me buy for 5 years :lol: :cry: :oops: :? :D


 Post subject: Re: Good overall performance indicator?
PostPosted: Mon Feb 13, 2006 5:38 pm 
Offline

Joined: Wed Sep 11, 2002 11:56 pm
Posts: 757
Location: EARTH.
Aris wrote:
narrasuj wrote:
I was thinking of how best to quantify performance, cost and heat output all in one number, just for fun. So to make sure I'm making sense, I thought I'd post it up on here and maybe try to find some numbers if I've got a chance to: 3DMark score (whichever version I can find for the diff cards I'll be looking at) over (cost multiplied by heat production in watts). So it'd be something like: score/($ x W). If that works, what cards would be interesting to see? If there's too many suggestions, I'll just take the first 5 or so.


you would need a way to "weight" each value depending on the individual. some people want the best framerate per watt and will pay anything to get it, and to those people the cost wouldn't mean so much in the equation.


something like this?

Code:
// weights must be positive -- dividing by zero is bad
if( CostFactor <= 0 || WattFactor <= 0 )
    return;

// weights must add up to one (compare with a tolerance, not ==)
if( fabs( CostFactor + WattFactor - 1.0 ) > 0.0001 )
    return;

OverallScore = BenchmarkCompositeScore / ( ( CostFactor * cost ) * ( WattFactor * wattage ) );


....and I guess the units would be Performance/Watt-Dollars? P/WD? :D Or if you had someone who didn't care about cost/heat, it'd be P/W or P/D. I only care about P/W, really.



 Post subject:
PostPosted: Mon Feb 13, 2006 5:40 pm 
Offline

Joined: Wed Sep 11, 2002 11:56 pm
Posts: 757
Location: EARTH.
pardon the semicolons, force of habit



 Post subject:
PostPosted: Mon Feb 13, 2006 7:02 pm 
Offline

Joined: Wed Aug 10, 2005 7:57 pm
Posts: 51
Location: United States
From what I can understand of that code, it's a sort of percentage multiplier for wattage and cost? If so, why not have a percentage for all three factors, so a person could rank which of the three matter to them and assign percentage values accordingly. If something like this were implemented in some reasonable fashion, I think it could be applied to processors as well, no?
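One way to let each person weight the factors is to use the weights as exponents, so a factor with weight 0 drops out entirely and weight 1 counts fully. This scheme is my own suggestion, not something specified in the thread, and the sample numbers are placeholders:

```python
# Weighted composite: score / (cost^w_cost * watts^w_watts).
# Exponent weights are an assumed design choice, not from the thread.
def weighted_score(score, cost, watts, w_cost=0.5, w_watts=0.5):
    if w_cost < 0 or w_watts < 0:
        raise ValueError("weights must be non-negative")
    return score / (cost ** w_cost * watts ** w_watts)

# Someone who only cares about performance per watt sets w_cost = 0:
print(weighted_score(5000, 200, 50, w_cost=0, w_watts=1))
```

With w_cost = 0 the price vanishes from the formula and the result is plain score-per-watt, which matches the "pay anything for framerate per watt" case mentioned above.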


 Post subject:
PostPosted: Tue Feb 14, 2006 8:05 am 
Offline

Joined: Mon Dec 15, 2003 10:29 am
Posts: 2299
Location: Bellevue, Nebraska
another thing to maybe add is a way to set cutoffs. like if someone wants all cards between $100 and $200, or all cards under 50 watts. and then have it list them within those parameters, weighted by their preferences. and then have the output be sortable by the different values so you can easily look at it from several angles.

prices may be difficult as they always change. so rather than real-world prices you could just put in the initial MSRP from either ati or nvidia for that card.

also, give the option to search by only nvidia cards, only ati cards, or both.

then throw it all up on a clean, easy-to-read website so we can plug in the numbers and see what comes out :)
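The filter-and-sort tool described above can be sketched in a few lines. The card entries are illustrative placeholders (MSRP is used instead of street price, as the post suggests; the scores and wattages are invented):

```python
# Sketch of the cutoff/filter/sort tool. Card data is placeholder only.
cards = [
    {"name": "x800xl",  "vendor": "ati",    "msrp": 199, "watts": 55, "score": 5000},
    {"name": "6600gt",  "vendor": "nvidia", "msrp": 149, "watts": 48, "score": 3800},
    {"name": "x1600xt", "vendor": "ati",    "msrp": 149, "watts": 42, "score": 3900},
]

def pick(cards, min_price=100, max_price=200, max_watts=50,
         vendor=None, sort_by="score"):
    # apply the price band, wattage cutoff and optional vendor filter
    hits = [c for c in cards
            if min_price <= c["msrp"] <= max_price
            and c["watts"] <= max_watts
            and (vendor is None or c["vendor"] == vendor)]
    # sortable by any column, best-first
    return sorted(hits, key=lambda c: c[sort_by], reverse=True)

for c in pick(cards, max_watts=50):
    print(c["name"])
```

Here the x800xl drops out on the 50 W cutoff even though it fits the price band, and the remaining cards come back sorted by score.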


 Post subject:
PostPosted: Tue Feb 14, 2006 10:58 am 
Offline

Joined: Wed Jan 18, 2006 4:57 am
Posts: 140
Location: France, Lyon
You have to put in the graph all the info needed by the reader. If price/power/performance are all of use, then you have to include all three.
3D graphs are quite hard to read, especially when you deal with discrete points rather than with continuous areas (like altitude on a 3D geographical map that you can draw with different colours).
On a 2D graph, if you keep the 3 parameters you have to weight them. The example 'score/($ x W)' suggested gives the same importance to price and power. Some people favour price, others power, so that's not a solution.
Narrasuj suggested a dynamic graph where readers can set the weighting between the 3 parameters themselves. That's significantly harder to implement on the web than a simple jpeg graph.
A solution would be to put only 2 parameters in the graph, and keep the third out of it.

Aris wrote:
prices may be difficult as they always change. so rather than real-world prices you could just put in the initial MSRP from either ati or nvidia for that card.

Good point, I remember seeing a graph of processors rated according to their price and performance. The graph was one year old and was sadly totally useless, as the Athlon, Duron, Pentium and Celeron got different price cuts during that time.
So putting prices in the graph would require regular updates, or it will become inaccurate after 6 months or so.

Performance and power consumption being the only 2 numbers that stay constant over the life of a graphics card, they are probably the easiest to put in the graph. You could then insert under the graph the lowest price of each card, hotlinked from a price-grabbing website.

But I keep writing and realize I have absolutely no will to do it myself. Has anyone volunteered yet, or do we have to choose a volunteer? :lol:


 Post subject:
PostPosted: Wed Feb 15, 2006 8:14 am 
Offline

Joined: Wed Sep 11, 2002 11:56 pm
Posts: 757
Location: EARTH.
Le_Gritche wrote:
You have to put in the graph all the info needed by the reader. If price/power/performance are all of use, then you have to include all three.


No, you only need the final output. Those three are all inputs--the only thing you'd be graphing is the composite score. So, a regular bar chart would work fine.

Quote:
Good point, I remember seeing a graph of processors rated according to their price and performance. The graph was one year old and was sadly totally useless, as the Athlon, Duron, Pentium and Celeron got different price cuts during that time.


This is also a good point--price isn't a constant, but wattage definitely is and performance (for the most part) doesn't change much.

Quote:
But I keep writing and realize I have absolutely no will to do it myself. Has anyone volunteered yet, or do we have to choose a volunteer? :lol:


the buck does stop there, doesn't it? heh. :D



 Post subject:
PostPosted: Wed Feb 15, 2006 8:36 am 
Offline

Joined: Tue Jun 21, 2005 12:45 pm
Posts: 62
Location: Sweden
Here's an overview chart of most of the numbers from X-bit labs (both from the latest article and some cards from the earlier articles; sources can be found in the Excel file):

Image

Here's the Excel file if you want to play with the figures:

http://www.wnd.se/htpc/Power-consumption-Graphics-cards.xls


 Post subject:
PostPosted: Wed Feb 15, 2006 10:23 am 
Offline

Joined: Mon Dec 15, 2003 10:29 am
Posts: 2299
Location: Bellevue, Nebraska
just realized the fastest cards i can fit in my system are a 6600gt or x1600xt. the x800xl is about 1/10" too long.

i just find it so hard to believe that the exact same card i bought over a year ago is still my best option from nvidia now. nvidia needs to hurry up and release the 7600gt. the 6600gt is a good card, but there's no way i'm going to build my next "upgrade" system and put the exact same video card in it.


 Post subject:
PostPosted: Wed Feb 15, 2006 2:13 pm 
Offline

Joined: Wed Sep 11, 2002 11:56 pm
Posts: 757
Location: EARTH.
Image

Basically, I went through X-bit labs' video card reviews and yanked out the 3DMark05 scores. They all happened to be done on the exact same machine (some Athlon64 4000+ rig), so it's pretty much apples to apples, assuming X-bit labs is consistent (they seem pretty "on the ball," so I'm assuming this is the case). In the xls files are hyperlinks to all the reviews that were referenced. I wouldn't mind some double-checking--it was rather confusing to find and reference all of the values.

There were two video cards I couldn't find scores for.

The clear winner is the S3 card, which pains me a bit because I despise S3. Their driver and compatibility issues of days past are enough to leave a bad taste in anyone's mouth, but perhaps it's time to let it go and give them another try. Two of the ATI cards also had very good performance-to-wattage ratios (X1800 XL and X1600 XT), so those might be good candidates for those of us who aren't S3 fans. :P

For nVidia, the highest P/W cards were the 7800 GT and the 6600 DDR2, although neither had a very impressive showing next to the S3/ATI cards.

This is all with a synthetic benchmark, so it may very well be that real gaming performance yields quite different results. Maybe it should have FarCry/Doom3 numbers as well...

Modified excel file
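The ranking procedure described above reduces to dividing each card's 3DMark05 score by its measured wattage and sorting. The numbers below are placeholders, not the actual X-bit labs figures:

```python
# Performance-per-watt ranking sketch. Scores and wattages are
# invented placeholders standing in for the spreadsheet data.
data = {
    "s3_chrome": (3000, 20),   # (3DMark05 score, watts)
    "x1800_xl":  (7800, 60),
    "7800_gt":   (7600, 65),
}

# sort cards by 3DMarks per watt, best-first
ranked = sorted(data.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for name, (score, watts) in ranked:
    print(f"{name}: {score / watts:.1f} 3DMarks/W")
```

This reproduces the effect noted in the post: a low-wattage card with a modest score can still top the P/W chart.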



 Post subject:
PostPosted: Wed Feb 15, 2006 2:24 pm 
Offline

Joined: Tue Jun 21, 2005 12:45 pm
Posts: 62
Location: Sweden
Updated with some performance-per-watt figures and a chart. Sources can be found in the Excel file, with comments on figures that deviate from the power consumption tests and benchmarks. I am not sure if I calculated everything correctly, so give me some input if you find anything that's off:

Image

The Excel file is updated, feel free to develop it; check the earlier link.


 Post subject:
PostPosted: Wed Feb 15, 2006 2:28 pm 
Offline

Joined: Wed Sep 11, 2002 11:56 pm
Posts: 757
Location: EARTH.
wow--surprising difference between 05 and 06. Thanks



 Post subject:
PostPosted: Wed Feb 15, 2006 3:15 pm 
Offline
*Lifetime Patron*

Joined: Mon May 23, 2005 7:05 am
Posts: 618
Location: State College, PA
impressive work guys - I was idly considering doing something similar so you've saved me the effort! Looks like a 6800 AGP is the best card for me (Skt A board) - it should comfortably whip the pants off my existing FX5200 :)


 Post subject:
PostPosted: Wed Feb 15, 2006 5:03 pm 
Offline

Joined: Tue Jan 04, 2005 4:02 pm
Posts: 1608
Location: United States
Two very informative charts, nice work guys! You should definitely post these in the VGA card power dissipation sticky.

_________________
Corsair Obsidian 650D | Seasonic X-650 | Gigabyte GA-990FXA-UD5 | Phenom II X4 955 | Noctua NH-D14 | 2x4GB Corsair DDR3-1600 | ASUS HD6950 DirectCU II 2GB | OCZ Vertex 2 120GB | 2x WD Green 1TB


Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group