HD 2900XT power consumption
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
According to this review at vr-zone, at load it draws 17W more than an 8800 GTX: http://www.vr-zone.com/?i=4946&s=9
There is a rumor that newer drivers (8.38 beta, IIRC) are improving performance, raising it close to 8800 GTX level.
Here is a link to the power consumption results at guru3d; it says 29W more than an 8800 Ultra:
http://www.guru3d.com/article/Videocards/431/13/
The cooler alone uses 25W when operating at 100%! The exhausted air must be scorching!
Oh, and the card is VERY noisy. So it's a no-no for SPCR folk...
Here you can see the Fujikura-made heatsink of the HD2900XT: http://www.vr-zone.com/?i=4971
So instead of a solid base it has a vapour chamber that contacts the heatpipes?
It would work better if the heatpipes were directly connected to the vapour chamber, but that design is patented AFAIK.
Here's a table showing power consumption of the whole rig at the wall:
Red is load, grey is idle. The rig is an X6800 @ 2.93 GHz, 2 GB RAM, 1 HDD, TrioPower 550 PSU. Go do the math.
Here's what it sounds like:
Ambient is 42, red is load, grey is idle. Sure, these are just ballpark figures, and the 10 cm measuring distance is plain funny, but it pretty much confirms this isn't a card for SPCR readers. As if the power consumption alone didn't say enough.
Pics courtesy of PCLab.pl
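For anyone who doesn't want to do that math by hand, here's a rough sketch of turning those at-the-wall readings into a card-only estimate. The 80% PSU efficiency is my assumption for the TrioPower 550 at these loads, not a measured figure, and the wall readings in the example are made up:

```python
# Estimate the DC-side power difference between two cards from
# at-the-wall (AC) readings. PSU_EFF is an assumed ~80% efficiency
# for the TrioPower 550 at these loads - a guess, not a measurement.
PSU_EFF = 0.80

def card_delta_watts(wall_with_card_a, wall_with_card_b, eff=PSU_EFF):
    """DC power difference implied by two wall (AC) measurements."""
    return (wall_with_card_a - wall_with_card_b) * eff

# Hypothetical wall readings under load, 330W vs 290W:
print(round(card_delta_watts(330, 290), 1))  # prints 32.0
```

So a 40W gap at the wall is roughly a 32W gap at the card, given that efficiency assumption.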
http://www.xbitlabs.com/articles/video/ ... re_14.html
http://www.xbitlabs.com/misc/picture/?s ... ll.gif&1=1
Video card only, not the whole system: 71.7W at idle, 75.8W peak 2D, 161.1W peak 3D, per Xbit Labs.
By comparison, the 8800 GTX's peak 3D draw is 131.5W.
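Putting the Xbit Labs numbers side by side, the peak 3D gap works out like this (a trivial sketch using only the figures quoted above):

```python
# Xbit Labs card-only measurements quoted above (watts)
hd2900xt = {"idle": 71.7, "peak_2d": 75.8, "peak_3d": 161.1}
gtx_peak_3d = 131.5

delta = round(hd2900xt["peak_3d"] - gtx_peak_3d, 1)
print(delta)  # prints 29.6 - about 30W more than the 8800 GTX at peak 3D
```

Which also lines up with the ~29W-over-the-Ultra figure guru3d measured.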
Rumor is that the 80 nm process at TSMC is leaky, and some HD2900XT chips are even leakier than others.
Which could mean that one may be lucky and draw only 20W more than an 8800 GTX, or unlucky and draw 40W more...
Unrelated to heat and noise: I believe that to improve performance, the shaders for each game - not just the standard driver parts - will have to be rewritten.
The reason is that shader performance is balanced differently than on the 8800 GTX, so if existing games use shaders optimized for the 8800 GTX, they will underperform on the HD2900XT. Of course, this may lead to image differences between the two cards.