New ATI X1800 series heat level (power draw)

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

afrost
Posts: 141
Joined: Thu May 01, 2003 9:42 am

New ATI X1800 series heat level (power draw)

Post by afrost » Wed Oct 05, 2005 7:55 am

Hey guys, I was wondering if anybody has seen anything regarding the heat level of the X1800 cards vs. the 7800 cards from nVidia.

It seems like the heat output from the 7800GT is pretty reasonable. I am kind of an ATI fan and would lean towards buying an X1800XL if it were the same price... but I will buy whichever one puts out less heat if there is a significant difference.

I don't know why nVidia can make a relatively cool-running GPU but the nForce4 NB is so damn hot... I am tired of waiting for CrossFire boards to come out so I can get stock passive NB cooling (no heatpipes needed).

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Wed Oct 05, 2005 8:14 am

A picture is worth a thousand words:
[image: power consumption chart]
Testbed:
Athlon 64 X2 4800+
ATI XFire reference board (for ATI cards)
OCZ EL 1GB DDR400
Maxtor DiamondMax 10 250GB

We have a new power-hog king :P

Aris
Posts: 2299
Joined: Mon Dec 15, 2003 10:29 am
Location: Bellevue, Nebraska

Post by Aris » Wed Oct 05, 2005 8:36 am

That's the icing on the cake for me.

Looks like my next video card upgrade will be a 7800GT.


ATI has really let me down with their new generation of video cards. They came late, they paper-launched with still nothing showing up on Pricewatch, they're priced higher than the competition, and now they run a lot hotter than the competition too.

slipknottin
Posts: 235
Joined: Tue Jan 18, 2005 7:55 pm

Post by slipknottin » Wed Oct 05, 2005 8:43 am

ATI's cards got spanked pretty badly in the performance test over on anandtech as well. Looks like a loser for ATI.

lenny
Patron of SPCR
Posts: 1642
Joined: Wed May 28, 2003 10:50 am
Location: Somewhere out there

Post by lenny » Wed Oct 05, 2005 8:54 am

slipknottin wrote:ATI's cards got spanked pretty badly in the performance test over on anandtech as well. Looks like a loser for ATI.
Quietly reducing warranty to 1 year (from 3 years), and requiring you to register your product within 30 days of purchase to get warranty service, does not endear them to my heart either, esp. since I just bought a new card a few days after their new requirement came into effect.

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Wed Oct 05, 2005 8:55 am

Maybe the R580 will be better :roll:

frostedflakes
Posts: 1608
Joined: Tue Jan 04, 2005 4:02 pm
Location: United States

Post by frostedflakes » Wed Oct 05, 2005 9:53 am

Yup, I'm really disappointed as well. Lackluster performance and higher power consumption than G70... *sigh* I had high hopes for R520.

Maybe S3 will wow us with Chrome20; I think it's supposed to be available in the next month or two. And nVidia has been pretty solid lately, so hopefully they will repeat the NV40 -> G70 trend with G80 (i.e. better performance, lower power consumption).

stromgald
Posts: 887
Joined: Wed Jun 09, 2004 12:45 pm
Location: California, US

Post by stromgald » Wed Oct 05, 2005 10:04 am

Well, hopefully they'll keep selling cards somewhere if not to SPCR enthusiasts. They'll need more money to catch up with nVidia, especially with the way they're dropping. :(

Doomer
Posts: 69
Joined: Mon Dec 23, 2002 11:44 pm
Location: Finland

Post by Doomer » Wed Oct 05, 2005 10:29 am

Funny how fast things change. It wasn't that long ago that Nvidia had mediocre performance and very high power consumption. As I understand it, the R520 was designed at ATI's east-coast office (Marlborough), while the ArtX team that made the wonderful R300 is designing the R600.

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Wed Oct 05, 2005 11:02 am

And nVIDIA is going to launch the G72 at the beginning of 2006, before the G80. Those will be 90nm parts. Perhaps the mid/low-end cards? Looking good, nVIDIA.

frostedflakes
Posts: 1608
Joined: Tue Jan 04, 2005 4:02 pm
Location: United States

Post by frostedflakes » Wed Oct 05, 2005 11:53 am

stromgald wrote:Well, hopefully they'll keep selling cards somewhere if not to SPCR enthusiasts. They'll need more money to catch up with nVidia, especially with the way they're dropping. :(
To whom, I wonder. Right now, R520 doesn't seem to have anything good going for it. I doubt it will appeal to hardcore gamers and overclockers, because performance is less than spectacular compared to what nVidia currently has. From a quiet-PC standpoint, a 7x00 series card would be a better choice because it has higher performance per watt. Heck, they don't even have a price advantage over nVidia! All that really leaves is the fanboys, who are dropping like flies thanks to ATi's paper launches and unannounced reduced warranties. It's really kind of sad.[/rant]

As for G72, is that basically a G70 shrunk to 90nm? If so, it sounds really interesting. Are they going to reduce the number of physical pipes and make it a midrange/low-end-only GPU (kind of like the 6600s), or will it have as many pipes as G70 and be used for all their 7x00 series cards? Either way, I'd hope it would offer very good performance per watt at a low price.

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Wed Oct 05, 2005 12:29 pm

frostedflakes wrote:As for G72, is that basically a G70 shrunk to 90nm? If so, it sounds really interesting. Are they going to reduce the number of physical pipes and make it a midrange/low-end-only GPU (kind of like the 6600s), or will it have as many pipes as G70 and be used for all their 7x00 series cards? Either way, I'd hope it would offer very good performance per watt at a low price.
Source: Anandtech
I think it's really a new chip.
The refresh for NVIDIA's G70 series of GPUs is due to launch early next year. The new GPU will be built on a 90nm manufacturing process (like the ATI R5xx chips) and will offer support for SLI on a single card

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Wed Oct 05, 2005 12:55 pm

I was expecting the X1800 with 512 MB to dissipate up to 120W. From the TechReport diagram it seems to be only ~105W at full load, but it's unclear whether that figure is for the 256 MB or the 512 MB card...
What one has to remember is that when not gaming, the X1800 dissipates 44W more than a 7800 GTX (and 61W more than a 7800 GT), so it could make your PSU fan ramp up while you're just reading SPCR articles ;-)

afrost
Posts: 141
Joined: Thu May 01, 2003 9:42 am

Post by afrost » Wed Oct 05, 2005 1:33 pm

Thanks for the info!

Looks like a 7800GT for me.

As far as performance goes, the review at HardOCP showed that the 7800GT and X1800XL pretty much traded blows, and the ATI card has a couple of extra options for image quality. If they were the same price and heat I would grab the ATI.

Looks like I'm switching to the green side though...

rpsgc
Friend of SPCR
Posts: 1630
Joined: Tue Oct 05, 2004 1:59 am
Location: Portugal

Post by rpsgc » Wed Oct 05, 2005 1:57 pm

frostedflakes: I saw some info on the G72. It's nothing official, but it seems that it's the "7600" with 12 pipelines, and there's also going to be a G74(?) with 8 pipelines --> "7200".

Once again, nothing official.

dragmor
Posts: 301
Joined: Sun Jul 10, 2005 7:54 pm
Location: Oz

Post by dragmor » Wed Oct 05, 2005 3:37 pm

The low-end X1300 cards look good for a 2D desktop: 1x dual-link DVI and 1x single-link DVI connector, hopefully passively cooled.

Pgh
Posts: 176
Joined: Sat May 24, 2003 6:25 pm

Post by Pgh » Wed Oct 05, 2005 5:13 pm

dragmor wrote:The low-end X1300 cards look good for a 2D desktop: 1x dual-link DVI and 1x single-link DVI connector, hopefully passively cooled.
There were stories last week at Hexus and the Inquirer that all the new ATI cards would have two dual-link DVI ports. Did you find a review somewhere that says the low-end cards have a single-link port?

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Wed Oct 05, 2005 5:15 pm

Looks like ATI either suffers from the leakage current of the 90nm process or hasn't implemented power-saving correctly yet. *sigh*

EDIT: Here you can find X1300 power consumption figures; idle is pretty bad as well.
http://www.computerbase.de/artikel/hard ... mverbrauch

Anandtech also has figures for all three series.
http://www.anandtech.com/video/showdoc.aspx?i=2552&p=7
Last edited by jojo4u on Wed Oct 05, 2005 5:47 pm, edited 1 time in total.

dragmor
Posts: 301
Joined: Sun Jul 10, 2005 7:54 pm
Location: Oz

Post by dragmor » Wed Oct 05, 2005 5:31 pm

Pgh wrote:
dragmor wrote:The low-end X1300 cards look good for a 2D desktop: 1x dual-link DVI and 1x single-link DVI connector, hopefully passively cooled.
There were stories last week at Hexus and the Inquirer that all the new ATI cards would have two dual-link DVI ports. Did you find a review somewhere that says the low-end cards have a single-link port?
Yeah, I originally had it as 2x dual-link, then edited it to 1 dual + 1 single after reading Anandtech's AVIVO article. Apparently only the X1800 has 2x dual-link. I think this is a big mistake on ATI's part.

http://www.anandtech.com/video/showdoc.aspx?i=2551
The Radeon X1800 series will support up to two dual-link DVI ports, while the X1600 and X1300 will support up to one dual-link and one single-link port.

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Wed Oct 05, 2005 5:36 pm

frostedflakes wrote:To whom, I wonder. Right now, R520 doesn't seem to have anything good going for it. I doubt it will appeal to hardcore gamers and overclockers, because performance is less than spectacular compared to what nVidia currently has.
Well... it depends on your point of view. Let's put aside the power consumption (shame on you, ATI!) and look at computerbase.de. They activated the "high quality" option in the Forceware, which is comparable in quality to "A.I. low". With AA+AF enabled the Radeon X1800 XT is in front now. Today's mainstream and high-end cards have enough power for AA+AF; nobody with these cards should disable it.
The picture quality is better with ATI as well: more efficient 6x AA, better-looking transparent AA, AA+HDR and angle-independent AF. The last one is something many WoW players have been waiting for.
ATI is also faster in shader performance, which will become more important over time.
Please don't use Anandtech if you want an English-language reference. They again fail to use AA+AF as the default in every benchmark and only cause confusion with the textual information about AA+AF performance. techreport.com seems to have a pretty nice review.

http://www.computerbase.de/artikel/hard ... benchmarks
http://techreport.com/reviews/2005q4/ra ... dex.x?pg=1
Last edited by jojo4u on Wed Oct 05, 2005 5:46 pm, edited 1 time in total.

mathias
Posts: 2057
Joined: Sun Jul 11, 2004 3:58 pm
Location: Toronto

Post by mathias » Wed Oct 05, 2005 5:41 pm

jojo4u wrote:I hope nobody is stupid enough to play without AA+AF nowadays.
I am.

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Wed Oct 05, 2005 5:49 pm

mathias wrote:
jojo4u wrote:I hope nobody is stupid enough to play without AA+AF nowadays.
I am.
Ok, you got me. Corrected. I originally thought about writing it less polemically but failed to do so.
For example, I have a Radeon 9700, which is a bit underpowered for 6x AA ;)

Pgh
Posts: 176
Joined: Sat May 24, 2003 6:25 pm

Post by Pgh » Wed Oct 05, 2005 5:53 pm

dragmor wrote:...
Yeah, I originally had it as 2x dual-link, then edited it to 1 dual + 1 single after reading Anandtech's AVIVO article. Apparently only the X1800 has 2x dual-link. I think this is a big mistake on ATI's part.

http://www.anandtech.com/video/showdoc.aspx?i=2551
The Radeon X1800 series will support up to two dual-link DVI ports, while the X1600 and X1300 will support up to one dual-link and one single-link port.
I think ATI probably had economic reasons for dropping the second dual-link port on their low-end cards. The number of people who actually need two DVI-DL ports has got to be very small... after all, how many people can afford two 30" Cinema Displays? Those who can will probably be able to afford the few hundred dollars extra for the X1800. And when you consider that the next cheapest alternative is a $1500 Quadro, the price of the X1800 seems cheap.
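
For anyone wondering what the link types actually buy you: a single TMDS link tops out at a 165 MHz pixel clock, i.e. about 165 Mpixels/s, which covers 1920x1200 at 60Hz but not a 30" panel at 2560x1600, hence the dual-link requirement. A rough back-of-the-envelope sketch in Python; the ~8% blanking overhead is my own round number, real CVT reduced-blanking timings differ slightly:

# Sketch: which display modes fit on a single DVI link.
SINGLE_LINK_LIMIT = 165e6   # pixels/s on one TMDS link (165 MHz pixel clock)
BLANKING_OVERHEAD = 1.08    # assumed; exact CVT-RB timings vary a bit

def pixel_clock(width, height, refresh_hz):
    """Approximate pixel clock a mode needs, including blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD

for mode in [(1920, 1200, 60), (2560, 1600, 60)]:
    clk = pixel_clock(*mode)
    verdict = "single-link is fine" if clk <= SINGLE_LINK_LIMIT else "needs dual-link"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clk / 1e6:.0f} Mpx/s -> {verdict}")

So the X1600/X1300 port layout only hurts you if you are driving two 2560x1600 panels at once.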

Slaugh
Posts: 774
Joined: Thu Dec 02, 2004 2:27 am
Location: Quebec, Canada

Post by Slaugh » Wed Oct 05, 2005 6:10 pm

There's a preview of the X1800 XT and X1800 XL on ExtremeTech with benchmarks. Both X1800 cards were tested against the GeForce 7800 GT and 7800 GTX...

But the biggest challenge will be to find a way to silence these beasts...

merlin
Friend of SPCR
Posts: 717
Joined: Mon Jun 13, 2005 6:48 am
Location: San Francisco, CA

Post by merlin » Wed Oct 05, 2005 7:32 pm

After looking over some reviews, I'm rather disappointed by most of what ATI has put out. In all honesty the best product they have is the X1300: it blows the old low-end cards out of the water and would be an awesome solution for anyone who doesn't really need much 3D. It looks like it'll be a very quiet and passive card too. That assumes the prices for it come down into the $60-80 range relatively soon, though.

I'm highly disappointed in the X1600, though; it's barely able to surpass the 6600GT and sometimes even loses! That's rather sad for a card that's supposed to be a 12-pipe design clocked 90MHz higher than the 6600GT with far more memory bandwidth. Not to mention the power consumption is very high on that chart. It's giving me memories of the GeForce FX 5800/5600... (which barely beat the Ti 4x00 series)

The X1800 looks like it'll be horribly annoying to keep cool, but at least the performance is up to speed with the competition!

All this has really done is make me even more interested in Nvidia's upcoming 90nm design (which will hopefully be the GeForce 7600... the next mid-range silent-computing savior, I hope!).

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Thu Oct 06, 2005 9:20 am

According to X-bit labs, the X1800 XT tops out at 112W, and that is for the 512 MB version. Someone rich should establish a prize for passively cooling this beast :lol:
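
To put that 112W in perspective, here is a minimal sketch of the heatsink you'd need; the 95C core limit and 45C in-case air temperature are my own assumptions, not figures from X-bit:

# Sketch: maximum allowed heatsink thermal resistance for a passive GPU cooler.
def required_thermal_resistance(power_w, t_core_max_c=95.0, t_case_air_c=45.0):
    """Max resistance (C/W) = allowed temperature rise / dissipated power."""
    return (t_core_max_c - t_case_air_c) / power_w

for watts in (60, 112):
    print(f"{watts} W -> heatsink must be <= {required_thermal_resistance(watts):.2f} C/W")

Roughly 0.45 C/W at 112W, with no direct airflow, is a very tall order for any passive sink, so that prize money would be well earned.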

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz

Post by ~El~Jefe~ » Thu Oct 06, 2005 11:13 am

These wattages are based upon inefficient power supplies. Also, the more the graphics card processes, the more the CPU has to push; CPUs have been getting maxed out by graphics cards since the 7800GT came on the scene.

I don't trust numbers until MikeC tests them.

You can judge relative use of power, however; the absolute numbers really do not mean much.
The 7800GT seems to be the best. However, I hear the image quality is below that of the ATI series from the 9800 through the X1800 XT.

Quality is everything to me. I like my eyes.

Framerates, power draw, shaders, all of that is crap if the picture looks fuzzy. I have a Matrox 550 PCI card in my machine that I switch to for 2D work, photos, writing my book, etc. The clarity is better, even compared to the 9800 Pro, which had a higher rating than nVidia for a while... so I wonder about clarity, but no one tests that.

dragmor
Posts: 301
Joined: Sun Jul 10, 2005 7:54 pm
Location: Oz

Post by dragmor » Thu Oct 06, 2005 2:58 pm

~El~Jefe~ wrote:These wattages are based upon inefficient power supplies.
X-bit has wattages measured from the 12V and 3.3V lines going to the PCIe x16 slot
http://www.xbitlabs.com/articles/video/ ... 00_14.html
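
In other words, they sum volts times amps on every line feeding the card, so the PSU's efficiency never enters into the card figure. A tiny sketch of that kind of measurement; the current readings below are made-up placeholders, not X-bit's data:

# Sketch: card power as the sum of V*I over each supply line feeding it.
def card_power(readings):
    """readings: iterable of (volts, amps) pairs, one per supply line."""
    return sum(volts * amps for volts, amps in readings)

example = [
    (12.0, 4.5),   # 12V through the PCIe x16 slot (hypothetical current)
    (3.3, 1.0),    # 3.3V through the slot (hypothetical current)
    (12.0, 4.0),   # 12V on the external PCIe power plug (hypothetical current)
]
print(f"card draw ~ {card_power(example):.0f} W")   # ~105 W with these made-up numbers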

frostedflakes
Posts: 1608
Joined: Tue Jan 04, 2005 4:02 pm
Location: United States

Post by frostedflakes » Thu Oct 06, 2005 5:54 pm

~El~Jefe~ wrote:These wattages are based upon inefficient power supplies. Also, the more the graphics card processes, the more the CPU has to push; CPUs have been getting maxed out by graphics cards since the 7800GT came on the scene.

I don't trust numbers until MikeC tests them.

You can judge relative use of power, however; the absolute numbers really do not mean much.
The 7800GT seems to be the best. However, I hear the image quality is below that of the ATI series from the 9800 through the X1800 XT.

Quality is everything to me. I like my eyes.

Framerates, power draw, shaders, all of that is crap if the picture looks fuzzy. I have a Matrox 550 PCI card in my machine that I switch to for 2D work, photos, writing my book, etc. The clarity is better, even compared to the 9800 Pro, which had a higher rating than nVidia for a while... so I wonder about clarity, but no one tests that.
Someone correct me if I'm wrong, but from what I understand, IQ is not so much of a problem with DVI. Whereas quality varies a lot with analog VGA output from card to card, there is not so much variation with DVI.

So I don't think that there would really be much difference in IQ between ATi and nVidia when using DVI output on a true DVI monitor.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Thu Oct 06, 2005 8:34 pm

dragmor wrote:
~El~Jefe~ wrote:These wattages are based upon inefficient power supplies.
X-bit has wattages measured from the 12V and 3.3V lines going to the PCIe x16 slot
http://www.xbitlabs.com/articles/video/ ... 00_14.html
What I mean is this:

They didn't isolate the cards from the main power draw, AND they didn't tell you which PSU they used. A 74% efficient PSU at that draw will be a lot more noticeable when you aren't at idle but in a game than, say, my 88% efficient Phantom 350.

Some graphics cards with some games will cause the CPU to get bogged down more. It's really hard to say that one graphics card actually uses that much more wattage than another. Relative wattage usage does, of course, make more sense in my mind for judging both the SPCR-like quality of a product and PSU recommendations.
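
For what it's worth, the efficiency point in numbers: a wall-socket meter reads DC draw divided by PSU efficiency, so the same system looks hungrier behind a 74% unit than behind an 88% one, and the gap widens under load. The 150W idle and 280W gaming DC figures below are assumptions for illustration only:

# Sketch: wall draw vs. DC draw for two PSU efficiencies.
def wall_draw(dc_watts, efficiency):
    """AC power pulled from the socket for a given DC load."""
    return dc_watts / efficiency

for label, dc in [("idle", 150), ("gaming", 280)]:
    at_74 = wall_draw(dc, 0.74)
    at_88 = wall_draw(dc, 0.88)
    print(f"{label}: {dc} W DC -> {at_74:.0f} W at the wall (74%) "
          f"vs {at_88:.0f} W (88%), a {at_74 - at_88:.0f} W gap")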
