New ATI X1800 series heat level (power draw)
Hey guys, I was wondering if anybody has seen anything regarding the heat level of the X1800 cards vs. the 7800 cards from nVidia.
It seems like the heat output from the 7800GT is pretty reasonable. I am kind of an ATI fan and would lean towards buying an X1800XL if it were the same price... but I will buy whichever one puts out less heat if there is a significant difference.
I don't know why nVidia can make a relatively cool running GPU but the nForce 4 NB is so damn hot......I am tired of waiting for crossfire boards to come out so I can get stock passive NB cooling (no heatpipes needed).
that's the icing on the cake for me.
looks like my next video card upgrade will be a 7800gt.
ati has really let me down with their new generation of video cards. they come late, they paper-launch with still nothing showing up for them on pricewatch, they're priced higher than their respective competition, and now they run a lot hotter than their respective competition.
slipknottin wrote: ATI's cards got spanked pretty badly in the performance test over on anandtech as well. Looks like a loser for ATI.

Quietly reducing the warranty to 1 year (from 3 years), and requiring you to register your product within 30 days of purchase to get warranty service, does not endear them to my heart either, esp. since I just bought a new card a few days after their new requirement came into effect.
Yup, I'm really disappointed as well. Lackluster performance and higher power consumption than G70... *sigh* I had high hopes for R520.
Maybe S3 will wow us with Chrome20, I think it's supposed to be available in the next month or two. And nVidia has been pretty solid lately, so hopefully they will repeat the NV40 -> G70 trend with G80 (i.e. better performance, lower power consumption).
stromgald wrote: Well, hopefully they'll keep selling cards somewhere if not to SPCR enthusiasts. They'll need more money to catch up with nVidia, especially with the way they're dropping.

To who, I wonder. Right now, R520 doesn't seem to have anything good going for it. I doubt it will appeal to the hardcore gamers and overclockers, because performance is less than spectacular compared to what nVidia currently has. From a quiet PC standpoint, a 7x00 series card would be a better choice because it has higher performance per watt. Heck, they don't even have the price advantage on nVidia! All that really leaves is the fanboys, who are dropping like flies thanks to ATi's paper launches and unannounced reduced warranties. It's really kind of sad.[/rant]
As for G72, is that basically a G70 shrunk to 90nm? If so, it sounds really interesting. Are they going to reduce the number of physical pipes and make it a midrange/low-end only GPU (kind of like the 6600s), or will it have as many pipes as G70 and be used for all their 7x00 series cards? Either way, I'd hope it offers very good performance per watt at a low price.
frostedflakes wrote: As for G72, is that basically a G70 shrunk to 90nm? ...

I think it's really a new chip.

Source: Anandtech:
The refresh for NVIDIA's G70 series of GPUs is due to launch early next year. The new GPU will be built on a 90nm manufacturing process (like the ATI R5xx chips) and will offer support for SLI on a single card.
I was expecting the X1800 with 512 MB to dissipate up to 120W. From the techreport diagram it seems to be only ~105W at full load. But it's unclear whether that figure is for the 256 MB or 512 MB card...
What one has to remember is that when not gaming, the X1800 dissipates 44W more than a 7800 GTX (and 61W more than a 7800 GT), so it could make your PSU fan ramp up while you're just reading SPCR articles.
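Just to put the idle-power gap in wall-socket terms, here's a rough sketch. Only the 44W/61W DC deltas come from the thread; the 80% PSU efficiency is an assumed round number, not a measurement:

```python
# Rough sketch: extra AC draw at the wall caused by a card's higher idle power.
# The DC deltas (44 W vs 7800 GTX, 61 W vs 7800 GT) come from the post above;
# the PSU efficiency is an assumption for illustration.

def wall_draw(dc_watts: float, efficiency: float) -> float:
    """Convert a DC load into AC draw at the wall for a given PSU efficiency."""
    return dc_watts / efficiency

PSU_EFFICIENCY = 0.80  # assumed, not measured

for label, delta in [("vs 7800 GTX", 44.0), ("vs 7800 GT", 61.0)]:
    extra_ac = wall_draw(delta, PSU_EFFICIENCY)
    print(f"X1800 idle delta {label}: +{delta:.0f} W DC -> +{extra_ac:.1f} W at the wall")
```

The point being: a DC delta gets magnified at the wall by the PSU's inefficiency, which is why an idle-hungry card can keep a thermally controlled PSU fan spinning faster.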
Thanks for the info!
Looks like a 7800GT for me.
As far as performance goes, the review at HardOCP showed the 7800GT and X1800XL pretty much trading blows, and the ATI card has a couple of extra options for image quality. If they were the same price and heat I would grab the ATI.
Looks like I'm switching to the green side, though...
dragmor wrote: The low level X1300 cards look good for a 2D desktop. 1xDual Link DVI and 1xSingle Link DVI connector, hopefully passively cooled.

There were stories last week at Hexus and the Inquirer that all the new ATI cards would have 2 dual link DVI ports. Did you find a review somewhere that says the low end cards have a single link port?
Looks like ATI either suffers from the leakage current of the 90nm process or hasn't implemented power-saving correctly yet. *sigh*
EDIT: here you can find X1300 power consumption figures. Idle is pretty bad as well.
http://www.computerbase.de/artikel/hard ... mverbrauch
anandtech also has figures of all 3 series.
http://www.anandtech.com/video/showdoc.aspx?i=2552&p=7
Last edited by jojo4u on Wed Oct 05, 2005 5:47 pm, edited 1 time in total.
Pgh wrote: There were stories last week at Hexus and the Inquirer that all the new ATI cards would have 2 dual link dvi ports. Did you find a review somewhere that says the low end cards have a single link port?

Yeah I had the 2 dual link, then edited it to 1 dual 1 single after reading anandtech's AVIVO article. Apparently only the X1800 has 2xDual. I think this is a big mistake on ATI's part.

http://www.anandtech.com/video/showdoc.aspx?i=2551
The Radeon X1800 series will support up to two dual-link DVI ports, while the X1600 and X1300 will support up to one dual-link and one single-link port.
frostedflakes wrote: To who, I wonder. Right now, R520 doesn't seem to have anything good going for it. I doubt it will appeal to the hardcore gamers and overclockers, because performance is less than spectacular compared to what nVidia currently has.

Well... it depends on your point of view. Let's put aside the power consumption (shame on you, ATI!) and look at computerbase.de. They activated the "high quality" option in the Forceware, which is comparable in quality to "A.I.-low". With AA+AF enabled the Radeon X1800XT is in front now. Today's mainstream and high-end cards have enough power for AA+AF; nobody with these cards should disable it.
The picture quality is better with ATI as well: more efficient 6x AA, better looking transparent AA, AA+HDR, and angle-independent AF. The last one is something many WoW players have been waiting for.
ATI is also faster in shader performance which will be more important over time.
Please don't use anandtech if you want an English reference. They again fail to use AA+AF as the default in every benchmark, and only cause confusion with their textual remarks about AA+AF performance. techreport.com seems to have a pretty nice review.
http://www.computerbase.de/artikel/hard ... benchmarks
http://techreport.com/reviews/2005q4/ra ... dex.x?pg=1
Last edited by jojo4u on Wed Oct 05, 2005 5:46 pm, edited 1 time in total.
dragmor wrote: Yeah I had the 2 dual link, then edited it to 1 dual 1 single after reading anandtech's AVIVO article. Apparently only the X1800 has 2xDual. I think this is a big mistake on ATI's part. ...

I think ATI probably had economic reasons for dropping the second dual link port on their low end cards. The number of people who actually need 2 DVI-DL ports has got to be very small... after all, how many people can afford two 30" Cinema Displays? Those that can will probably be able to afford the few hundred dollars extra for the X1800. And when you consider that the next cheapest alternative is a $1500 Quadro, the price of the X1800 seems cheap.
There's a preview of the X1800 XT and X1800 XL on ExtremeTech with benchmarks. Both X1800 cards were tested against the GeForce 7800GT and 7800GTX...
But the biggest challenge will be to find a way to silence these beasts...
After looking over some reviews, I'm rather disappointed by most of what ATI has put out. In all honesty the best product they have is the X1300; it blows the old low-end cards out of the water and would be an awesome solution for anyone who doesn't really need much 3D. It looks like it'll be a very quiet and passive card too. That assumes the prices for it drop into the $60-80 range relatively soon.
I'm highly disappointed in the X1600 though; it's barely able to surpass the 6600GT and sometimes even loses! That's rather sad for a card that's supposed to be a 12-pipe design clocked 90MHz higher than the 6600GT with far more memory bandwidth. Not to mention the power consumption is very high in that chart. It's giving me memories of the GeForce FX 5800/5600... (which barely beat the Ti 4x00 series)
The x1800 looks like it'll be horribly annoying to keep cool, but at least the performance is up to speed with the competition!
I think all this has done is make me even more interested in Nvidia's 90nm design (which will hopefully be the GeForce 7600... the next mid-range silent computing savior, I hope!).
these wattages are based upon inefficient power supplies. also, the more the gfx card processes, the more the cpu has to push. cpus have been maxed out by gfx cards since the 7800GT came on the scene.
i don't trust numbers until MikeC tests them.
However, you can judge relative power use; the absolute numbers don't really mean much.
7800GT seems to be best. however, i hear the image quality is below the ati series from the 9800 up to the x1800XT.
quality is everything to me. i like my eyes.
Framerates, power draw, shaders, all of that is crap if the picture looks fuzzy. I have a matrox 550 pci card in my machine that I switch to for 2d work, photos, writing my book, etc. the clarity is better, even compared to the 9800pro, which had a higher rating than nvidia for a while... so i wonder about clarity, but no one tests that.
~El~Jefe~ wrote: these wattages are based upon inefficient power supplies.

xbit has wattages measured from the 12v and 3.3v going to the PCIe 16 slot:
http://www.xbitlabs.com/articles/video/ ... 00_14.html
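For what it's worth, that measurement method is just per-rail current times rail voltage, summed. Here's a minimal sketch; the rail names and the current readings are hypothetical placeholders, not xbit's actual data:

```python
# Sketch of the per-rail measurement idea: read the current drawn on each
# supply rail feeding the card (slot rails plus the auxiliary PCIe plug),
# multiply by the nominal rail voltage, and sum to get DC power.
# All current figures here are made up for illustration.

def card_power(rail_amps: dict) -> float:
    """Total DC power in watts from per-rail current readings (rail name -> amps)."""
    volts = {"12V_slot": 12.0, "3.3V_slot": 3.3, "12V_aux": 12.0}
    return sum(volts[rail] * amps for rail, amps in rail_amps.items())

# Hypothetical readings, for illustration only:
readings = {"12V_slot": 2.1, "3.3V_slot": 1.5, "12V_aux": 5.0}
print(f"{card_power(readings):.1f} W total DC draw")
```

Because this is measured on the DC side, it sidesteps the PSU-efficiency objection above: the figure is what the card itself draws, regardless of which PSU feeds it.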
~El~Jefe~ wrote: these wattages are based upon inefficient power supplies. also, the more the gfx card processes, the more the cpu has to push. ...

Someone correct me if I'm wrong, but from what I understand, IQ is not so much of a problem with DVI. Whereas quality varies a lot with analog VGA output from card to card, there is not so much variation with DVI.

So I don't think that there would really be much difference in IQ between ATi and nVidia when using DVI output on a true DVI monitor.
dragmor wrote: xbit has wattages measured from the 12v and 3.3v going to the PCIe 16 slot
http://www.xbitlabs.com/articles/video/ ... 00_14.html

what I mean is this:
they didn't isolate the cards from the main power draw. AND, they didn't tell you the psu they used. a 74% efficient psu at that draw will be a lot more noticeable when you aren't at idle in a game than, say, my 88% phantom 350.
some gfx cards with some games will cause the cpu to get bogged down more, so it's really hard to say that one gfx card actually uses more wattage than another. relative wattage usage does, of course, make more sense in my mind for judging both spcr-like quality of a product and psu recommendations.
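To put numbers on that efficiency point, here's a minimal sketch. Only the 74% and 88% efficiency figures come from the post; the DC loads are made-up illustrative values:

```python
# Sketch: the same DC load pulled through a 74%-efficient PSU vs an
# 88%-efficient one draws noticeably different power from the wall,
# and the gap grows as the load rises (e.g. idle vs in-game).
# The DC load values below are illustrative, not measured.

def wall_watts(dc_load: float, efficiency: float) -> float:
    """AC power drawn from the wall for a given DC load and PSU efficiency."""
    return dc_load / efficiency

for dc_load in (100.0, 250.0):  # roughly idle-ish vs gaming-ish system loads
    low_eff = wall_watts(dc_load, 0.74)
    high_eff = wall_watts(dc_load, 0.88)
    print(f"{dc_load:.0f} W DC: {low_eff:.0f} W vs {high_eff:.0f} W at the wall "
          f"({low_eff - high_eff:.0f} W difference)")
```

The extra wall draw all becomes heat inside the PSU, which is exactly why the efficiency difference is "a lot more noticeable" under gaming load than at idle.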