ATI's answer to NVidia 8800: R600

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

rickster
Posts: 14
Joined: Thu Dec 07, 2006 4:40 pm

Post by rickster » Fri Mar 02, 2007 1:22 am

[H]ardOCP says the R600 will draw ~300W

good luck trying to make a silent PC with that kind of heat

Felger Carbon
Posts: 2049
Joined: Thu Dec 15, 2005 11:06 am
Location: Klamath Falls, OR

Post by Felger Carbon » Fri Mar 02, 2007 8:47 am

According to Madshrimps today, the R600 single card will pull 300W +/- 9%! The two-card CrossFire setup doubles that. Gonna make for some warm and cheery winter evenings... next winter, that is, on account of it's still late late late. :oops:

qviri
Posts: 2465
Joined: Tue May 24, 2005 8:22 pm
Location: Berlin
Contact:

Post by qviri » Fri Mar 02, 2007 1:38 pm

qviri wrote:hai guys, remember that 8800 GTX that was supposed to draw 300 watts?

rickster
Posts: 14
Joined: Thu Dec 07, 2006 4:40 pm

Post by rickster » Fri Mar 02, 2007 6:30 pm

qviri wrote:
qviri wrote:hai guys, remember that 8800 GTX that was supposed to draw 300 watts?
hai

were the numbers about the power draw straight from nvidia?

[H]ardOCP says

"I am not using Dailytech as a source, but rather ATI documentation. And by the way, I own this site and reported the information. Dailytech and the Inq report as much right as they do wrong, so using them to hold up your arguments does not mean much to me."

Do you not trust ATI documentation?

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Sat Mar 03, 2007 12:00 am

rickster wrote:Do you not trust ATI documentation?
Depends what the documentation actually says, and how badly it's been misinterpreted in the writing of that article.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Sun Mar 04, 2007 8:36 pm

it won't pull that much :) I have seen 240 watts on inquirer.net for a long time now.

supposedly it is asking for 240 watts as well... That means if you have a crappy, inefficient PSU and a hot case with poor front air intakes, you'll possibly see the full 240 watts sucked up. I bet it is more like 180 watts at max load in an efficient and well-cooled system. It would be death to the company to run near 600 watts; not many would want it, truthfully. Another 100 watts for your gaming is acceptable to most, but needing a 900+ watt rated PSU is something not many have or would bother getting and installing.
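For what it's worth, the wall-versus-DC arithmetic works out like this (a minimal sketch; the efficiency figures are illustrative assumptions, not measurements):

```python
# Rough sketch of wall draw vs. the card's rated DC draw.
# Efficiency figures are illustrative assumptions, not measurements.
def wall_draw(dc_load_w: float, psu_efficiency: float) -> float:
    """AC power pulled from the outlet for a given DC load."""
    return dc_load_w / psu_efficiency

for eff in (0.70, 0.85):
    print(f"{eff:.0%} PSU: {wall_draw(240, eff):.0f} W at the wall")
# 70% PSU: 343 W at the wall
# 85% PSU: 282 W at the wall
```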

andyb
Patron of SPCR
Posts: 3307
Joined: Wed Dec 15, 2004 12:00 pm
Location: Essex, England

Post by andyb » Fri Mar 16, 2007 1:27 pm

I sure as hell hope that the info in the link below is true.

The Inquirer have come up with some brilliant and stunning information in the past, and are right far more often than they are wrong, especially with leaks. I am pinning my hopes on this one.

http://www.theinquirer.net/default.aspx?article=38292

It could also explain why there are 2 PCB revisions, and why the bigger of the two is OEM-only: maybe those are the 80nm ones, and the rest are 65nm. I can't find the article myself; feel free to have a look around their website. http://uk.theinquirer.net/


Andy

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Fri Mar 16, 2007 1:39 pm

yes, I just read this.

fools will shell out for the 80nm r600. ew.

terrible card.

65nm, mmmmmmmmmm, that's hawt. Take that with slightly reduced RAM and about a 10% underclock on memory speed, and there will be an All-In-Wonder single-slot solution like the x1900 AIW that I own... mmmm...

my card uses very little wattage for its computing power.

rei
Posts: 967
Joined: Wed Dec 08, 2004 11:36 am

Post by rei » Fri Mar 16, 2007 6:10 pm

jazkat: for a first post, that was utterly inane. we don't want you if you're going to spew run-on sentences full of fanboy-crap.

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Sat Mar 17, 2007 2:28 am

I'm having problems believing the 65nm stories. Or rather I'm having problems believing that R600 will be 65nm and be available in the next six months. A change from 80nm to 65nm isn't the sort of thing one can do overnight just by changing a dial on the fab. In effect what's being announced is that R600 has been canned and they're launching R650 (or whatever the Q3'07 refresh was going to be called). This all sounds like wishful thinking to me.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Sat Mar 17, 2007 2:48 am

IMO, by using the 65 nm process instead of 80 nm, the power would drop by about 20%, making the card manageable at lower noise.
But this could happen only if ATI had another team working on the 65 nm version, maybe getting some help from AMD engineers and testing facilities.
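That ~20% estimate is consistent with first-order dynamic power scaling, P ≈ C·V²·f. A minimal sketch, with the capacitance and voltage scaling factors as rough assumptions rather than ATI data:

```python
# First-order dynamic power scaling for a die shrink: P ~ C * V^2 * f.
# Scaling factors are rough assumptions for illustration.
def shrink_power(p_old_w: float, cap_scale: float, v_scale: float) -> float:
    """Estimated dynamic power after a shrink, at the same clock."""
    return p_old_w * cap_scale * v_scale**2

# Capacitance scaling with feature size alone (65/80):
print(f"{shrink_power(240, 65/80, 1.0):.0f} W")        # ~195 W, about -19%
# With a small core-voltage drop too (e.g. 1.20 V -> 1.15 V):
print(f"{shrink_power(240, 65/80, 1.15/1.20):.0f} W")  # ~179 W, about -25%
```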

nutball
*Lifetime Patron*
Posts: 1304
Joined: Thu Apr 10, 2003 7:16 am
Location: en.gb.uk

Post by nutball » Sat Mar 17, 2007 4:48 am

Well it's quite likely that work on a 65nm version will have been going on for a fair while now, in parallel to the 80nm version. That sort of parallel development is all quite normal by my understanding.

What's bugging me isn't so much the design as the manufacturing. There have been no concrete rumours that AMD would be making these things in their own fabs (given that they're heavily supply-constrained on their CPUs, wasting fab capacity to build GPUs would be nigh-on suicidal, I think). So the question is: is TSMC up to building such a large chip on their 65nm process? All the stuff I'm reading elsewhere suggests that it isn't at the moment, and won't be until ~Q3 this year. So we're into the realm of speculating that some miracle has occurred which will allow volume availability 3-4 months before it was initially expected.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Sat Mar 17, 2007 5:36 am

If the testing was done at TSMC, then no, the 65 nm yields on such a large chip would be crappy at this moment.
But maybe ATI tested the 65 nm chip at Dresden, got lucky there, and managed to 'transplant' the manufacturing settings to TSMC?
It seems that there will be some 80 nm R600s in the channel; the ones at CeBIT are probably still 80 nm. Unlucky those who'll buy them...

ronrem
Posts: 1066
Joined: Sun Jan 16, 2005 2:59 am
Location: Santa Cruz

Post by ronrem » Tue Mar 20, 2007 5:55 am

The 610 "Antelope cards will draw 25 w and be passive. There are pics of an MSI with nice looking heatpipes and HDMI. 610 "Falcons" will take 35 w use fans (so far)

The heavyweight is the RV630 " Kohinoor" with 512 mb of GDDR-4 ram,Vivo,dual dual link DVI,HDMI and a draw of 128 W.

The low end 630 "Sefadu" uses DDR2 and sucks less than 75 w

All are DX10 and 65 nm. I don't know what's up with an alleged giant 300 w card,hoax? real? insane?

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Tue Mar 20, 2007 8:44 am

Latest info on R600 from vr-zone:

Just a recap of what we have told you before: the R600XTX retail card is codenamed Dragonhead 2. It is a 12-layer card at 9.5" long with a max TDP of 240W. It is clocked at 800MHz on 80nm process technology, has a 512-bit memory interface, and carries 1GB of GDDR4 memory onboard. It has 6-pin and 8-pin PCIe connectors, but two 2x3 PCIe power connectors can be used instead. Rumors surfaced at CeBIT that the final product may be on 65nm if it can be produced in time with reasonably good yield. So far we have yet to get confirmation that the first shipping R600 cards will be on 65nm, but we can't rule out that there are experimental R600 chips at 65nm now. If our source is right, the yield at 65nm is poor at this point in time, so expect a limited quantity of 20,000 pieces of the R600XTX by the middle of April.

Next, we got hold of some preliminary benchmark figures for the R600XTX card with its core clocked at 800MHz vs a GeForce 8800 GTX card. Using a Core 2 Extreme 2.93GHz processor on an Intel 975X board, the 3DMark06 score at 1600x1200 resolution is 97xx on the R600XTX compared to 95xx on the 8800GTX. Seems like the R600XTX is running slightly faster than the 8800GTX on the early release of drivers for R600. AMD is still working hard on the drivers and there is some more performance left to unlock. However, the DX10 benchmarking war between ATi and NVIDIA has not started yet. The targeted display driver version for launch in May is 8.361 or later (Catalyst 7.4 or 7.5).

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Tue Mar 27, 2007 1:07 pm

I've never liked the way nvidia does AF and AA.

A recent review of several 320MB 8800 GTSes on anandtech.com showed that when AA is on, the 320MB versions crap out. x1900s don't take a hit when you raise AA/AF. Then, when it comes to HDR plus AA/AF, it craps out as well (normally; not tested on anand yet, however). People always say: performance is great on the 7xxx/8xxx, but image quality is best on the x19xx series.

what the hell is a gfx card for? Images, right? I would rather have an nvidia series card for its linux support and also better pricing, but eye candy is king.

65nm is the 2nd edition of the card. It won't be noticeably different in terms of packaging or model numbers. It is going to be annoying trying to get one of those locally vs the 80nm ones.

I am glad amd is in charge now.

andyb
Patron of SPCR
Posts: 3307
Joined: Wed Dec 15, 2004 12:00 pm
Location: Essex, England

Post by andyb » Thu Apr 12, 2007 11:51 am

Loads of pics, thanks to [H]ardOCP.

This is the huge version that is supposed to be for OEMs only and is thought to be 80nm, while the shorter version is going to be the mass-manufactured part and is thought to be a 65nm bit of chippery.

For your perusal.

http://www.hardocp.com/news.html?news=M ... V3cywsLDE=


Andy

Kaleid
Posts: 254
Joined: Mon Oct 11, 2004 9:43 am
Location: Sweden

Post by Kaleid » Thu Apr 12, 2007 2:59 pm

Looks like a Thermalright, which is positive. Simply remove the fan that comes with the card and the plastic surrounding the heatsink, and attach one or two better, quieter fans to cool the beast down.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Fri Apr 13, 2007 8:57 pm

I just read... um, forgot where... the Inquirer?? that the wattage for the 65nm versions of the r600 will be significantly LESS than nvidia's offerings.

no more than 100 watts of draw is the rumour for it. seems like it will be a beast.

elec999
Posts: 273
Joined: Wed Jun 30, 2004 10:54 pm

Post by elec999 » Fri Apr 13, 2007 10:45 pm

GPUs will really be getting less power consumption and quieter. Even if they have to use multiple gpus.
Thanks

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Sun Apr 15, 2007 8:22 pm

elec999 wrote:GPUs will really be getting less power consumption and quieter. Even if they have to use multiple gpus.
Thanks
:?: :?: :?: :?:

Techno Pride
Posts: 347
Joined: Tue Jun 03, 2003 12:57 am

Post by Techno Pride » Sun Apr 15, 2007 9:32 pm

jazkat wrote:
Happy Hopping wrote:Since Nvidia is rel. 8900 and 8900GX2, this R600 is meaningless to them

pffft, good one brains, the r600 will trounce the 8900 series too, do your homework. also the r600 will have 64 x4 = 256 unified shaders, the 8900 only 128. i wouldn't be surprised if the r600 slashes over the 8950x2 either.
the r600 is 512bit, and you got to remember amd is in the game now, and the transition is why the r600 is late. also the vapochill coolers aren't ready and they are trying to get the power usage down. mmmm, i'm just thinking of the 179gb/s memory bandwidth.

also, on the intel thing, people are rattling on about how crap amd are now cause of intel's new chips, well that wasn't the case when amd released the 64 bit, was it? it looks like amd is going to trounce the new intel chips again hahahhaah with the 128 bit barcelona using csi.

when daamit get the ball rolling nvidia don't stand a chance, sorry. plus when i buy my r600 i know it will work properly with vista, as daamit have been working alongside microsoft, so there will be no blue screen of death and pixel corruption like you get with the 8800.
you won't be seeing the big card, so don't worry, i think they are for the apple mac pro and pc manufacturers.

when daamit start selling their cpu+graphics card package you'll have the ultimate gaming rig, simple as that!
please type properly.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Tue Apr 17, 2007 12:16 pm

that sort of writing sux

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Wed Apr 25, 2007 7:30 am

The new test results, HD 2900 XT vs 8800 GTS, show that ATI is on equal terms with the GTS at worst, and at best it's significantly faster. If these figures hold true, then the HD 2900 XTX should be a tough one for the 8800 GTX. We'll see. In a few weeks ATI will come forth and we'll get the first new ATI cards. I am excited ^^

Here's a link to the test results, but I can't say they are 100% correct.
http://plaza.fi/s/f/editor/images/04_24_hd2900xt.gif

Kaleid
Posts: 254
Joined: Mon Oct 11, 2004 9:43 am
Location: Sweden

Post by Kaleid » Wed Apr 25, 2007 7:52 am

If those numbers are true then ATI really has failed with its chip. The higher-end cards use 512-bit memory buses, so it should pretty much crush Nvidia's offerings.

Hopefully, if the benchmarks are true, then it's because of poor drivers.
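The arithmetic behind the bus-width argument is simple: peak bandwidth is bus width in bytes times effective transfer rate. A minimal sketch, using the rumoured R600 GDDR4 speed (the clocks below are illustrative, not confirmed specs):

```python
# Peak memory bandwidth: bus width in bytes times effective transfer rate.
# The R600 memory clock here is the rumoured figure, not a confirmed spec.
def bandwidth_gbs(bus_bits: int, effective_gts: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits / 8 * effective_gts

print(f"R600 XTX (512-bit, 2.8 GT/s GDDR4): {bandwidth_gbs(512, 2.8):.0f} GB/s")  # ~179
print(f"8800 GTX (384-bit, 1.8 GT/s GDDR3): {bandwidth_gbs(384, 1.8):.0f} GB/s")  # ~86
print(f"8800 GTS (320-bit, 1.6 GT/s GDDR3): {bandwidth_gbs(320, 1.6):.0f} GB/s")  # ~64
```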

thejamppa
Posts: 3142
Joined: Mon Feb 26, 2007 9:20 am
Location: Missing in Finnish wilderness, howling to moon with wolf brethren and walking with brother bears
Contact:

Post by thejamppa » Wed Apr 25, 2007 9:23 am

Kaleid wrote:If those numbers are true then ATI really has failed with its chip. The higher-end cards use 512-bit memory buses, so it should pretty much crush Nvidia's offerings.

Hopefully, if the benchmarks are true, then it's because of poor drivers.
I was kinda under the impression that only the XTX had a 512-bit interface and the XT had something lower... But still, the 8800 GTS has a 320-bit interface, and ATI still manages to best it pretty well.

But we'll see more results in a few weeks. And then SPCR users can do their own tests.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Post by ~El~Jefe~ » Wed Apr 25, 2007 6:13 pm

at those crappy resolutions, you cannot tell how good one of those cards is.

they need to test at at least 1920x1200 to show anything worth noting. below that, lots of middle-of-the-road cards perform well against top contenders.

similar results were shown pitting the Prescott vs the AMD64 socket 754 chips for a while before they came out... the EEs were doing 20% better... then it was shown that a 2.2GHz chip was faster than a 3.73 Extreme when it actually came out. i just remember that, and feel that is the way this is going to be.

samuelmorris
Posts: 168
Joined: Thu Mar 22, 2007 1:00 pm
Location: York, UK

Post by samuelmorris » Thu Apr 26, 2007 4:01 am

2560x1600 benches plz! LOL, that's what I'll be buying the card for. However, I'm hoping the 240W power consumption isn't going to be true, because nearly double the power (and therefore heat, and way more than double the noise) for the sake of a handful of extra frames sees me buy an nVidia card. My first since a GeForce4 MX. I've happily silenced my X1900XT after all this time with an HR-03, and you can do the same with an 8800GTX. However, the sort of heat these cards are rumoured to put out is something that only water cooling could deal with at fewer dBs than a pneumatic drill, and thanks but no thanks.

What concerns me about the supposed stock cooler is that not only is its fan rated at 2A (the same as a Delta 92mm fan, and it's smaller!) but also, it's IDENTICAL to the X1900XT fan that, get this, draws 0.42A. That R580 leafblower is the loudest fan I've ever owned, even from before I became interested in getting a quiet PC. Five times the current for the same design can only mean one thing. Place your bets, folks: 10,000rpm?

That's going to annoy people who use Delta fans, let alone us SilentPC enthusiasts!
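As a sanity check on that bet: fan affinity laws put shaft power roughly proportional to rpm cubed, so five times the power suggests about 5^(1/3) ≈ 1.7x the speed. A minimal sketch, with the baseline rpm an assumption for illustration:

```python
# Fan-affinity-law sketch: shaft power scales roughly with rpm^3,
# so the rpm ratio is the cube root of the power ratio.
# The baseline rpm is an assumption, not a measured spec.
baseline_rpm = 5000          # rough top speed of the X1900 XT cooler fan
power_ratio = 2.0 / 0.42     # 2 A rating vs 0.42 A at the same 12 V
estimated_rpm = baseline_rpm * power_ratio ** (1 / 3)
print(f"~{estimated_rpm:.0f} rpm")  # ~8400 rpm, in the ballpark of the 10k guess
```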

spookmineer
Patron of SPCR
Posts: 749
Joined: Sat Nov 11, 2006 6:02 pm

8800 GTX vs 2900 XTX benchmarks

Post by spookmineer » Thu Apr 26, 2007 12:43 pm

Benchmark results (1280x1024):

Company of Heroes
Radeon HD 2900 XTX: 97.1 FPS
GeForce 8800 GTX: 128.6 FPS

F.E.A.R.
Radeon HD 2900 XTX: 84 FPS
GeForce 8800 GTX: 125 FPS

Half Life 2: Episode 1
Radeon HD 2900 XTX: 117.9 FPS
GeForce 8800 GTX: 157.1 FPS

Elder Scrolls IV: Oblivion
Radeon HD 2900 XTX: 100.3 FPS
GeForce 8800 GTX: 110.5 FPS

Benchmark results (1920x1200):

Company of Heroes
Radeon HD 2900 XTX: 53.2 FPS
GeForce 8800 GTX: 80.0 FPS

F.E.A.R.
Radeon HD 2900 XTX: 53.7 FPS
GeForce 8800 GTX: 81.7 FPS

Half Life 2: Episode 1
Radeon HD 2900 XTX: 68.2 FPS
GeForce 8800 GTX: 100.2 FPS

Elder Scrolls IV: Oblivion
Radeon HD 2900 XTX: 75.1 FPS
GeForce 8800 GTX: 98.4 FPS

That is not the news some people were hoping for...


Source
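For a quick read on how far the XTX trails in those numbers, the percentage gaps can be computed directly from the figures quoted above:

```python
# Percentage deficit of the HD 2900 XTX vs the 8800 GTX,
# computed from the (XTX, GTX) FPS pairs quoted above.
results = {
    "CoH 1280x1024": (97.1, 128.6),
    "FEAR 1280x1024": (84.0, 125.0),
    "HL2:Ep1 1280x1024": (117.9, 157.1),
    "Oblivion 1280x1024": (100.3, 110.5),
    "CoH 1920x1200": (53.2, 80.0),
    "FEAR 1920x1200": (53.7, 81.7),
    "HL2:Ep1 1920x1200": (68.2, 100.2),
    "Oblivion 1920x1200": (75.1, 98.4),
}
for name, (xtx, gtx) in results.items():
    print(f"{name}: XTX trails by {1 - xtx / gtx:.0%}")
# Gaps range from ~9% (Oblivion at 1280x1024) to ~34% at 1920x1200.
```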

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Re: 8800 GTX vs 2900 XTX benchmarks

Post by ryboto » Thu Apr 26, 2007 4:15 pm

spookmineer wrote:
[benchmark tables snipped; see the post above]
That is not the news some people were hoping for...

Source
As many users on the TPU forums have said, these benchmarks don't square well with the HD 2900 XT benchmarks released a few days ago. In fact, the HD 2900 XT actually outperformed the XTX in several tests. So something isn't right with those scores. I suspect it's a driver issue, but who knows.

Post Reply