It is currently Sat Oct 25, 2014 5:02 am

All times are UTC - 8 hours




Post new topic Reply to topic  [ 28 posts ] 
 Post subject: NVidia 9600GSO on the way out. Who's next for great PPW?
PostPosted: Thu Dec 18, 2008 1:46 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
There are only 2 cards left on NewEgg, and the follow up on the best cards per watt has been somewhat lacking.

I may have to buy some of these off of Amazon. Or can you recommend a good alternative?

A

_________________
People who put money and political ideology ahead of truth and ethics are neither patriots nor human beings.


 Post subject: Re: NVidia 9600GSO on the way out. Who's next for great PPW?
PostPosted: Thu Dec 18, 2008 2:59 pm 

Joined: Tue Dec 13, 2005 1:08 pm
Posts: 1407
Location: Michigan
aristide1 wrote:
I may have to buy some of these off of Amazon. Or can you recommend a good alternative?

The 55nm 9600GT?

The 9600GSO should disappear. It's built from faulty 8800GT/9800GT or 9600GT chips. If there aren't many faulty chips, the full 55nm 9600GT should be just as cheap or cheaper to make, so those should continue.

And the 40nm Radeon HD 4670 is coming.


 Post subject:
PostPosted: Thu Dec 18, 2008 8:24 pm 
Friend of SPCR

Joined: Wed Jan 16, 2008 8:56 pm
Posts: 356
Location: Council Bluffs, Iowa
Yep. 9600GT is better performance per watt than a 9600GSO, although it's running at a higher wattage. You could easily underclock it to GSO levels if you were really concerned, but at that point you might also want to look at an HD4670. It's slower, but it's pretty much the king of all performance-per-watt comparisons.


 Post subject:
PostPosted: Thu Dec 18, 2008 8:27 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
Are all 9600GTs 55nm or is that a development that came after its release?

 Post subject:
PostPosted: Thu Dec 18, 2008 11:51 pm 
Friend of SPCR

Joined: Wed Jan 16, 2008 8:56 pm
Posts: 356
Location: Council Bluffs, Iowa
That's a development that came after the card's release. This means great fun trying to find the small-process card; the only difference is the detailed designation on the GPU, which won't be available to you until you've already paid for it. You want a G94-300-B1 card, as opposed to a G94-300-A1 card. Unfortunately, you'd have the same problem with a GSO.

I've got the 65nm version, and it's been cool-running and trouble-free with a passive AC S2. If you're so concerned about power usage that you don't want a 65nm 9600, I suggest you buy the slightly slower HD 4670. It uses roughly 20 watts less than a 9600GT at any load level (idling at 3 watts).


 Post subject:
PostPosted: Sun Dec 28, 2008 9:15 am 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
NVidia is supposed to renumber all their GPUs/cards again. When that happens the 9600GT will probably be all 55nm. Until then I won't even consider another GPU.

ATI burned me in the past. By the time they released stable drivers for my last ATI card the thing was an antique. Then they dropped it. Drop this.

 Post subject:
PostPosted: Sat Jan 10, 2009 6:54 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
Most 9600GSO cards have 48 stream processors, except for a few later models. The 8800GS had 96 stream processors.

No matter, the 9600GSOs are almost all gone. NewEgg has only one 96-processor 9600GSO model left, and it's open box.

The 9600GT also has 48 stream processors. You need around 100 for best PPW.

Worse yet, the cards that have moved from 65nm to 55nm use about the same power.

Idiots.

 Post subject:
PostPosted: Sat Jan 10, 2009 7:55 pm 

Joined: Tue Dec 13, 2005 1:08 pm
Posts: 1407
Location: Michigan
aristide1 wrote:
The 9600GT also has 48 stream processors. You need around 100 for best PPW.

Worse yet, the cards that have moved from 65nm to 55nm use about the same power.

The 9600GT is always 64 shaders. The new GSO is 48--it's stock-clocked a lot higher than the old GSO, and it has the wider bus. But yeah, I just got an OC model 9600GT that is considerably slower than my 96-shader GSO. The 55nm cards should use less power.


 Post subject:
PostPosted: Sat Jan 10, 2009 8:25 pm 
Friend of SPCR

Joined: Sun Mar 21, 2004 5:47 pm
Posts: 867
Location: Phoenix, AZ
aristide1 wrote:
Worse yet, the cards that have moved from 65nm to 55nm use about the same power.


Actually the worst part is that you can't tell if you're buying a 55nm part until after you have it and remove the heatsink to check the model number on the chip. That's the only way to be sure of what it is. The other clue can be in the Device ID reported to the OS, for example in the 9800GT cards, device 614 is the 65nm part and 605 is the 55nm part. Again, no way to know which part the card is until you own it and install it.
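For what it's worth, on a Linux box the same Device ID can be read from sysfs without pulling the heatsink. A minimal sketch, assuming Linux and only the 0x0614/0x0605 IDs mentioned above (on Windows, GPU-Z remains the way to check):

```python
# Sketch: read PCI device IDs from Linux sysfs to spot a 65nm vs 55nm
# 9800GT (0x0614 = 65nm, 0x0605 = 55nm, per the post above).
# Linux-only; the ID table covers just the two parts discussed here.
from pathlib import Path

NVIDIA_VENDOR = "0x10de"
KNOWN_PARTS = {"0x0614": "9800GT (65nm)", "0x0605": "9800GT (55nm)"}

def identify(device_id: str) -> str:
    """Map a PCI device ID string to a known part name."""
    return KNOWN_PARTS.get(device_id.lower(), "unknown part")

def scan_nvidia_cards(root: str = "/sys/bus/pci/devices"):
    """Yield (device_id, description) for every NVIDIA PCI device found."""
    for dev in Path(root).glob("*"):
        vendor_file = dev / "vendor"
        if vendor_file.exists() and vendor_file.read_text().strip() == NVIDIA_VENDOR:
            did = (dev / "device").read_text().strip()
            yield did, identify(did)

if __name__ == "__main__":
    for did, desc in scan_nvidia_cards():
        print(did, desc)
```

Still doesn't help before you've paid for the card, of course - the ID is only readable once it's installed.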

_________________
Phenom 1090T / 9800GTX+ / Antec P180 / Seasonic S12-600


 Post subject:
PostPosted: Sat Jan 10, 2009 8:36 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
You're both correct. I have seen 2 articles on 55nm chips. In both cases the 55nm part used less power, but the difference was less than 10 watts.

Right now Legoman666 has no PPD figures for any 9600GSO. I'd like to see the power consumption for a 9800GT but I haven't found any yet. I checked XbitLabs; I wish they kept a spreadsheet of these stats. It's not cheap folding.

I checked Amazon and eBay; there's a handful of XFX 9600s left. They appear to be 48-processor models.

Nobody is selling a new or used 8800GS anywhere.

_________________
People who put money and political ideology ahead of truth and ethics are neither patriots nor human beings.


Top
 Profile  
 
 Post subject:
PostPosted: Sat Jan 10, 2009 9:31 pm 

Joined: Tue Dec 13, 2005 1:08 pm
Posts: 1407
Location: Michigan
aristide1 wrote:
Nobody is selling a new or used 8800GS anywhere.

You could always make me an offer.


 Post subject:
PostPosted: Sun Jan 11, 2009 7:58 am 
Friend of SPCR

Joined: Sun Mar 21, 2004 5:47 pm
Posts: 867
Location: Phoenix, AZ
aristide1 wrote:
I'd like to see the power consumption for a 9800GT but I haven't found any yet. I checked XbitLabs, I wish they kept a spreadsheet of these stats. It's not cheap folding.


I purchased a 9800GT just a week ago, it showed up and I did testing on it but it squealed so bad that I returned it. I have a kill-a-watt and can tell you the power draw at the outlet:

Idle w/ 7800GT: 98w
Idle w/ 9800GT: 115w
FAH-GPU: 199w

That was with the latest 181.20 drivers from nvidia, so there was no CPU load while the GPU client was running. Since the 7800GT is supposed to be one of the lowest-idle-power cards from years gone by, with "only" something like 16-18w idle draw, that puts the true idle of my system around 80w A/C, which puts the 9800GT's A/C draw under load around 120w. Since it has a TDP of 105 watts DC and FAH probably doesn't quite max it out, 120w A/C sounds about right to me. That was at the card's factory settings of 650/1620/1800 for core, shaders, memory. Over/underclocking the card only made about a 10w difference in one direction or the other.
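The wall-outlet arithmetic above can be sketched as a small helper; the 80% PSU efficiency is an assumed figure for illustration, not something measured with the Kill-A-Watt:

```python
# Sketch of the wall-outlet arithmetic above: estimate a card's DC draw
# from two Kill-A-Watt readings. The 80% PSU efficiency figure is an
# assumption for illustration, not something measured in the post.
def card_dc_watts(wall_load: float, wall_baseline: float,
                  psu_efficiency: float = 0.80) -> float:
    """DC watts attributed to the card: A/C delta scaled by PSU efficiency."""
    return (wall_load - wall_baseline) * psu_efficiency

# 199w at the wall while folding, ~80w true system idle without the card:
print(card_dc_watts(199, 80))  # → 95.2, plausibly under the 105w TDP
```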

As for PPD, here's how it broke down relative to a bone stock 9800GT as the top line of the chart:

[chart: PPD relative to a bone-stock 9800GT]

 Post subject:
PostPosted: Sun Jan 11, 2009 8:10 am 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
Oh AZBrandon that's harsh. No thanks. I found one site that showed the 9800GTX actually used a watt or 2 less.

Hey NVidia, what happened to GREEN?

 Post subject:
PostPosted: Sun Jan 11, 2009 12:25 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
OW! OW! OW!

My AMD Dual core has 2 GPU slots. I had one GPU running under XP, hence 100% of 1 GPU and 50% CPU to go with it. Total power 150 watts.

Now I've updated the drivers, and almost 0% CPU for the GPU folding, which is nice. I added the 2nd GPU card AND I added SMP folding as well, so now 2 GPU folders and 100% of dual core running, total power - 290 watts!

I hate buying electricity. :x

 Post subject:
PostPosted: Sun Jan 11, 2009 6:17 pm 
Friend of SPCR

Joined: Sun Mar 21, 2004 5:47 pm
Posts: 867
Location: Phoenix, AZ
aristide1 wrote:
OW! OW! OW!

My AMD Dual core has 2 GPU slots. I had one GPU running under XP, hence 100% of 1 GPU and 50% CPU to go with it. Total power 150 watts.

Now I've updated the drivers, and almost 0% CPU for the GPU folding, which is nice. I added the 2nd GPU card AND I added SMP folding as well, so now 2 GPU folders and 100% of dual core running, total power - 290 watts!

I hate buying electricity. :x


That's still better than what I was looking at: about 250 watts for a single 9800GT plus the SMP client on my dual-core opteron. Yes, you're at 40 watts more, but with a whole second GPU folding for you. I'm still up in the air about what to do next. It sounds like ATI uses better-quality electronics on their cards for voltage control, which is supposed to eliminate any chance of squealing, but for FAH they are so much slower than nvidia. What to do, what to do? I wish the 9600GT was $50 or so, then I'd just get one of those for folding. Should be worth maybe 2000ppd or so at half the cost of a 4830, which likely does about 2000ppd also.

 Post subject:
PostPosted: Mon Jan 12, 2009 7:27 am 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
So far things don't look so good. 9600GT cards are too expensive for what they are, though they are very efficient. 9800GSO - you need to buy the right card; it must have 96 stream processors, and some older models have fewer. Keep your ATI for now.

On top of all that, the 55nm versions of the NVidia chips that are out there show a very small drop in watts used over the functionally identical 65nm version.

I've never had any cards squeal, but I have heard that some people will put a blob of putty on, or glue a heatsink to, any MOSFETs that care to sing.

ATI cards may use better parts, but everyone else is going to use whatever parts suit them, also known as the lowest bidder.

I'm going to lower CPU voltage tonight to see how much that helps.

 Post subject:
PostPosted: Mon Jan 12, 2009 1:58 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
AZBrandon wrote:
That's still better than what I was looking at: about 250 watts for a single 9800GT plus the SMP client on my dual-core opteron.

I'd say that's correct: an additional 50 watts or so each for the processor and the GPU at full tilt. This is why there's so much talk about points per watt as opposed to points per day.

Do you use this PC for anything else? A 45watt BE X2 dual core can be had for $40.

 Post subject:
PostPosted: Tue Jan 13, 2009 6:16 am 
Friend of SPCR

Joined: Sun Mar 21, 2004 5:47 pm
Posts: 867
Location: Phoenix, AZ
aristide1 wrote:
Do you use this PC for anything else? A 45watt BE X2 dual core can be had for $40.


I have a Socket 939 board, so I'm already at the end of the line with my 2.6ghz dual core. I'll just keep an eye on new product developments and either snag another video card when the price is right or power consumption drops down with the switch to 40nm, or whatever. Thanks for the tip though!

 Post subject:
PostPosted: Tue Jan 13, 2009 1:46 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
http://vr-zone.com/articles/nvidia-40nm ... l?doc=6359

It's about time!

AZB! http://www.behardware.com/articles/739- ... q6600.html

 Post subject:
PostPosted: Tue Jan 13, 2009 5:39 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
The P945 chipset has also been shrunk from the prior 90nm process to a 65nm process. Can't say about the others except AMD chipsets seem generally more efficient than NVidia chipsets.

 Post subject:
PostPosted: Thu Jan 15, 2009 5:04 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
Hey AZB, I just lowered the cpu voltage from 1.34 to 1.29 and I reduced power consumption by 15 watts.

 Post subject:
PostPosted: Sun Jan 18, 2009 6:02 am 
Friend of SPCR

Joined: Sun Mar 21, 2004 5:47 pm
Posts: 867
Location: Phoenix, AZ
Just FYI, I got a new 9800GT on Friday. This time I got a Gigabyte after I saw they have a utility to modify the card's voltage. You can undervolt it to 1.05v. Much to my surprise, it ran WAY lower power than my eVGA card. Comparing the Device IDs in GPU-Z, the eVGA was a 0614 card, which is supposed to be the 65nm part, and the Gigabyte is 0605, so it's a 55nm chip, or at least seems to be. The figures are as such:

Whole-system power draw A/C:
Device 0614 idle: 115 watts
Device 0605 idle: 115 watts
Device 0614 FAH-GPU: 199 watts
Device 0605 FAH-GPU: 171 watts
Device 0605 1.05v FAH-GPU: 165 watts

So basically a 28 to 34 watt savings, depending on whether you run at the stock 1.1v or undervolt it to 1.05v. I strongly suspect many reviews of supposed 55nm parts may actually have tested 65nm parts. As I mentioned previously, I've even seen a review of the 9800GTX+, which is ONLY supposed to use a 55nm part, where the reviewer pulled the heatsink and it turned out to be a 65nm chip.
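As a rough sanity check on those undervolt numbers: dynamic power scales roughly with voltage squared. A sketch, with the simplifying assumption that the 115w idle reading is the non-GPU-load baseline:

```python
# Rough sanity check on the 1.1v -> 1.05v undervolt: dynamic power scales
# roughly with voltage squared. Treating the 115w idle reading as the
# baseline is a simplification for illustration.
def predicted_wall_watts(idle_w: float, load_w: float,
                         v_old: float, v_new: float) -> float:
    """Scale the above-idle load power by (v_new/v_old)^2."""
    return idle_w + (load_w - idle_w) * (v_new / v_old) ** 2

print(round(predicted_wall_watts(115, 171, 1.10, 1.05)))  # → 166, close to the measured 165
```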

As for my own card, my new one isn't as good of an overclocker as the old card, topping out at 1836mhz on the shaders and 2030 or so on memory, although I run 1782 and 1910 for folding for reduced heat and improved reliability. Running the stock fan at 5v it holds around 60C in FAH-GPU at 1.05v. I've got an ASUS card on order too and have hopes it too will be a 55nm part, but of course there's no guarantees. We'll see when it arrives!

 Post subject:
PostPosted: Sun Jan 18, 2009 7:34 am 

Joined: Wed Jul 20, 2005 10:56 am
Posts: 323
Location: USA
AZBrandon wrote:
Just FYI, I got a new 9800GT on Friday. .... Much to my surprise, it ran WAY lower power than my eVGA card. .... So basically a 28 to 34 watt savings depending on if you run at the stock 1.1v or undervolt it to 1.05v.
You have a PPW rundown for that?

_________________
ATCS840/ Seasonic X-750/ 2500k 4.4 GHz (Koolance CPU-370)/ Gigabyte z68xp-ud4/ eVGA 570 865,1730,1950 (Koolance VID-580)/ 8Gb G.Skill DDR3 1866/ Corsair Force 3 120Gb, Samsung 470 128 Gb, Scorpio Black 750 2.5" (Quiet Drive)/ EK DCP-2.2 (res., gel-stuff)/ XSPC RX360, Swiftech MCR220QP/ 3 Scythe GT ~300-550RPM, 3 NB Multi-frame ~400-600RPM, 3 230mm 300RPM-> Lamptron FC-4/ Akasa AcoustiPack


 Post subject:
PostPosted: Sun Jan 18, 2009 7:53 pm 
Friend of SPCR

Joined: Sun Mar 21, 2004 5:47 pm
Posts: 867
Location: Phoenix, AZ
warriorpoet wrote:
You have a PPW rundown for that?


Same as any other 8800/9800 GT card. With shaders at 1782, it does roughly 4600ppd on the older 384 point WU's and about 3500 ppd on the newer 511 point WU's. Seems that the larger the WU's get, the smaller nVidia's advantage becomes. Sadly enough, I feel like I missed the golden age of GPU2 folding, having only just gotten my first CUDA-capable video card so recently.
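For the PPW bookkeeping, a minimal sketch; dividing by the whole-system wall watts from earlier in the thread (171w for the undervolted card) is an assumed convention, and card-only watts would give different numbers:

```python
# Points-per-watt bookkeeping for the figures above. Using whole-system
# wall watts (171w for the undervolted 55nm 9800GT, from earlier in the
# thread) is one convention; card-only watts would be another.
def points_per_watt(ppd: float, watts: float) -> float:
    return ppd / watts

print(round(points_per_watt(4600, 171), 1))  # older 384-point WUs → 26.9 ppd/w
print(round(points_per_watt(3500, 171), 1))  # newer 511-point WUs → 20.5 ppd/w
```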

 Post subject:
PostPosted: Tue Jan 20, 2009 6:54 am 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
AZBrandon wrote:
.... The figures are as such:

Whole-system power draw A/C:
Device 0614 idle: 115 watts
Device 0605 idle: 115 watts
Device 0614 FAH-GPU: 199 watts
Device 0605 FAH-GPU: 171 watts
Device 0605 1.05v FAH-GPU: 165 watts

So basically a 28 to 34 watt savings depending on if you run at the stock 1.1v or undervolt it to 1.05v....


Yes much better, but still rather high for a multi-GPU (hopefully 4 card) full time folder. Even with 2 cards I'd rather wait a while.

AZBrandon wrote:
Sadly enough, I feel like I missed the golden age of GPU2 folding, ....


To me the Golden Age was the 8800GT 256MB which for some reason was a real power miser.

ALSO - There may be a 35 watt X2 Dual Core 3800 out there (2GHz) for socket 939. You may find one used and they are still THE most economical dual core AMD has ever made.

I've started getting an occasional error that claims my card is unstable and that it plans on waiting 24 hours before it resumes. The thing is, my cards have been stable for months, and they are not running that hot - in the 65C range with shaders at 1800. This is the one thing I hate about folding; I always need to keep my eye on it.

 Post subject:
PostPosted: Tue Jan 20, 2009 5:45 pm 
Friend of SPCR

Joined: Sun Mar 21, 2004 5:47 pm
Posts: 867
Location: Phoenix, AZ
aristide1 wrote:
I've started getting an occasional error that claims my card is unstable and that it plans on waiting 24 hours before it resumes. The thing is, my cards have been stable for months, and they are not running that hot - in the 65C range with shaders at 1800. This is the one thing I hate about folding; I always need to keep my eye on it.


Yeah, everyone's getting that now, they had to start a sticky thread about it even:

http://foldingforum.org/viewtopic.php?f=52&t=7965

 Post subject:
PostPosted: Tue Jan 20, 2009 7:48 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
ADD3800IAA5CU - This is the 35 watt dual core, but I haven't found it in socket 939 yet. Some of the early fast dual cores for 939 are now very expensive.

 Post subject:
PostPosted: Fri Jan 30, 2009 8:20 pm 
*Lifetime Patron*

Joined: Fri Apr 04, 2003 6:21 pm
Posts: 4247
Location: Undisclosed but sober in US
Well I have been trying to do some research on where NVidia is headed and when. The news is good or bad, depending on what you read and when.

NVidia seems, or seemed, to be gearing up for a second-quarter supply of 40nm processors. Without actual SP counts, some articles said that top performers would be using 100 watts (as opposed to more than twice that with the 65nm process), while mainstream enthusiasts could see a 9600GT equivalent using 47 watts. The 55nm green 9600GT has already lost its extra power connector; it gets everything right off the motherboard connection (spec is 75 watts max).

The GT214 was supposed to come out as early as late March. But other sources are saying that besides economic slowdowns and large inventories, the 40nm GPUs will be delayed because the manufacturing process is more or less one big clusterfart, having the same problems as the 55nm is having now. TheInquirer is banging the drum the loudest, and it sounds like they would be more than happy to hand NVidia the next Darwin Award.

Delays in 40nm GPUs will not only hurt FAH, but will also impact notebooks, which would be the logical place to lower power consumption.

Meanwhile TigerDirect (not my favorite seller) has the XFX 9600GSO card with 768MB for $80. This could mean some more points for very few extra watts. In the thread below you will see a graph showing the 8800GTS 320MB versus 640MB, with a decent point gain for just 3 extra watts.
http://foldingforum.org/viewtopic.php?f ... 09&start=0

I Googled NVidia 40nm and got more than I bargained for.

Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group