SPCR Folds Team Blog

A forum just for SPCR's folding team... by request.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Tue Jan 27, 2009 6:52 am

aristide1 wrote:You may want to consider just testing 3.2GHz, because it will leave the motherboard running at stock/normal frequencies.

Biostar T boards do not undervolt; I need to find one that does.
So, the StressCPUV2 stability test I did doesn't imply the OC is good?

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Tue Jan 27, 2009 7:02 am

I'm not saying that; all I'm suggesting is that 3.2 may actually be more stable, even though that may seem counterintuitive.

I don't think any one test can really cover 100% of what's going on. I've encountered different temps and even different power usage on different FAH WUs.

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Tue Jan 27, 2009 7:09 am

aristide1 wrote:I'm not saying that; all I'm suggesting is that 3.2 may actually be more stable, even though that may seem counterintuitive.

I don't think any one test can really cover 100% of what's going on. I've encountered different temps and even different power usage on different FAH WUs.
In my power testing, StressCPUV2 causes the system to consume more power than F@H or Prime95 smallFFTs. That still doesn't mean it stresses all the same components, but it does use the same Gromacs core the SMP client uses. If I find the time, the first order of business is to solder a new molex power lead for my DA-2 brick, then I need to get my power meter back on my desktop, and then I could try 3.2GHz.

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Sat Jan 31, 2009 7:15 am

The guy who runs the website

http://folding.extremeoverclocking.com/

is hinting that it's time to change the color groups again, since too many people are bright red. Don't let this bother you; rather, look at it as a challenge.

We continue to have too many people stop folding. :?

ryboto
Friend of SPCR
Posts: 1439
Joined: Tue Dec 14, 2004 4:06 pm
Location: New Hampshire, US
Contact:

Post by ryboto » Sun Feb 01, 2009 8:22 am

So I visited the folding forums, and they said it might be a bad WU, after all the stability testing I did. I tried the client 3-4 more times and it kept grabbing the same exact WU; finally it got a different clone/gen, and now it's churning away like normal. I'd never seen a bad WU before...

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Sun Feb 01, 2009 8:28 am

ryboto, they usually show up in groups; they come and go. SMP has a batch right now. The GPU batch seems to have subsided.

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Sun Feb 01, 2009 5:48 pm

Anybody know if 73C is too hot for an 8800GS (aka 9600GSO)?

NeilBlanchard
Moderator
Posts: 7681
Joined: Mon Dec 09, 2002 7:11 pm
Location: Maynard, MA, Eaarth
Contact:

Post by NeilBlanchard » Sun Feb 01, 2009 7:19 pm

Hi,

It doesn't seem like it's too hot. I thought that GPU could get up to 90C with no worries?

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Mon Feb 02, 2009 11:20 am

Neil, I have 2 identical cards in 1 box, both with the shaders cranked up. One card started getting a lot of errors, while the other one is error free. I was surprised to find both cards running around 75C. So I maxed out the fans, which didn't help much. I then took the problem child back to default values. It cooled down and has worked correctly since.

I'm going to put that big arse Arctic Cooler on it tonight and try to return the shaders to where they were before. In the summer my room is going to get a heck of a lot warmer than it was yesterday.

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Sat Feb 07, 2009 7:16 pm


NeilBlanchard
Moderator
Posts: 7681
Joined: Mon Dec 09, 2002 7:11 pm
Location: Maynard, MA, Eaarth
Contact:

Post by NeilBlanchard » Tue Feb 10, 2009 11:14 am

Hiya,

I just got the Deino SMP client to run in WinXP here at work!! Woohoo! So it'll get ~1300 PPD, instead of the lower output the single client yielded. I guess I got lucky -- I ran the Install.bat, set up the stuff it asked for, and then ran the SMP client. We have been having server issues, which necessitated starting over with my user settings; this may have been the difference?

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Tue Feb 10, 2009 12:14 pm

Deino is a PITA to install. I don't even try anymore.

I wish GPU folding worked on W2K.

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Tue Feb 17, 2009 10:52 am

Who said no news is good news? It certainly wasn't a folder. I Google "NVidia 40nm" on a regular basis; some say new parts arrive in March, others say everything is at least 1Q late. The foundry NVidia buys its silicon wafers from has seen its business drop way off. On top of all that, NVidia still has 65nm GPUs. I guess that's why Asus and EVGA now have GTX260s that are clearly marked 55nm. Who wants to play the guessing game with a $200 video card?

To add to the frustration, NVidia wants to bring the lowest of the low GPUs to 40nm first, apparently for the notebook market. It seems notebook sales are nowhere near as far off as desktop PC sales. Still, a single dual-GPU GTX295 sells for about $500; you need to sell a lot of cheap GPUs to make up for one of those.

By the way we can blame the dirt bag bankers for this slowdown. What a slimy bunch.

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Tue Feb 17, 2009 8:10 pm

E6400 + 8800GS: 120 watts @ idle
E6400 + 8800GS: 140 watts, SMP only
E6400 + 8800GS: 180 watts, SMP + GPU, OC to 2.8GHz/1800 shaders

X2 2.8GHz with integrated video: 110 watts
X2 + 2 8800GS: 150 watts at idle
X2 + 2 8800GS: 190 watts, 1 GPU FAH, no SMP
X2 + 2 8800GS: 235 watts, 2 GPU FAH, no SMP
X2 + 2 8800GS: 295 watts, 2 GPU FAH + SMP, slight OC to 3GHz/1800 shaders

Wow, that's just way too many watts for what little an X2 can put out. Time for some adjustments again.
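If it helps anyone follow the math, here's a quick back-of-the-envelope script using only the readings above; the deltas are just subtraction between the quoted wattages, nothing measured beyond what's listed.

Code: Select all

# Quick arithmetic on the X2 readings quoted above (all values in watts).
idle_igp      = 110   # X2, integrated video only
idle_two_gpus = 150   # X2 with two 8800GS cards installed, idle
one_gpu_fah   = 190   # one GPU folding, no SMP
two_gpu_fah   = 235   # two GPUs folding, no SMP
two_gpu_smp   = 295   # two GPUs folding + SMP, slight OC to 3GHz/1800 shaders

print("Two idle 8800GS cards vs IGP:", idle_two_gpus - idle_igp, "W")   # 40 W
print("First GPU client:            ", one_gpu_fah - idle_two_gpus, "W") # 40 W
print("Second GPU client:           ", two_gpu_fah - one_gpu_fah, "W")   # 45 W
print("SMP client + slight OC:      ", two_gpu_smp - two_gpu_fah, "W")   # 60 W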

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Tue Mar 17, 2009 7:58 am

If this isn't the most hysterical thing I've heard about folding in a while.

Apparently the latest integrated video, like the NVidia 9100 chips, can run GPU folding. With 16 shaders (whoohoo!) it takes an entire day to complete a single 435-point WU.

What's really amazing is the PPW, with integrated video using about the same wattage as a night light.
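To put a very rough number on the PPW: the watt figure below is just my guess at "night light" territory, not a measurement.

Code: Select all

# Back-of-the-envelope PPD and points-per-watt for the 9100 IGP case above.
# The 10 W figure is an assumption (night-light ballpark), not measured.
wu_points   = 435    # points for the WU mentioned above
days_per_wu = 1.0    # "an entire day to complete a single WU"
igp_watts   = 10.0   # assumed incremental draw of the integrated GPU

ppd = wu_points / days_per_wu
print(round(ppd), "PPD ->", round(ppd / igp_watts, 1), "points per watt-day (at an assumed 10 W)")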

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Tue Mar 17, 2009 10:51 am

Uh-oh, DasMan must have bought some new equipment, he's starting to pull away now. :shock:

floffe
Posts: 497
Joined: Mon May 08, 2006 4:36 am
Location: Linköping, Sweden

Post by floffe » Tue Mar 17, 2009 2:25 pm

aristide1 wrote:Apparently the latest integrated video, like the NVidia 9100 chips, can run GPU folding. With 16 shaders (whoohoo!) it takes an entire day to complete a single 435-point WU.

What's really amazing is the PPW, with integrated video using about the same wattage as a night light.
That's roughly 3 times the PPD of my AthlonXP 2500+, nothing to sneeze at :P

Wibla
Friend of SPCR
Posts: 779
Joined: Sun Jun 03, 2007 12:03 am
Location: Norway

Post by Wibla » Wed Apr 15, 2009 6:07 am

I accidentally started GPU folding on a GTS 250... :D 4000-5000 PPD ftw.

NeilBlanchard
Moderator
Posts: 7681
Joined: Mon Dec 09, 2002 7:11 pm
Location: Maynard, MA, Eaarth
Contact:

Post by NeilBlanchard » Wed Apr 15, 2009 8:13 am

Hi,

For some reason, both my brother's iMac and mine stopped working on about the 8th of April! Something may be going on with their servers? I restarted mine (twice) and now it seems to be working again...

And my machine here at work has gotten to 87% done on about 6 units in a row -- and then it ends! :evil:

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Wed Apr 15, 2009 9:06 am

Join the crowd. Now I can't install any drivers, CUDA or otherwise: access denied, even as Admin.

Win D'Oh!s can be such a pain.
NeilBlanchard wrote:...And my machine here at work has gotten to 87% done on about 6 units in a row -- and then it ends! :evil:
WOW :!: Neil, did you post this problem over at the folding forum, or does someone else have the exact same problem :?:

NeilBlanchard
Moderator
Posts: 7681
Joined: Mon Dec 09, 2002 7:11 pm
Location: Maynard, MA, Eaarth
Contact:

comparison to SETI@Home

Post by NeilBlanchard » Mon May 04, 2009 8:31 am

Hello,

Before I Folded, I ran SETI@Home, and I recently restarted SETI on an old (XP2100+) machine. They now use the BOINC manager, and it can use the GPU and the CPU together. This makes a lot of sense to me -- I wonder if F@H will start to do something like this?

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Mon May 04, 2009 2:14 pm

If you have an NVidia card and newer drivers, which free up the CPU, you will have no problems running both. I've been doing it for about a year on my Vista system. XP was problematic several drivers ago, but it's OK now.

ATI cards, for some reason, continue to use lots of CPU.

The systray version has some minor priority tweaks, but I use Bill2's Process Manager. SMP is always the lowest priority; the GPU client has to be slightly higher. I still have some slight 2D delay and stuttering, but not as bad as before.
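For anyone who'd rather script the priorities than click around in a process manager, here's a rough Windows-only sketch using psutil; the process names are placeholders, so check what your clients actually run as.

Code: Select all

# Rough Windows-only sketch of the priority scheme described above:
# SMP client at idle priority, GPU client one notch higher.
# The process names below are placeholders, not the real executable names.
import psutil

PRIORITIES = {
    "fah_smp.exe": psutil.IDLE_PRIORITY_CLASS,                    # SMP: always lowest
    "folding@home-gpu.exe": psutil.BELOW_NORMAL_PRIORITY_CLASS,   # GPU: slightly higher
}

for proc in psutil.process_iter(["name"]):
    target = PRIORITIES.get((proc.info["name"] or "").lower())
    if target is None:
        continue
    try:
        proc.nice(target)   # on Windows, nice() accepts a priority class
        print("set priority for", proc.info["name"], "pid", proc.pid)
    except psutil.AccessDenied:
        print("access denied for", proc.info["name"], "pid", proc.pid)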

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Tue May 26, 2009 6:54 pm

Euimin is our #1 folder at the moment and he doesn't even have 1 million points yet.

Clearly not everyone needs a mentor. :shock: 8)

It's been quite a friendly competition as of late.

Anybody notice the team graph? In February we touched 1 million points per week; lately it's 200,000 per day. We even have teams that we may pass, which hasn't happened in quite a while. Of course, many others are riding the GPU trend, so plenty will still pass us. Seems the gamers are easing up a little.

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Sat May 30, 2009 4:41 pm

Well, my one GPU finished a WU no problem, then the next 6 WUs immediately went EUE, so the thing shut down for 24 hours. Swell.

I'm taking the PC off-line to clean the dust out of it, since one GPU won't be folding anyway. It currently holds 2 GPUs.

My dumb-ass MSI K9A2 board probably needs a BIOS update. I'm allowed to change memory timings, but none of them stick. Even MemSet doesn't work right. Idiots.

KansaKilla
Friend of SPCR
Posts: 381
Joined: Fri Aug 01, 2003 12:13 pm
Location: Rochester, MN

Post by KansaKilla » Sat May 30, 2009 10:48 pm

i've been getting a lot of those eue's lately. don't know if it's a bad batch or what.

posted over at the folding forums, but no real love there. lots of assholes posting, though. too much vitriol for my taste.

i have some suspicions about the gpu client. first is that it doesn't handle switching between users very well at all. eue's and all that. second is that it will kill a wireless connection after a certain period of time, say 5 min or so. both of these are based on personal experience.

it really would be nice to have a high-point value client that was set and forget.

NeilBlanchard
Moderator
Posts: 7681
Joined: Mon Dec 09, 2002 7:11 pm
Location: Maynard, MA, Eaarth
Contact:

Post by NeilBlanchard » Sun May 31, 2009 4:03 am

What's an "EUE"?

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Sun May 31, 2009 8:16 am

EUEs are a type of error encountered by GPUs. They indicate instability, though, as with SMP, sometimes a batch of WUs comes out and they all fail. If there are too many in a short period of time, the GPU client goes off-line for 24 hours; that's coded into the FAH software.
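If it helps to picture it, the behavior amounts to something like the little loop below; this is just a sketch of the logic as described, not the actual client code, and the exact EUE threshold is my guess.

Code: Select all

# Toy sketch of the EUE backoff described above -- NOT the real FAH client code.
# Too many failed WUs in a row and the GPU client pauses itself for 24 hours.
import time

MAX_CONSECUTIVE_EUES = 5        # assumed threshold; the real number may differ
PAUSE_SECONDS = 24 * 60 * 60    # the 24-hour shutdown

def run_gpu_client(fetch_and_fold):
    """fetch_and_fold() processes one WU and returns False on an EUE."""
    eues = 0
    while True:
        if fetch_and_fold():
            eues = 0                      # a good WU resets the count
        else:
            eues += 1
            if eues >= MAX_CONSECUTIVE_EUES:
                print("Too many EUEs, pausing for 24 hours")
                time.sleep(PAUSE_SECONDS)
                eues = 0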

KansaKilla
Friend of SPCR
Posts: 381
Joined: Fri Aug 01, 2003 12:13 pm
Location: Rochester, MN

Post by KansaKilla » Fri Jun 05, 2009 6:30 am

anyone else having problems with continued work on the gpu client after switching to another user? if my wife uses my computer then i have to switch back into my logon (even though i haven't logged off) for the client to complete or else it will finish a wu and just sit there.

cordis
Posts: 1082
Joined: Thu Jan 15, 2009 10:56 pm
Location: San Jose

good news!

Post by cordis » Fri Jun 05, 2009 3:50 pm

aristide1 wrote:If this isn't the most hysterical thing I've heard about folding in a while.

Apparently the latest integrated video, like the NVidia 9100 chips, can run GPU folding. With 16 shaders (whoohoo!) it takes an entire day to complete a single 435-point WU.

What's really amazing is the PPW, with integrated video using about the same wattage as a night light.
Hey, that's actually pretty good news. Once I upgrade my htpc with my old quad core, I'm going to have a spare Core Duo, and I was thinking of using that Asus P5N7A-VM board http://www.silentpcreview.com/article892-page1.html to replace my via c7n; hopefully I'll be able to keep running it off the picoPSU. I've been poking around various forums looking for confirmation that the onboard 9300 would fold, so thanks for the info! With the CPU folding along with the GPU, it might get up to ~2000 PPD. For a picoPSU that would be pretty cool!

aristide1
*Lifetime Patron*
Posts: 4284
Joined: Fri Apr 04, 2003 6:21 pm
Location: Undisclosed but sober in US

Post by aristide1 » Fri Jun 05, 2009 7:31 pm

I think it's great, and I wish more people did that instead of stopping contributing altogether. Our membership is dropping like flies. :?

Well, I finally got around to installing FAHMon on PC #2. This is interesting.

PC 1 has a single GTX 260: 216 shader processors, 896MB memory, 55nm.

PC 2 has two 9600 GSOs: 192 shader processors total, 768MB memory, 65nm.

PC1 GPU versus PC2 GPUs:
Both average 6K to 7K PPD, depending on the WU.
Both use about 120 watts; the GTX 260 may use 10 watts less.

Despite PC1's 24-processor and extra-memory advantage, the output is typically about equal. I save a PCI slot, that's it.

The downside is PC3, with Vista 64 and 2 more 9600GSOs. Based on the PPD I see on the first 2 PCs, the third PC is not doing as well. In fact, I noticed little change after installing the second 9600GSO. Time to put FAHMon on that PC as well.

Oh, and project 6911 (1888 points) will take 7.5 hours to complete on a GPU. Frankly, that's awful.
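For reference, the PPD numbers above are just points scaled by the time per WU; a one-liner makes the project 6911 case concrete.

Code: Select all

# PPD for a single WU: points earned, scaled to a 24-hour day.
def ppd(points, hours_per_wu):
    return points * 24.0 / hours_per_wu

print(round(ppd(1888, 7.5)))   # project 6911 on this GPU: ~6042 PPD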

Post Reply