
All times are UTC - 8 hours




 Post subject: Dual CPU F@H run as service..
PostPosted: Wed Sep 10, 2003 3:51 pm 

Joined: Mon Jun 09, 2003 5:47 pm
Posts: 426
Location: London, UK
Hi,
I've got a query.. I'm trying to run the console version of F@H on a dual CPU box.. I'd really like to run it as a service (or as two services, I guess).. but I've hit a snag.. the FireDaemon method requires Administrator privileges and I think I've only got Power User status.
I just found this which I'll try tomorrow but I'd really like to capture the text output for debugging... I'm guessing this second method won't save the text output to a file.

So, any ideas on how I could either install FireDaemon without admin rights, or to capture output of f@h directly installed as a service?

Thanks,
DonP.


 Post subject:
PostPosted: Wed Sep 10, 2003 4:06 pm 
SPCR Reviewer

Joined: Sun Aug 11, 2002 3:26 pm
Posts: 3998
Location: Phoenix, AZ
A better solution would be this: OverClockers Australia F@H service guide

As for capturing the output, you don't have to do anything. F@H automatically outputs all of its text into a log file. Look in the directory that your F@H is running in for a file named FAHLog.txt

_________________
Senior Contributing Writer, SPCR


 Post subject:
PostPosted: Thu Sep 11, 2003 5:13 pm 

Joined: Mon Jun 09, 2003 5:47 pm
Posts: 426
Location: London, UK
Big Thanks Rusty....
unfortunately I've hit a snag - it seems I can't edit the "Services" part of the registry.. so I've given up on running it as a service and I'll just run it from a script and/or "Startup" (just run it when a user logs in).

But now.. and this is the bit which makes me absolutely mad.. I can't get two instances to run.
From what I see I need to create two separate directories with one copy of the fah3console.exe in each. Then I run "fah3console -config" in one dir to create the config file. I use machineid=1 for this.. then I copy it to the second fah dir and in that config file I change machineid to 2.
Great so far, just like it says in the FAQ. Note I didn't just copy an installed directory over - I haven't done what it says not to do here. So I've just got two dirs, each with a fah3console.exe, each with a client.cfg, differing by MachineID.
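For reference, the two-directory setup described above can be sketched like this. The client.cfg contents here are a minimal stand-in, not a full F@H config, and the fah3console invocations are shown only as comments:

```shell
# Sketch of the two-instance layout (cfg contents are illustrative).
mkdir -p fah1 fah2

# In real use, running "fah3console -config" inside fah1 generates
# client.cfg; here a minimal stand-in with MachineID=1:
cat > fah1/client.cfg <<'EOF'
[settings]
machineid=1
EOF

# The second directory gets the same config with the machine ID bumped to 2.
sed 's/^machineid=1$/machineid=2/' fah1/client.cfg > fah2/client.cfg

# Each instance then starts from its own directory, e.g.:
#   (cd fah1 && fah3console -local)
#   (cd fah2 && fah3console -local)
```

The point of the sed step is that the two configs differ only in machineid, which is what keeps the instances from treating each other's work as their own.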

Now if I run one ("C:\dir1\> fah3console -local"), it works. Now if I start the second ("C:\dir2\> fah3console -local") the client.cfg of that instance ("C:\dir2\client.cfg") gets overwritten with some simple three line (incorrect) config. Once I replace C:\dir2\client.cfg and run it ("C:\dir2\> fah3console -local") then the config of the first one gets overwritten.

ARGH! I've browsed the web a bit (I gave up after an hour of fiddling) and it seems to me that the Stanford FAQs, "fah3console --help" and the few posts I've found regarding client.cfg are all about as useful as a serious heart attack.

Oh.. and one other problem.. I looked at the logs and it seems I have about 10 WUs (from running the graphical version) ready for upload, but they cannot be uploaded... I'm connecting through a proxy server, and I know it's worked in the past.. but now, after fiddling with the console clients, it doesn't. I don't think this problem is specific to this dual CPU box though - another single CPU box hasn't been able to send WUs either. I've tried three proxies. I checked the status of the server and supposedly they work.. but when I telnet to them (on port 8080) it connects and then just disconnects.
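As an aside, the telnet check described above can be scripted; here's a small sketch using bash's /dev/tcp (the proxy hostname below is a placeholder, not a real server):

```shell
#!/bin/bash
# check_port: succeed if a TCP connection to host:port can be opened
# within 3 seconds, fail otherwise.
check_port() {
  local host=$1 port=$2
  timeout 3 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null
}

# Example (proxy.example.com stands in for your proxy's hostname):
#   if check_port proxy.example.com 8080; then echo "proxy port open"; fi
```

Note this only tells you the port accepts connections, which matches the symptom above: the connection opening and then dropping is something this test alone won't distinguish from a working proxy.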

This whole F@H thing is really infuriating, with the really poor documentation and debugging features ("-verbosity 9" is useless).

Any ideas? am I missing something obvious? I'm pretty sure I've tried all combinations but maybe I've missed something.


BTW.. if I'm behind a proxy running two separate F@H hosts (say single CPU machines for now) then I don't need to fiddle with machineid for each, do I?

Thanks and sorry for the general grumpy tone,
DonP.

PS One last Q.. if I copy the "work" directory from the PCs which cannot return the results to a computer at home, which has repeatedly returned results successfully, then will it work? Or do the files in "work" have something specific to the machine they were computed on?


 Post subject:
PostPosted: Thu Sep 11, 2003 7:29 pm 

Joined: Mon Jun 09, 2003 5:47 pm
Posts: 426
Location: London, UK
I've been doing more browsing and I can't see a mention of the 2cpu problems I've been having, so it must be specific to my setup - I'll dig around more next week.

But if you have any idea why complete WUs aren't being returned then I'd still like to know.. also the question about returning WUs on a different machine to the one they were computed on.

Thanks!


 Post subject:
PostPosted: Fri Sep 12, 2003 7:14 am 
Friend of SPCR

Joined: Tue Jun 10, 2003 12:40 pm
Posts: 37
Location: Northern Virginia, USA
DonP wrote:
PS One last Q.. if I copy the "work" directory from the PCs which cannot return the results to a computer at home, which has repeatedly returned results successfully, then will it work? Or do the files in "work" have something specific to the machine they were computed on?


On a few occasions I've successfully started a work unit on one computer and moved the work directory to another to finish it.
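For what it's worth, the move itself is just a directory copy done while both clients are stopped. A sketch (the directory and file names are placeholders; whether your client version keeps queue state in a queue.dat is an assumption worth checking before relying on this):

```shell
# Sketch: carry a WU from one client directory to another.
# SRC/DST stand in for the real client directories; stop both clients first.
SRC=fah_src
DST=fah_dst

# (demo scaffolding so the copy below has something to act on)
mkdir -p "$SRC/work" "$DST"
touch "$SRC/work/wudata_01.dat"

# Copy the work directory; if the client keeps queue state in queue.dat,
# copy that too, since the queue and the work files belong together.
cp -r "$SRC/work" "$DST/"
if [ -f "$SRC/queue.dat" ]; then
  cp "$SRC/queue.dat" "$DST/"
fi
```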


 Post subject:
PostPosted: Fri Sep 12, 2003 10:37 am 

Joined: Thu Jun 19, 2003 10:26 am
Posts: 86
Location: Eastern USA
Have you tried simply deleting the client.cfg from the second directory and entering the info by hand using fah3console -config ?


 Post subject:
PostPosted: Sun Sep 14, 2003 2:50 pm 

Joined: Mon Jun 09, 2003 5:47 pm
Posts: 426
Location: London, UK
Thanks guys for all the advice - I got the 2 CPU issue resolved - I think there may be an un-initialised variable somewhere or something - it started to work fine after a reboot.. let's see how long it stays fine :)

About the uploading of results - I still can't do it... I've tried three different proxies (2*squid, 1*tinyproxy), one whose logs I could access.. the connection goes through but then no data is sent either way.. I also tried sending it through port 443.. hopefully the HTTPS port won't cache and will just create a clear tunnel.. but no luck.
I don't get it... and I'm a bit fed up with this hacking - it should really be a trivial thing.
It's a shame it's borked, cos I have quite a few PCs behind a firewall, and if I can't automatically upload results then I might as well start burning my CPUs to cure cancer rather than folding.
I've searched around for similar problems - I found a few threads whose resolution was just that it started to work magically.. I suspect the same will happen in my case, since it _did_ actually work right at the start.

Anyway.. I'm going to carry on folding.. can't wait for the extremeOC stats to return.. looks like we're 31st.. what can we do about SLO-TEch?

Regards,
DonP.

PS It looks like you CAN move finished WUs from the work directory on one machine to another.. and they are successfully uploaded.
EDIT:
PPS Actually... just tried shifting some more complete WUs again and it didn't successfully transplant.. hmm :(
PPPS This isn't the problem I'm having - I've looked at squid.conf and there is no limit on the request size. :(


 Post subject:
PostPosted: Sun Sep 14, 2003 7:53 pm 
SPCR Reviewer

Joined: Sun Aug 11, 2002 3:26 pm
Posts: 3998
Location: Phoenix, AZ
DonP, I'm sure you've tried this, but just to be sure:

These machines do have WWW access through IE, right?

And if so, have you tried the "Use Internet Explorer settings" option?

Silly suggestions, I know, but I'm trying to help :lol:

_________________
Senior Contributing Writer, SPCR


 Post subject:
PostPosted: Mon Sep 15, 2003 12:21 pm 

Joined: Mon Jun 09, 2003 5:47 pm
Posts: 426
Location: London, UK
Rusty075 wrote:
..And if so, have you tried the "Use Internet Explorer settings" option?


I have, thanks Rusty, but it doesn't help.
I did get a bit of a break today.. I noticed that a few WUs recently got through.. and they were smaller than 1M.. so it looks like I am actually having this problem.
I didn't think it was that problem at first, because the dual CPU box where I was doing most of my testing didn't give me an error 413 (even with verbosity 9). I got a 413 today on another box. I also checked squid's config file and there was no mention of request_body_max_size, and the default should be no limit (I think).
So tomorrow I'm going to come up with a good reason for the squid admin to up the limit. :)
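For reference, if the admin agrees, the relevant Squid 2.x knob is the request_body_max_size directive in squid.conf; the 8 MB figure below is just an example, not a recommendation:

```
# squid.conf fragment (illustrative): cap request bodies at 8 MB.
# "request_body_max_size 0 KB" (the default) means no limit at all.
request_body_max_size 8 MB
```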

Thanks and sorry for my angry tone - I'm just annoyed that this isn't click-and-go.

DonP.


Powered by phpBB © 2000, 2002, 2005, 2007 phpBB Group