Chimp Challenge 2013 - A Call to Arms - All Aboard the USS Poo Flinger

Getting excited for the challenge. The 4P little beast is ready to go. Plan on bringing the home computer online... not many points, but every point counts, even with an AMD 940 and a GTX 260. Had enough parts to build a third system too (after ordering a PSU): an old Intel Core i7 860 with a Radeon 5870. :cheers:
 
Is the GPU fan controlled automatically, or do you have it set manually? I remember NVIDIA once released a bad driver that killed a lot of cards by letting them overheat. Maybe try a slightly older driver, or just set the fan speed manually.
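If the card is in a Linux box with X running, pinning the fan manually is only a couple of commands - a minimal sketch, assuming an NVIDIA card at gpu/fan index 0 (on Windows, MSI Afterburner does the same job):

    # Enable manual fan control (writes the Coolbits option into xorg.conf; restart X afterwards)
    sudo nvidia-xconfig --cool-bits=4
    # Take over the fan and pin it at 70% (newer drivers name the attribute GPUTargetFanSpeed)
    nvidia-settings -a "[gpu:0]/GPUFanControlState=1" -a "[fan:0]/GPUCurrentFanSpeed=70"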

Will give this a look... although I have a slot cooler blowing directly out from below it and a side fan blowing directly onto the cards, so it almost wouldn't need a fan of its own to at least stay under 100°C. I'm thinking it almost has to be either a voltage issue or the heatsink making poor contact after I swapped it around. Not like the GT 620 is so costly that it's a disaster if it's dead, except that I need to make sure I don't run out of ways to connect monitors to the CPUs I want to fold with that lack onboard video. :chair:

Am also getting excited like everyone else for the challenge. I've been running at about 50% of my normal folding capacity while I take time to test various configurations and move the cases to spots where they won't need shutting down at any point during the competition. Losing some production in the short term, but in the long term the improvements should more than make it up heading into the CC. Still bet I end up giving in and adding one more piece of hardware before it kicks off, too....
 
First problem is I have never heard of your Linux distro, so trying to troubleshoot it is simply an exercise in futility for me.

Have you considered starting over from scratch with the Ubuntu x64 Desktop version, writing down your steps as you go?

I followed the instructions at the Ubuntu website the last time I folded an NVIDIA card on native Linux.

In any case, if you've taken all kinds of steps to make it work and failed, then it's time to restart with a better-known distro, imo.

Even with Ubuntu, some versions are just much more difficult to set up than others, because of the default packages they load, etc.

Sorry I can't be of more help, but Linux takes a lot of time to learn. I use it for specific tasks (native folding, folding in a VM inside Windows) and the odd spreadsheet, and that's about it. I'm certainly not a Linux guru of any sort.
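For what it's worth, the basic v7 client setup on Ubuntu is only a few commands once you've grabbed the fahclient .deb from the Folding@home download page - a rough sketch (the installer prompts for username, team, and passkey during install):

    # Install the client; the package asks for user/team/passkey as it installs
    sudo dpkg -i fahclient_*.deb
    # To fold a GPU, add a slot line like <slot id='1' type='GPU'/> to /etc/fahclient/config.xml
    sudo /etc/init.d/FAHClient restart
    # Watch it pick up work units
    tail -f /var/lib/fahclient/log.txt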
 
Just fired up FAH on my 1050T; it says 8,200 PPD. Is that good, bad, or indifferent?

If the cores are running at 99-100%, then it's GREAT! PPD will vary widely from project to project. Bit of a roller-coaster ride on that.
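A quick way to check the cores are pegged on Linux (Task Manager does the same job on Windows) - the second command assumes the sysstat package is installed:

    # Aggregate CPU load; folding flat-out should leave almost nothing idle
    top -bn1 | grep "Cpu(s)"
    # Per-core breakdown, one 1-second sample
    mpstat -P ALL 1 1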

Great having you with us! :welcome:
 
First problem is I have never heard of your Linux distro... I'm certainly not a Linux guru of any sort.

The distro is a Fedora fork. And yes, I've started back at step one many times.
We have our own F@H team, and one other member folds GPU that I know of.

If I recall right, I needed to link a CUDA library (a .so on Linux) or something. But IDK. I'm killing myself for not writing it down :facepalm:

EDIT
Not using Ubuntu. I don't want to distro-hop right now; that, and I can't stand it.
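If it jogs anyone's memory, the sort of fix I mean (not necessarily what I actually did - exact paths and versions vary by driver package) would be something like:

    # Hypothetical example: point the loader at the driver's CUDA library, then refresh the cache
    sudo ln -s /usr/lib64/nvidia/libcuda.so.1 /usr/lib64/libcuda.so
    sudo ldconfig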
 
Ubuntu is a bit of an acquired taste, for sure. The advantage here is its huge, active user base. I've never even heard of a "Fedora fork" as a distro classifier - great alliteration, and definitely descriptive.

I'd cut out the middleman, so to speak, and jump your problem right over to 1) the Stanford FAH forum, and 2) any large and active Fedora forum. There should be a folder somewhere in the bunch.

Good luck.
 
Just a quick question: I'm gone over the weekend, and of course I'll keep my PC running. I wondered if I should be concerned about GPU folding, too. Usually I throw in a GPU WU every other day; would there be any harm in letting my 580 Twin Frozr fold at 100% straight through for the CC? I just don't want to return to a dead card ;)
Temp is 67°C @ 70% fan speed when folding.
 
Would there be any harm in letting my 580 Twin Frozr fold at 100% straight through for the CC? ... Temp is 67°C @ 70% fan speed when folding.

Those temps are OK for the 580. Just keep the ventilation in your favor. :D
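If you want extra peace of mind while you're away and the box is Linux, you could log the temperature on a loop - a minimal sketch, assuming an NVIDIA driver with nvidia-smi installed:

    # Append a timestamped GPU temperature reading to a log once a minute
    while true; do
        date
        nvidia-smi -q -d TEMPERATURE | grep -i "current temp"
        sleep 60
    done >> gpu_temps.log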
 
Tell me about it... I'm having the worse luck with PC components.

I don't think you intended it, but you were actually right with your choice of grammar there - the worst luck would seem to be mine. First I had one GPU possibly dead because the new driver from NVIDIA got it up to 100°C within sixty seconds of folding. Then, less than 24 hours later, I lost two of the spring screws that mount the cooler on my 6850 - the card I'd tried (and failed) to improve temps on with a spare heatsink I hadn't been using. Plus, with all the shuffling of computers around, I didn't have time to re-sleeve all the A/V cables to protect them from my cat, who loves wires and did indeed make quick work of two HDMI cables and a DVI cable while I was having dinner....

Bah, who wants to trade a couple of 7970s for three complete and (up to a few hours ago) fully functional systems, so it's easier for me to hit my contribution goal? :p
 
lol, you're spot on :p
I won't complain that much; people have worse luck than me...
A couple of 7970s would do nicely in almost anything these days, be it coin mining, gaming, compute in general, heating your house in winter, you name it. :D

Oh, and don't forget poo flingin!! :p
 
Okay, I think I let things snowball when I decided to try fixing the first couple of issues in the middle of the night, tired and without enough direct lighting.... :bang head:

Right now, of the four rigs I fold on, I'm down to a 6870 stuck inside a Dell running an Athlon 3700+ San Diego. The 8350 is having display issues regardless of which GPU card is in it now, one Phenom 965 just power-cycles without booting, and the other has the overheating GPU inside. This is going to be a long day, but I swear I'll end up ahead somewhere instead of down folding slots before it's over... :mad:
 
Would there be any harm in letting my 580 Twin Frozr fold at 100% straight through for the CC?

Here's what I do:

For short trips, I leave the GPU folding away. For longer stretches (all day, overnight, etc.), I leave the CPU folding, but not the GPU.

GPUs aren't up to the reliability of my other computer components, particularly since it does get hot where I live.
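Handy trick if you're on the v7 client: it listens on localhost port 36330, so you can pause just the GPU slot before you leave and resume it when you're back - a sketch, assuming the GPU happens to be slot 1 on your machine:

    # Connect to the local client's command interface
    telnet localhost 36330
    # Then at the prompt:
    pause 1      # pause only slot 1 (the GPU slot here)
    unpause 1    # resume it later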
 
GPUs aren't up to the reliability...
That's what I thought too. If I get my OC stable with the new RAM before I leave, I have absolutely no problem with my CPU running 24/7 - I can't really test it fully, since the BSODs took up to three days to show, so I'll bump the voltage a notch just to be safe. I don't really trust the GPU :) I think I'll throw in some GPU units remotely from my phone (TeamViewer) to be safe :)
 