Back to bring the PAIN!

gsrcrxsi

Member
Joined
Feb 21, 2008
Location
Baltimore, MD
looks good!

Your host is currently in 60th place in terms of RAC. It will probably come up to around 30th-35th place once the RAC levels off.
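To put a number on "levels off": BOINC's recent average credit is an exponentially weighted average with a documented one-week half-life, so a host producing a steady daily credit just climbs asymptotically toward that number. A rough Python sketch of the ramp-up, assuming a simplified once-per-day update and an example 160k/day output:

Code:
# Rough simulation of BOINC RAC ramp-up. Assumes the documented one-week
# half-life and a simplified once-per-day update; the daily credit is an example.
DAILY_CREDIT = 160_000            # example steady output, credits/day
HALF_LIFE_DAYS = 7                # BOINC's documented RAC half-life
w = 0.5 ** (1 / HALF_LIFE_DAYS)   # per-day decay factor

rac = 0.0
for day in range(1, 43):          # about six weeks
    rac = rac * w + (1 - w) * DAILY_CREDIT
    if day % 7 == 0:
        print(f"day {day:2d}: RAC ~ {rac:,.0f}")
# After 4-5 weeks the RAC sits within a few percent of the daily credit.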
 
OP
Voodoo Rufus

Powder Junkie Moderator
Joined
Sep 20, 2001
Thanks!

If that saves 5-10sec on getting the units ready, that might push it to 200k or so.
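Back-of-the-envelope on why a few seconds per task matters, purely as a sketch; the ~40 s task runtime and 8 s overhead below are assumptions for illustration, not measurements from this rig:

Code:
# Hypothetical numbers: trimming per-task overhead scales daily credit
# roughly in proportion to the throughput gain.
daily_credit = 160_000   # current steady rate mentioned in the thread
task_time = 40.0         # assumed GPU seconds per work unit (illustrative)
overhead = 8.0           # assumed per-task overhead being trimmed (within the 5-10 s quoted)

speedup = (task_time + overhead) / task_time
print(f"throughput gain: {speedup:.2f}x -> ~{daily_credit * speedup:,.0f} credits/day")
# -> about 1.20x, ~192,000/day, i.e. in the neighborhood of 200k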
 
OP
Voodoo Rufus

Powder Junkie Moderator
Joined
Sep 20, 2001
Hmm, the little tweak didn't seem to ramp RAC much. Still doing a steady 160k.
 
OP
Voodoo Rufus

Powder Junkie Moderator
Joined
Sep 20, 2001
After a driver update, I'm doing 175-185k now. Had my first 198k day on the rig yesterday. All of my top 10 days are about to be from 2019 now. First 200k+ day yesterday.

Planning on maintaining full power until I get to 50 million credits and have my BOINCstats RAC intersect my Berkeley SETI RAC. Then I'll go back to dual booting and getting other stuff done in the evenings, like gaming.
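Just as a rough timeline (plain arithmetic; the current-total figure below is hypothetical since it isn't given in the thread):

Code:
# Days of full-power crunching needed to reach a credit goal (hypothetical numbers).
goal = 50_000_000        # target from the post
current = 40_000_000     # hypothetical current total, for illustration only
rate_per_day = 180_000   # roughly the 175-185k/day range above

days = (goal - current) / rate_per_day
print(f"~{days:.0f} days at {rate_per_day:,}/day to close a {goal - current:,} credit gap")
# -> about 56 days for a 10M-credit gap at ~180k/day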
 
OP
Voodoo Rufus

Powder Junkie Moderator
Joined
Sep 20, 2001
Ok, I kinda broke it.

Installed new NVIDIA drivers (430.50) from the Ubuntu auto-update, and now my GPU is no longer crunching.
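A quick sanity check after something like this is to confirm the driver still sees the card at all; if it doesn't, BOINC can't either, and a reboot to load the matching kernel module is often the fix. A minimal sketch that just wraps nvidia-smi (assumes it's on the PATH):

Code:
# Minimal check that the NVIDIA driver still sees the GPU after an update.
import subprocess

try:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print("GPU(s) visible to the driver:")
    print(out.stdout.strip())
except (FileNotFoundError, subprocess.CalledProcessError) as e:
    print("nvidia-smi failed - a driver/kernel module mismatch is likely:", e)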
 
OP
Voodoo Rufus

Powder Junkie Moderator
Joined
Sep 20, 2001
How much power are you bringing back? Trying for 1M credits per day again?
 

Holdolin

Member
Joined
Apr 3, 2019
Location
Atlantic Northeast
I'm bringing it all back: 2 1080 Tis, 5 2070s, and 3 2080 Tis. I have plans for the 4th 2080 Ti, which will go in the newest build, Prometheus, which houses one right now. I debated cramming 4 cards per box, but between the power draw and heat generation I figured it's just easier to build dual-card boxes. Oh, for grins and giggles I have my Windows workstation crunching on its 1060. All I really do on it is write code, which is more CPU-intensive anyway, so why not.
 
OP
Voodoo Rufus

Powder Junkie Moderator
Joined
Sep 20, 2001
Just curious, how do you affordably power all these rigs?

My power costs in California are absurd. The only reason I'm crunching is because I installed a 5kW solar setup this spring that helps offset it.

Only have my single 2080Ti but I might add another.
 

Holdolin

Member
Joined
Apr 3, 2019
Location
Atlantic Northeast
It's not cheap; I'm spending ~$200/mo to power them, though this time of year the heat they're generating would be needed anyway. Next year I plan on setting up a 10 kW solar plant. Not looking forward to the heat come summer, though.
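For anyone pricing this out, the monthly bill is basically watts x hours x rate; a sketch with assumed numbers (the per-card draw, overhead, and $/kWh are illustrative, not Holdolin's actual figures):

Code:
# Back-of-the-envelope monthly power cost for a GPU farm (illustrative values).
cards = 10                 # e.g. the 2x 1080 Ti + 5x 2070 + 3x 2080 Ti mentioned above
avg_watts_per_card = 200   # assumed average crunching draw per card
system_overhead_w = 400    # assumed CPUs, fans, and PSU losses across the boxes
rate_per_kwh = 0.13        # assumed electricity rate in $/kWh

kwh_per_month = (cards * avg_watts_per_card + system_overhead_w) / 1000 * 24 * 30
print(f"~{kwh_per_month:,.0f} kWh/month -> ~${kwh_per_month * rate_per_kwh:,.0f}/month")
# -> ~1,728 kWh, roughly $225/month at these assumptions (in line with ~$200/mo)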
 

gsrcrxsi

Member
Joined
Feb 21, 2008
Location
Baltimore, MD
if you want 4+ cards, you either need water cooling, or space them out appropriately.

My 10-GPU system (all 2070s) has the cards a little close together, but airflow is enough to keep them all under 80°C, so that's good enough for me.

My 3-GPU system (2080 Ti, 2080, 2080) is watercooled, since the cards are right next to each other.

I want to build a 7-GPU system with all cards watercooled, using one of those EK X7 block connectors. 7x 2070s/2080s in a nice tight package sounds cool. Hard to find single-slot I/O cards these days, though.
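For keeping an eye on a dense box like the 10-GPU rig above, polling temperatures is straightforward; a minimal sketch using nvidia-smi (the 80°C threshold mirrors the post, the interval is arbitrary):

Code:
# Poll every GPU's temperature and flag anything at or above the limit.
import subprocess, time

LIMIT_C = 80          # threshold from the post
POLL_SECONDS = 60     # arbitrary polling interval

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,name,temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.strip().splitlines():
        idx, name, temp = [f.strip() for f in line.split(",")]
        if int(temp) >= LIMIT_C:
            print(f"GPU {idx} ({name}) is at {temp} C - check airflow/spacing")
    time.sleep(POLL_SECONDS)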
 

gsrcrxsi

Member
Joined
Feb 21, 2008
Location
Baltimore, MD
Yeah it’s on a mining rack. But I’m using a crazy server MB to get 8x PCIe lanes to 8 of the GPUs. 1x lane to the other 2, worried about power from the motherboard with more.

Pics:
https://imgur.com/a/XCNisLV

This link is a little older from when I had 1080tis, but shows the motherboard and risers
https://imgur.com/a/By4tOS5

7-GPU system: (6 GPUs on 1x riser, pics taken before 7th GPU added)
https://imgur.com/a/PJPSnZl

3-GPU watercooled system: (server case, rack mounted, external pump/rad)
https://imgur.com/a/Hbf1Apj
 

Holdolin

Member
Joined
Apr 3, 2019
Location
Atlantic Northeast

Ok, so what MB are you using to get x8 lanes to 8 GPU slots? Been looking around the web and found some that have that many slots, but none were clear on whether I could use all 8 slots (most said they could run 4 cards at full x16). For that matter, what case? The more I think about your design, the more I'm thinking it would be better to just "consolidate" as many cards as I can into as few rigs as possible.
 

gsrcrxsi

Member
Joined
Feb 21, 2008
Location
Baltimore, MD
The motherboard is a Supermicro X9DRX+-F. They have an X10 version also; it's basically the same but uses more expensive Xeon E5 v3/v4 CPUs and DDR4 RAM. I stick with X9 boards and super cheap registered ECC DDR3.

It has a total of eleven (11) PCIe x8 slots. Ten of them are PCIe 3.0, and the top (11th) slot is PCIe 2.0 (x4 electrically). You can run all slots at full lanes at the same time. I only limited myself to 8 cards because I was a little worried about power draw through the PCIe slots. I'm using an EVGA power injector on the PCIe 2.0 11th slot, and normal USB risers, which don't draw power from the slot, for slots 9 and 10.
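The slot-power concern is easy to put numbers on: the PCIe spec allows a card to pull up to 75 W through the slot itself, so even a conservative per-card draw adds up quickly across 8+ GPUs. A sketch with an assumed per-card slot draw (illustrative, not measured on this board):

Code:
# Why slot power adds up: PCIe allows up to 75 W per card through the slot itself.
cards_on_slot_power = 8       # cards fed directly by the motherboard slots
per_card_slot_draw_w = 50     # assumed average slot draw per GPU (illustrative)

total_w = cards_on_slot_power * per_card_slot_draw_w
print(f"~{total_w} W routed through the motherboard's power planes")
# -> ~400 W at these assumptions; powered risers or an injector shift that load
#    onto PSU cables instead of the board's own connectors.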

Specs here: https://www.supermicro.com/products/motherboard/Xeon/C600/X9DRX_-F.cfm

I do not have it in a case; I'm using it on a mining rack (shown in the pics). Cases for this board do exist from Supermicro, but they are expensive ($1000+) and rarely show up on the used market, and they're still expensive when they do. But if you want to fit more than 5 GPUs, you'll need to do some custom stuff like risers or single-slot watercooled cards. There's no single elegant solution.

I keep hoping to find PCIe 3.0 (i.e., shielded) riser cables that have external power connections. But it seems no one wants that except me, lol.