
Talk to me about processors and memory affecting GPU folding


torin3

Member
Joined Dec 25, 2004
I'm planning an upgrade to my main system. Currently I'm on a Socket 1151 CPU (6700K) and 16 GB of DDR4 RAM. Folding with 4 GPUs, my PPD seems a little low for what adding up my 4 cards shows it should be.

Plus anytime I do anything remotely demanding my PPD drops through the floor.

I'm currently looking at server motherboards, possibly dual CPU systems to get 6 or 7 cards.

I'm thinking the server motherboards/CPUs will handle load balancing a lot better (though I might have to get a server OS to take best advantage of that). I'm hoping that if I do some gaming it will only affect the GPU I'm using for my displays.

Will this plan help my performance at all? If so, how much memory should I get for 7 GPUs, and are there any PSUs that can handle this, or will I need to run 2 or 3 PSUs to handle the power load?
 
T, I can't help on the setup... but if it were me, there is no way this wouldn't be a dedicated folding rig. Run 5 GPUs and use the best one in a separate gaming PC. For the price of the 7th GPU you could pretty much build an Alder Lake PC.

Why are you doing an all in one?

As far as the 1151 system, isn't it the lanes slowing it down? Though I would add 16 GB of RAM, as that would be the cheapest way to eliminate the RAM as a throttle point.

I would look forward to seeing a 6-GPU setup though. Clearly it is something I am not familiar with. I kinda glanced at mining setups, but I have so many typical parts laying around that I am just doing 2 per PC.

What is the major advantage?
 
T, I can't help on the setup... but if it were me, there is no way this wouldn't be a dedicated folding rig. Run 5 GPUs and use the best one in a separate gaming PC. For the price of the 7th GPU you could pretty much build an Alder Lake PC.

Why are you doing an all in one?

Wife limitations. I might switch this out for my garage system, but I would have to space out upgrading both over a year or so.
As far as the 1151 system, isn't it the lanes slowing it down? Though I would add 16 GB of RAM, as that would be the cheapest way to eliminate the RAM as a throttle point.

I would look forward to seeing a 6-GPU setup though. Clearly it is something I am not familiar with. I kinda glanced at mining setups, but I have so many typical parts laying around that I am just doing 2 per PC.

What is the major advantage?

The 1151 isn't being slowed down by lanes, because the chipset supplies extra lanes, but there might be some slowdown due to the limitations of the CPU.

The advantage is mainly due to my system limitations imposed by my wife.
 
Gotcha. In that case, just looking forward to answers to your questions and the parts you put together. You got me thinking, though, about a 4+ GPU rig... albeit it has to do PPD as well as separate PCs do.

Truth be told, I am less interested in PPD... It is more about doing something interesting with hardware I am not familiar with... I can only golf and do yard work for so many hours a day... My wife can't make it into the basement (mancave), so I do not have the immovable obstacle.. hehe ;)

Is a Mobo like this typical for the application? I would limit it to less than max to avoid PSU expense. I have to research how that is handled.

 
Well I ordered a 2x16GB memory kit. It should give me an idea if more memory will help a lot or not.
 
This Gigabyte MW51-HP0 Server Motherboard might be what you want. Newegg has it.


 


Is a Mobo like this typical for the application? I would limit it to less than max to avoid PSU expense. I have to research how that is handled.


The problem with that motherboard is there is only one memory slot, and all but one of the PCI-E slots run at x1 speed. Using it will severely cripple your PPD.

You need at least x8 speed to avoid an impact on folding output.
 
Best to have one real core per GPU when folding. The 6700K has 4C/8T, so any other activity on that host will impact the GPU folding. All the GPUs are running at x8 Gen3, but there are only 16 PCIe lanes, and there is some extra overhead from sharing the lanes at x8 between 2 GPUs.

I suspect that folding with Linux would be more productive than Windows, especially on the small atom count projects.

Perhaps a dedicated Linux host folding 24/7 and a Windows machine for gaming and folding.

I have two P8Z77V-WS boards in my Linux farm. One board has a pair of 1080 Tis running at x16/x16 Gen3, but they are slightly slower folding than another pair of 1080 Tis running on a P8Z77V-LK at x8/x8 Gen3, even though both have the same 3470T CPU and 16GB of DDR3 RAM. The other WS board has 4 GPUs at x8 Gen3 with an E3-1265L v2 and 32GB of DDR3 RAM, but they are also slightly slower folding than as dual GPUs on the x8/x8 Gen3 board.
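
To put rough numbers on those slot speeds, the one-way link bandwidth can be worked out from the per-generation signalling rate and encoding overhead. A sketch using the nominal PCIe spec figures; real transfers lose a little more to protocol overhead:

```python
# Approximate one-way PCIe link bandwidth from spec-sheet numbers.
# Per generation: (raw rate in GT/s per lane, encoding efficiency).
PCIE_GEN = {
    1: (2.5, 8 / 10),     # Gen1: 8b/10b encoding
    2: (5.0, 8 / 10),     # Gen2: 8b/10b
    3: (8.0, 128 / 130),  # Gen3: 128b/130b
    4: (16.0, 128 / 130), # Gen4: 128b/130b
}

def link_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Usable one-way bandwidth in GB/s for a given generation and lane count."""
    rate, eff = PCIE_GEN[gen]
    return rate * eff * lanes / 8  # payload Gb/s across all lanes -> GB/s

for gen, lanes in [(3, 4), (3, 8), (3, 16), (4, 4)]:
    print(f"Gen{gen} x{lanes}: {link_bandwidth_gbps(gen, lanes):.1f} GB/s")
```

Gen3 x8 works out to roughly 7.9 GB/s each way, and a Gen4 x4 link matches it exactly, which is one reason x8 Gen3 has been "enough" for folding so far.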
 
Well, I went with a SuperMicro server board from @ihrsetrdr. It only has x8 physical slots, but I checked to see that adapters and risers are available. I've got 2 risers I know will work (x16 shielded), so I can check to make sure the new ones are actually working before buying a full set.

This will be a good test bed to make sure it actually does what I want. It is an older board, and I'll probably upgrade, but only after I verify it is a viable setup.

I'll be upgrading my garage system, and moving over most of my cards to it. I'll run Linux on it for better PPD.

I'm going to probably get this case: https://www.amazon.com/Kingwin-Professional-Cryptocurrency-Convection-Performance/dp/B07H44XZPW/

Since it uses 20x20 aluminum extrusions, I can easily modify it if I need to make everything fit.

My only real concern is finding riser cables that are long enough and shielded well enough to avoid interference.

I'm thinking that 2 of the EVGA 1.6KW power supplies will be enough for up to 7 3080Ti cards in it.

Dual CPUs should take care of making sure there are enough PCIe lanes. This board handles 3 x8 slots per CPU.

I'll report back as I build it up.
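
As a sanity check on the two 1.6 kW supplies, here is a back-of-envelope power budget. The per-card and platform wattages are my assumptions (a 3080 Ti is spec'd around 350 W at stock, and folding rigs are often power-limited below that), not measured figures:

```python
# Back-of-envelope PSU budget for a 7-GPU folding rig.
GPU_WATTS = 350          # assumed sustained draw per 3080 Ti at stock
PLATFORM_WATTS = 250     # assumed dual-CPU board, drives, fans
NUM_GPUS = 7
PSU_CAPACITY = 2 * 1600  # two 1.6 kW supplies

total = NUM_GPUS * GPU_WATTS + PLATFORM_WATTS
headroom = PSU_CAPACITY - total
print(f"estimated load: {total} W of {PSU_CAPACITY} W "
      f"({total / PSU_CAPACITY:.0%}), headroom: {headroom} W")
```

~2700 W against 3200 W of capacity looks workable on paper, but Ampere cards have sharp transient spikes, so splitting the cards evenly across the two supplies and keeping each under ~80% sustained load is the safer target.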
 
You know that you're going to make it very hard for me to stay in front of you, don't you :rofl:
 
I'll ask a related question:
Gen4 and RTX3/4k series cards.
The consensus around here has always been you need at least a x4 connection to a GPU and one free thread to fold at full speed and not starve it of data.

How is this rule of thumb holding up with the RTX cards? Do the bandwidth requirements scale with the performance? For example, will I be starving an RTX 4k-series card by putting it on an x4 Gen3 connection?
 
Anecdotally, in regards to 2 GPUs (which is all I am running, 2 per PC), it makes no difference. My 10-year-old MSI in sig is currently 3 mil better than the newer PC, both running 2 30-series cards, because of the WUs. Ideal WUs get the same 14 to 15 mil PPD, at which point the 2 3080s are more productive than the 3080/3070 Ti PC.

These are not hard numbers, as I am not checking TPF in the logs. I also see no gains on the 5.0/4.0 slots on the #0 PC currently running 1 3080. Linux PCs do better; that PC is the only one running Windows, and it never would do 15 mil with 2 GPUs.
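
For anyone who does want harder numbers, TPF from the log converts straight into work units per day, since an F@H work unit is 100 frames and TPF is the time per frame. A simple sketch; the example TPF is made up:

```python
# Turn a time-per-frame (TPF) reading from the folding log into WUs/day.
def wus_per_day(tpf_seconds: float, frames: int = 100) -> float:
    """Estimated work units completed per day at a steady TPF."""
    wu_seconds = tpf_seconds * frames  # one WU = `frames` frames
    return 86_400 / wu_seconds         # seconds per day / seconds per WU

# e.g. a card turning a frame every 90 s finishes a WU in 2.5 hours
print(f"{wus_per_day(90):.1f} WUs/day")  # -> 9.6
```

Note that PPD is not linear in WUs/day because of the quick-return bonus, so a faster TPF is worth more than proportionally.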
 
The server MB will be arriving tomorrow, and the open air x8 GPU case that I'm planning on using for it just dropped in price a bit, so I ordered it. It will also arrive tomorrow. I'll probably try and get it set up this weekend with at least 2 GPUs.

(Crap, got to get some riser cables too)
 
If all you're going to put in it for now is 2, you can just plug them into the MB till you get more GPUs :)
 
T, Can we get some links to what you got? :cheers:
 
OK, you already had the Mobo... which one was it, and have you liked it?

Ordering a cable....

Also, would you by any chance know if there is a bracket to add a 3rd GPU to a PC case... If I can't find one I'll fab it.

EDIT: Turns out, I guess they are common...


 