
After a many-year hiatus, folding_monkeys is back!

Heck yeah! Might wanna grab some more cards though; my server has a current-gen video card in it for the first time as of today, thanks to BubbaTheHut. I never upgraded the video card when I built this since I don't really game anymore, so I was running one of my old GTX 970s until today (didn't even put the second one in, since I need two x8 slots for my RAID cards lol). Points are starting to roll in 😁
View attachment 366633

Is that 3 x 80mm fans? Did you have to buy a DeLorean to travel back to 2002 and pick them up?
 
Anyone know how much PCIe bandwidth affects PPD on new WUs and fast cards, by any chance? Most of the data I can find is old, and it's all over the place. KeeperOfTheButch and I are doing some tests currently, but it would be nice to have more data.
 
Anyone know how much PCIe bandwidth affects PPD on new WUs and fast cards, by any chance? Most of the data I can find is old, and it's all over the place. KeeperOfTheButch and I are doing some tests currently, but it would be nice to have more data.
Well, you don't want to run a PCIe 3.0/4.0 card on a PCIe 2.0 motherboard like I did with one of my 4090s :rofl:
 
Anyone know how much PCIe bandwidth affects PPD on new WUs and fast cards, by any chance? Most of the data I can find is old, and it's all over the place. KeeperOfTheButch and I are doing some tests currently, but it would be nice to have more data.
I'd stay away from a 2.0 system. PCIe 3.0 x16 or better for anything modern, so you don't put a significant glass ceiling on things.


but an RTX 4070 on Thunderbolt 3/eGPU makes about 30% less than on PCIe 4.0 x16,
What would that bandwidth translate to in PCIe terms? I wonder if the USB/TB4 protocol has additional overhead on top of the lower bandwidth.
 
Guessing that's why I'm seeing somewhat low PPD on my 4070 Super, since it's stuck at PCIe 3.0 x8 until I can move the RAID card that's in the other x16 slot. Still doing around 11M PPD on certain projects and a bit over 7M on others, so not sure if that's normal (KeeperOfTheButch is doing more on his, but his is an MSI Gaming X Slim and mine is an ASUS Dual, so maybe card differences?).
 
What would that bandwidth translate to in PCIe terms? I wonder if the USB/TB4 protocol has additional overhead on top of the lower bandwidth.

In theory, TB3 and TB4 have 40Gbps max. TB4 has some additional functionality. In reality, it's more like 3.2-3.8GB/s max, so somewhere around PCIe 3.0 x4 or 4.0 x2 bandwidth.
HWiNFO64 shows 2.5 GT/s. GPU-Z shows PCIe 3.0 x4 on the TB3 connection.
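A rough back-of-the-envelope check of those comparisons (a sketch using the published per-lane PCIe link rates; the ~3.5 GB/s Thunderbolt figure is just the midpoint of the real-world range quoted above, not a spec number):

```python
# Rough PCIe vs Thunderbolt 3 bandwidth comparison.
# Link rates are the published per-lane signaling rates; real throughput
# is always somewhat lower due to protocol overhead.

def pcie_gb_per_s(gt_per_s: float, lanes: int, encoding: float) -> float:
    """Effective one-way bandwidth in GB/s for a PCIe link."""
    return gt_per_s * encoding * lanes / 8  # 8 bits per byte

ENC_128B130B = 128 / 130  # PCIe 3.0/4.0 line-encoding efficiency

pcie3_x4  = pcie_gb_per_s(8,  4,  ENC_128B130B)   # ~3.94 GB/s
pcie4_x2  = pcie_gb_per_s(16, 2,  ENC_128B130B)   # ~3.94 GB/s
pcie3_x16 = pcie_gb_per_s(8,  16, ENC_128B130B)   # ~15.75 GB/s

tb3_realworld = 3.5  # GB/s, midpoint of the ~3.2-3.8 GB/s range above

print(f"PCIe 3.0 x4 : {pcie3_x4:.2f} GB/s")
print(f"PCIe 4.0 x2 : {pcie4_x2:.2f} GB/s")
print(f"PCIe 3.0 x16: {pcie3_x16:.2f} GB/s")
print(f"TB3 (real)  : {tb3_realworld:.2f} GB/s")
```

So TB3's usable bandwidth really does land right around PCIe 3.0 x4 / 4.0 x2, roughly a quarter of a full 3.0 x16 slot.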

Guessing that's why I'm seeing somewhat low PPD on my 4070 Super, since it's stuck at PCIe 3.0 x8 until I can move the RAID card that's in the other x16 slot. Still doing around 11M PPD on certain projects and a bit over 7M on others, so not sure if that's normal (KeeperOfTheButch is doing more on his, but his is an MSI Gaming X Slim and mine is an ASUS Dual, so maybe card differences?).

4070 Super gets about 1-2M PPD more than 4070, so your numbers are about as high as expected. During the last competition, I had 8-10M PPD on my 4070 overclocked a bit. The average that I see is closer to 8M PPD.
 
In theory, TB3 and TB4 have 40Gbps max. TB4 has some additional functionality. In reality, it's more like 3.2-3.8GB/s max, so somewhere around PCIe 3.0 x4 or 4.0 x2 bandwidth.



4070 Super gets about 1-2M PPD more than 4070, so your numbers are about as high as expected. During the last competition, I had 8-10M PPD on my 4070 overclocked a bit. The average that I see is closer to 8M PPD.
Good to know, thanks!
 
A little update. On some projects, I see 3.5-4M PPD on my RTX 4070/TB3 connection. It's weird that I haven't seen it before, but I also hadn't been running F@H over a TB connection for long. I assume the same projects also have lower PPD in a regular PCIe slot.
 
A little update. On some projects, I see 3.5-4M PPD on my RTX 4070/TB3 connection. It's weird that I haven't seen it before, but I also hadn't been running F@H over a TB connection for long. I assume the same projects also have lower PPD in a regular PCIe slot.
Yeah I've been tracking a few projects but haven't had time to dig through the logs. Here's what I have, sorry for the shitty formatting, was on my phone lol


PPD         Project   WU points
 7,759,260  p19217      129,321
 7,360,000  p14951      143,573
10,910,000  p18220      568,237
11,225,561  p12266    1,000,491
10,026,956  p12297      550,090
10,339,944  p14950      208,235
 7,111,440  p14945      148,155
   622,000  p17645      166,973
   618,000  p17646      167,247
 6,004,860  p17645      169,136
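As a sanity check on numbers like these: PPD is just the per-WU credit scaled to a day, so you can back out the implied WU completion time (a sketch; it assumes each PPD figure was computed from the single WU's credit and runtime):

```python
# Back out the implied WU completion time from PPD and per-WU credit.
# PPD = credit * (seconds per day / seconds per WU), so:
#   seconds per WU = credit * 86400 / PPD

def wu_seconds(credit: float, ppd: float) -> float:
    """Implied wall-clock time for one WU, in seconds."""
    return credit * 86400 / ppd

# First row of the table above: p19217, 129,321 points at 7,759,260 PPD
secs = wu_seconds(129_321, 7_759_260)
print(f"~{secs / 60:.0f} minutes per WU")  # ~24 minutes per WU
```

The same arithmetic applied to any row tells you whether a low PPD comes from a slow WU or just a low-credit project.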
 
Yeah I've been tracking a few projects but haven't had time to dig through the logs. Here's what I have, sorry for the shitty formatting, was on my phone lol


PPD         Project   WU points
 7,759,260  p19217      129,321
 7,360,000  p14951      143,573
10,910,000  p18220      568,237
11,225,561  p12266    1,000,491
10,026,956  p12297      550,090
10,339,944  p14950      208,235
 7,111,440  p14945      148,155
   622,000  p17645      166,973
   618,000  p17646      167,247
 6,004,860  p17645      169,136
Those WUs would be great for me.
 
You guys are in the groove now, lots and lots of points. (y) :clap:
Thanks. I just flew down to KeeperOfTheButch's house last night. He's got another surprise we're going to try to get up and running, but it might be a few days if we can't get it working in his main rig. The motherboard doesn't recognize it when it's in one of the x16 slots (a 2060 works fine in that slot, though) 🤷‍♂️
 
Got it figured out, more PPD coming online shortly!

Keeperofthebutch is getting the 2060 back in, then folding shall commence again!

 