
Block-to-block heat transfer?


Valk

Member
Joined Oct 31, 2002
Hey guys. I've been out of the computing scene for a while now. I built my last system in 2007, I believe, and I've been running it with almost no maintenance since.

I'm looking to replace the C2D stuff this year with AMD FX-83xx hardware, which will also prompt a bump in my cooling system to handle the extra six cores, lol.

Right now I run a pretty typical inline system with a T-line:

- Swiftech Apogee 1U
- Laing D5 with six years of continuous use
- BIP3 with 3x Panaflo 120x38mm fans @ 7 V. Very quiet.

Cooling just the CPU alone, I run the C2D E6750 at stock clocks at ambient at all times, even at full load. With my old 7900 GT in that loop it would sometimes go a few degrees over ambient.
I haven't had a chance to test my Radeon 5850 in that loop, as I'd have to modify my AquaXtreme MP1 to fit. I'll probably never get rid of that block, just find other uses for it by modifying the hold-down. Same with my Apogee 1U.

Seems there hasn't been much development in liquid cooling block technology other than the mass-marketing of it, so who knows.

I do know for sure I'll be dumping a LOT more heat into the new system, so at the very least I'll be adding a second rad and running them in series. That does increase the duty on the main pump, though, and I'm considering the following to keep overall flow rates high and temps down.
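
(Quick aside on the pump duty: a rough back-of-the-envelope in Python for how a second rad in series eats into flow. The pump curve and resistance numbers below are pure guesses, not measurements of my parts.)

# Estimate the loop operating point with one vs. two radiators in series.
# All pump-curve and resistance numbers are assumed, not measured.

def pump_head(q_gpm, max_head_ft=10.0, max_flow_gpm=4.0):
    # Idealized linear pump curve: full head at zero flow, zero head at max flow.
    return max_head_ft * (1.0 - q_gpm / max_flow_gpm)

def loop_drop(q_gpm, k_total):
    # Loop restriction is roughly quadratic in flow: dP = k * Q^2 (feet of head).
    return k_total * q_gpm ** 2

def operating_point(k_total, lo=0.0, hi=4.0):
    # Bisect for the flow where pump head equals loop pressure drop.
    for _ in range(60):
        mid = (lo + hi) / 2
        if pump_head(mid) > loop_drop(mid, k_total):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

k_blocks, k_rad = 1.5, 0.8   # guessed resistance coefficients, ft/gpm^2
print(f"one rad:  {operating_point(k_blocks + k_rad):.2f} gpm")
print(f"two rads: {operating_point(k_blocks + 2 * k_rad):.2f} gpm")

With those made-up numbers the second rad only knocks the flow from about 1.6 to 1.4 gpm, so the extra duty is real but not crippling.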



Soooo, not sure if anyone has messed with this idea in the desktop scene. I'm going to build a new desk/workstation, as I FINALLY have an office room in my house. I don't go to LAN parties, and I rarely work on my computers unless stuff is broken. This liquid cooling has run six years almost non-stop with nothing more than a top-off on the T-line a few times a year.
Absolute reliability, Swiftech!

So the new system will be built into the desk in some way. I will probably design a test-bench-style mainboard tray in the desk's topmost drawer space, with a space below for all the water cooling components.
The trick: I want to be able to remove/service/install new hardware without upsetting the water cooling system or having to drain/rebuild it, as I've had to in the past with the inline series system most people run.

[diagram: distributed cooling layout]

Essentially I want to run dual loops: a system that simply circulates hot and cold water between two static reservoirs.
The components to be cooled will draw cool water from the cool res and deposit it in the hot res to be cooled. This means each device needs its own pump and loop setup, which is no big deal.
I was thinking of using an Apogee Drive II for the CPU, as that is half the work done right there. The other systems, the GPU and mainboard, could have their own loops with one pump between them, but I might adapt an Apogee Drive pump to work in this environment as well by making a new sandwich plate for the pump and the GPU block.

Simply plumbing this to work is no big deal; the trick is to make it serviceable without it being super awkward to add/remove parts, etc.

I'm thinking of using a block-to-block transfer for all the loops: one big block cooled by the rad system (which could then use just one res), with all the other systems tapping into that block with smaller blocks of their own. What I'm curious about is the efficiency of transferring heat between two water systems via copper blocks.
The most efficient is obviously to draw from the res itself, but if I could just unbolt the CPU loop block from the overall system block and pull it off without any leaks, isn't that great for future servicing?
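
For a feel of the numbers, here's a minimal Python sketch of the conduction penalty across a copper block-to-block joint. The geometry, load, and TIM figures are assumptions, and it ignores the water-to-copper convection on each side, which would add more:

# Thermal-resistance stack for a copper block-to-block joint.
# Geometry, load, and TIM numbers are guesses, not measurements.

Q = 300.0            # heat crossing the joint, watts (assumed combined load)
k_cu = 400.0         # thermal conductivity of copper, W/(m*K)
area = 0.05 * 0.05   # 50 mm x 50 mm mating face, m^2
t_plate = 0.005      # each copper plate 5 mm thick, m
r_tim = 0.25e-4      # TIM joint resistance, ~0.25 K*cm^2/W in K*m^2/W

r_copper = 2 * t_plate / (k_cu * area)   # two plates in series, K/W
r_joint = r_tim / area                   # paste layer between them, K/W
print(f"extra dT across the joint: {Q * (r_copper + r_joint):.1f} K")

That comes out to roughly 6 K of extra temperature rise before the convection losses even count, which is why pulling straight from the res will always beat a solid copper hand-off.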

Just ideas being thrown around at this point. I might even investigate the possibility of using Peltiers to transfer heat between two pieces of copper, allowing the heat to pass from the component block to the main system more efficiently. Though Pelts aren't that efficient in terms of energy consumption; they also induce almost double the heat load, as they only move about as much heat wattage as the electrical wattage you put in, and all of it comes out the hot side.
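
The doubling claim is just the TEC energy balance; a quick Python check, with the load and COP assumed:

# Peltier energy balance: the hot side rejects the pumped heat PLUS the
# electrical input. Load and COP are assumptions, not measured values.

q_cold = 200.0         # heat pumped off the component side, watts (assumed)
cop = 1.0              # coefficient of performance, ~1 at a modest dT
p_in = q_cold / cop    # electrical power the TEC draws
q_hot = q_cold + p_in  # heat the rad side must now dump
print(f"rads see {q_hot:.0f} W for a {q_cold:.0f} W component load")

So at a COP around 1, the rads end up dumping 400 W for a 200 W load.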


OBVIOUSLY, just sticking to a simple series system with a monster pump is the easiest, cleanest way to do it. But I'm thinking back to all the many hours of building and bleeding systems in the past. It's no fun with a T-line system.
 
How about instead of an overly complex system, you just run quick disconnects so that you can easily remove and replace stuff?

What you have should work, I'd guess, but it's a lot of extra parts, cost, points of failure, etc. I think you can achieve the same effect just using quick disconnects.
 
The only problem with that is everything still needs to run in series, and anywhere there's a disconnect will develop air bubbles when you hook it back up. Though running a common res with disconnects to each separate piece would work... I'll look into them.
The only ones I've seen are the Koolance ones, though, and weren't those like 1/4"? I don't want the compression fittings either; those leak hardcore.
 
There are a couple of Koolance ones; see if you can find skinnee's review of them. You don't really get air bubbles, and if you're running a res anyway, your bubbles will clear there pretty much instantly. You'd have to top off the res every time you change something, but if you get a big res (most desk builds have huge reses for looks anyway), even that won't be much of an issue.

Also, I'm not sure who told you that compression fittings are bad, but they're just fine...you seem to be misinformed on that point.
 
They may be OK, but I wouldn't trust them. Clamp and barb for me all the way. Though for this build I might see if I can get T-bolt clamps in these small sizes.

I will look into the disconnects, though.

The project probably won't be as complex in the end. I'll probably plumb it all as usual but leave much more slack in the lines than I have in the past.
 
Not liking/trusting them is one thing, stating that they leak badly is an entirely different issue. I just wanted to make sure you had good info.
 
I'm a fan of multi-pump builds. I like the redundancy and failover protection. However, what you've laid out adds complexity and multiple failure points without increasing cooling performance or failure protection.

You don't need three pumps for that much loop. You really just don't. Once you get over 1.5 gpm the performance curve starts getting REALLY flat. I would advise doing it all serial with a double pump setup.
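
The flattening is easy to see from the bulk-water side alone; here's a quick Python sketch with an assumed heat load (block convection matters too, but it trends the same way):

# Coolant temperature rise across the loop: dT = Q / (mdot * cp).
# Heat load is assumed; water properties are standard.

Q = 300.0     # total heat load, watts (assumed)
cp = 4186.0   # specific heat of water, J/(kg*K)
rho = 998.0   # density of water, kg/m^3

for gpm in (0.5, 1.0, 1.5, 2.0, 3.0):
    mdot = gpm * 3.785e-3 / 60 * rho   # gpm -> kg/s
    print(f"{gpm:.1f} gpm -> coolant dT {Q / (mdot * cp):.2f} K")

Going from 0.5 to 1.5 gpm cuts the coolant dT by about 1.5 K; going from 1.5 to 3.0 gpm only buys you another 0.4 K.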

As for the disconnects, they don't all use compression fittings. The most recent Koolance units are available with male (screws into the block directly), female (screw your barb of choice into it), OR compression fittings (the traditional QDC configuration).

http://koolance.com/qd3-ffg4-quick-disconnect-no-spill-coupling-female-female-threaded-g-1-4
 
I'm actually really liking those fittings. I wonder how restrictive they are.
When I get home I'll maybe sketch something else up.
I'd really like to have the GPU loop separated from the CPU/chipset. With those couplings I could probably achieve that without much of the complexity.
 
Awesome, thank you!
This thread was just to toss ideas around; not trying to ruffle feathers. I'm sure the compression fittings do work well, I just wasn't impressed with the ones I tried from Swiftech. I really like mechanical fasteners wherever possible.

This build might end up as just a bigger version of what I already have. Tonight I'm gonna measure for the desk. Basically this will be a cabinet between two desk halves. It will have a glass top giving a view of the mainboard compartment. All the liquid cooling will be hidden in a front bezel with the drives and power supplies.

This will all be accessible and pull out on full-extension drawer slides. The disconnects would make for a cleaner transition from the mainboard case to the cooling case. When I finish the SolidWorks sketch I'll put it up and probably start a project thread.
 
Why not have 2 completely separate loops, 1 for the CPU and a separate one for the GPUs?
 
This. If you're going to have separate loops, having a shared reservoir and radiators defeats the purpose. Either do two completely separate loops, or do one big serial loop. Combining the two eliminates the advantages of either while keeping the disadvantages of both.
 
Alright, you guys are right; that is kind of a silly idea, lol. But looking at that QDC site shows the DDC being a stronger pump than a D5? Or was that a low-speed D5? I run my adjustable D5 at full speed all the time.
I might have to try to crack the case open and resolder a lead to it, though, as the wires seem to be fraying from moving it around a lot. They don't use very good quality wire in computers, it seems; my RC wires are much, much better.

OK, so this is simpler.

[diagram: cooling loop, 2014 revision]

Some of you may question the need for the QDCs in a simple loop like this. They will let me remove the liquid loop for the CPU/VRM/chipset etc., or the GPU, if one of them acts up, without taking the whole loop apart. Then I can insert a dummy tube so the rest of the setup keeps working while the part is serviced. When my 7900 GT came out, I had the block for it just sitting in the case cooling nothing.

I can attach the QDCs to the case partition, making for a clean look too.
 
Stock D5 beats the stock DDC, but a topped DDC beats a topped D5 (and thus a stock D5).

In terms of removing components and keeping the loop connected, if you are consistent with the direction you put the QDCs in (relative to flow), you can just have the leftover QDC halves connect to each other (e.g., you could go from pump->QDC(m)->QDC(f)->CPU->QDC(m)->QDC(f)->GPU to pump->QDC(m)->QDC(f)->GPU by simply removing the CPU).
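
To make the bookkeeping concrete, here's a toy Python sketch (purely illustrative; the list model is mine, not anyone's product):

# Toy model of the QDC ordering rule: male halves always sit upstream.
# Pulling a component takes its inlet female and outlet male with it,
# leaving a loose male/female pair that mate directly -- no jumper needed.

loop = ["pump", "QDC(m)", "QDC(f)", "CPU", "QDC(m)", "QDC(f)", "GPU"]

def remove(loop, component):
    i = loop.index(component)
    return loop[:i - 1] + loop[i + 2:]   # drop inlet female, part, outlet male

print(remove(loop, "CPU"))   # ['pump', 'QDC(m)', 'QDC(f)', 'GPU']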

Also, that method of thinking would eliminate the need for the "jumper" in your diagram.

On a side note, cooling the chipset/VRMs is pretty unnecessary, but if you wanna do it, go for it.
 
The chipset and VRM are in there for illustration. I probably wouldn't bother with a VRM block, but an overheating chipset is what killed my C2D. The only fan in the system blowing air around that was NOT on the rad failed, and the chipset nearly melted, lol.
It survived, but it has never been the same since; it became EXTREMELY temperamental.

I could definitely take out QDCs to make it simpler. The idea here, though, is to have the male ones fixed to the case so the female ones butt up against the case side when connected. Then all four QDCs will sit parallel with each other, which I think looks nice.


I'll have to look into these "topped" water pumps. I guess you just buy a DDC and a custom top for it? A non-right-angle inlet top?
 
It was a different story back in the C2D days, but CPU voltages and currents are so much lower now that the power stages feeding them have to work way less, and they're no longer a weak point... the smaller transistors inside the processors, however, are a different story.

As long as you know it's not necessary, that's fine. Have you thought about how you're going to attach the QDCs?

In terms of pumps and pump tops, your best option is probably the MCP35X, which is a PWM DDC that comes with a top already.
 
The XSPC res top looks nice. Two birds with one stone.
They make one for the D5, but I'm not sure how the barbs work.
All the fittings are on the pump body, so it fills from the bottom?

I'll probably get the DDC variant. Time for a new pump for sure.

After a think, I think you are right about not needing the jumper. I'll just have to label the lines so I don't forget which one goes where in the desk.

Had a look around at desk projects. Mine probably won't be the full-top kind of thing. I'll probably just have a cabinet between two desk halves with the machine in it, tucked neatly into a corner even, with a glass top for board viewing.
I had originally wanted a vertical tower pullout, but then you won't really see the GPU or anything.

I'll have to see what's practical with the space. I've got quite a lot of glass I can use.

This will be a professional cabinetry job. It's what I do =) Probably a stained maple A1 veneer finish.
 
The tops replace the stock barbs with holes threaded for G1/4 fittings.

Make sure you post pics :D
 
I and many others here run 8350s and run them hard; none of us uses a motherboard block, we just cool the VRMs with air.
I have been looking at a Heatkiller board block for mine, and so far I still see no reason to go with a motherboard block.
 
I'll keep that in mind. A mainboard block would be mostly for show, I guess. This desk won't be the traditional hollow-top desk you guys have seen. Honestly, I don't want to show off that much stuff, just the mainboard compartment and whatever is going on in there. All the PSU/liquid cooling and HDD sections will be discreetly tucked away under a covered portion of the cabinet, and you will only see the mainboard and GPU. I probably won't even light it up; just a discreetly visible computer board, mostly.

But all this will only take up the top drawer of a cabinet. I had planned to make a vertical tower pullout, but I'll have to see. I have some ideas for things I can do, for sure, and I want to do something different.
 