
Using Tygon silver tubing questions


DonquixoteIII

Registered
Joined
Jul 25, 2020
For my new build I intend to use water cooling, and for the tubing I intend to use Tygon's Silver Antimicrobial tubing, 1/2" x 3/4". I would rather not use barbed fittings if possible, but I understand that compression fittings and 'soft' tubing sometimes don't work well together. Bitspower premium fittings are advertised to work on some Tygon tubing, but I wonder if that is just a marketing thing...

Thanks for any information.

Don Quixote III
 
Sometimes the tubing companies don't hold the most precise OD tolerances, so compression fittings don't fit well. That said, I've used multiple tubing types with both Monsoon and Bitspower fittings with no problem. Getting the tube into the fitting has occasionally been a challenge, but it fit.

Not sure if you've had bad experiences with barbs, but a barb with a clamp is plenty secure, so don't count them out.
 
I would use PrimoChill Advanced LRT tubing. It has the fewest issues with erosion and deposits in the tubing and blocks. I prefer compression fittings as they look the best and are the more "foolproof" way of doing it. Hose clamps have a greater tendency to leak vs. compressions in my experience (2 leaks with barbs and clamps, and only one very small leak with a compression that was not tightened properly by mistake). Either way, you need a power supply jumper to leak test with. Always leave ALL power connections disconnected: wet components will dry, fried components will not un-fry. I suggest reading our watercooling sticky, as it has much valuable information.
 
Tygon silver 1/2in x 3/4in and compression fittings are no harder or easier to use than any other hose. Get some gloves or you'll pay hell tightening the fittings, though. It's also no better than non-antimicrobial tubing in keeping growth at bay; you'll still want to use some kind of biocide with it. I had issues with white crud in my tubing with the Tygon (silver and black), so I switched to LRT and that took care of the crud.
 

Worked in a laboratory setting for five years and NEVER had 'white crud' issues, no matter the fluid (or gas) running through the tubing. I am currently looking at algaecides, but my favorite (cupric sulfate) might have issues long term with brass. When I find a safe one, I will be using it.
 
Not sure about PrimoChill's expertise at the moment, given they don't mention brass in any of their 'safe with' comments. I have a query in to them asking about this. By the way, here is a good chart for various chemicals and how compatible they are with various metals.

https://www.industrialspec.com/resources/chemical-compatibility
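A chart like that is basically a big lookup table, and you can keep your own notes the same way. This is just a toy sketch; the ratings below come from this thread's own conclusions (cupric sulfate and H2O2 are out for brass, plain distilled water is fine), not from the linked chart, so always check the real chart before dosing a loop:

```python
# Toy chemical/metal compatibility lookup. Ratings are illustrative,
# drawn from this thread's conclusions rather than the linked chart.
COMPATIBILITY = {
    ("cupric sulfate", "brass"): "poor",
    ("hydrogen peroxide", "brass"): "poor",
    ("distilled water", "brass"): "good",
}

def rate(chemical: str, metal: str) -> str:
    """Return a rating, or a reminder to consult the real chart."""
    return COMPATIBILITY.get((chemical, metal), "unknown - check the chart")

print(rate("cupric sulfate", "brass"))  # poor
print(rate("bleach", "brass"))          # unknown - check the chart
```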

On a side note, how is your SSD RAID working out? I note that you are using the Sabrent PCIe gen 4 SSDs in a PCIe gen 3 board. Supposedly compatible, but thought I'd ask.
 
https://www.primochill.com/collecti...-90-degree-rotary-elbow-fitting-silver-nickel
Seeing as theirs, and practically all water cooling fittings for that matter, are made of brass, I would say that there is no issue in using brass.

The drives are working great. On this platform it required a physical key made by Intel to enable the RAID. Obviously I am not getting the full potential from the drives, being that they are on 3.0 instead of 4.0, but that is not of much concern. It is still plenty of speed, and the difference in cost was negligible.
 
Regarding PrimoChill - I just received a response from them in which they do not directly address the issue, so I have sent them a more pointed query. They would not be the first mfg. to sell people products that are not compatible... or safe. Ford and Firestone come to mind... PrimoChill advertises itself as a 'family run' business, but this does not mean that they don't have beancounters and liability lawyers giving them advice. Yeah, I am cynical when confronted with 'secret formulas'. I learned from a Romany friend.

As to the drives, that is good news to me, as I plan on using the same board in my new build. Personally, though, I plan on waiting for drives with faster controllers than the Phison controllers. I like to call them 'PCIe Gen 3.5' as they tend to leave 1/3 of the speed on the table. Specifically, the Samsung gen 4 980, if it ever shows itself, should be a great drive. However, at this point at least one of the 'big data' controllers has to trickle down to the consumer space sometime soon. Either that, or I have to go the SAS route - and that means bigger bucks than I would like to spend.
 

Every single fitting on the market is brass (save a few plastic or aluminum ones), and most are just nickel plated. Brass is simply copper and zinc alloyed. The vast majority of radiators for water cooling are copper, so there is no need to worry about it, really; you are not creating a situation where galvanic corrosion would be a concern. Copper, brass, and nickel sit close together in the galvanic series, so the potential difference between them is small. Where you run into issues is mixing metals that sit far apart in the series, such as aluminum and copper. It is similar to electroplating in many regards: when two dissimilar metals are in electrical contact through the coolant, the more active (anodic) one breaks down and deposits onto the other, and the bigger the gap in the series, the stronger the effect. There is more to it than just that (series of reactivity and such), but I will leave the chemistry lesson for you to search out if you are that interested. These products have been used for years in custom loops. I would think that someone would have spoken up if there were any "compatibility issues"... As for their additives, maybe it is a proprietary mixture and they don't care to share it with others. I have used it for years without issue, and that is all the proof that I require.
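The "gap in the series" idea can be made concrete with anodic-index numbers. The values below are rough textbook figures (they vary by alloy and source), but they show why copper rads plus brass fittings are fine while aluminum in a copper loop is the classic mistake:

```python
# Approximate anodic-index values in volts for common loop metals.
# Illustrative figures only; exact values depend on the alloy.
ANODIC_INDEX = {
    "gold": 0.00,
    "silver": 0.15,
    "nickel": 0.30,
    "copper": 0.35,
    "brass": 0.40,
    "aluminum": 0.95,
    "zinc": 1.25,
}

def galvanic_risk(metal_a: str, metal_b: str) -> str:
    """Classify galvanic-corrosion risk by anodic-index difference.
    Rule of thumb: under ~0.15 V is low risk, under ~0.25 V moderate,
    larger gaps are asking for trouble in a wet loop."""
    gap = abs(ANODIC_INDEX[metal_a] - ANODIC_INDEX[metal_b])
    if gap < 0.15:
        return "low"
    if gap < 0.25:
        return "moderate"
    return "high"

print(galvanic_risk("copper", "brass"))     # low
print(galvanic_risk("copper", "aluminum"))  # high
```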



I see the controller and PCIe 4.0 as a non-issue for the new drives. The theoretical max bandwidth of gen 4 is about 31.5 GB/s on an x16 slot, but controllers capable of taking advantage of that speed are likely several years away from production. CPUs are limited in the number of PCIe lanes, so you would have to find one with lanes to spare and sacrifice an x16 slot to the card, which means fewer lanes for GPUs and such. All that just to run roughly 7.9 GB/s (gen 4 x4) instead of 3.9 GB/s (gen 3 x4). I would wager that you would not notice much difference even at the full 4.0 x16 speed in anything close to realistic real-world use. So unless you are doing massive rendering, encoding, virtualization, or such, I would say it is a waste to have this kind of speed and the money could be better spent on other components.
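For anyone who wants to check those numbers, the math is just the raw transfer rate per lane (8 GT/s for gen 3, 16 GT/s for gen 4) times the 128b/130b encoding efficiency, times the lane count, divided by 8 bits per byte:

```python
# Theoretical one-way PCIe bandwidth from raw rate and lane count.
PCIE_GT_PER_LANE = {3: 8.0, 4: 16.0}  # GT/s per lane, per generation
ENCODING_EFFICIENCY = 128 / 130        # 128b/130b line code (gen 3+)

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Usable bandwidth in GB/s for a PCIe link of this gen and width."""
    return PCIE_GT_PER_LANE[gen] * ENCODING_EFFICIENCY * lanes / 8

print(round(pcie_bandwidth_gbs(4, 16), 1))  # 31.5 - the gen 4 x16 figure
print(round(pcie_bandwidth_gbs(4, 4), 1))   # 7.9  - a gen 4 x4 NVMe drive
print(round(pcie_bandwidth_gbs(3, 4), 1))   # 3.9  - same drive on gen 3
```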
 
It is not galvanic reactions that I am concerned over; it is more or less straight corrosion. Brass is very susceptible to corrosion, almost as much so as 'soft' iron. So cupric sulfate is out, as is H2O2, and those are the most common algaecides on the market. So, yes - I am concerned about what I put into what may turn out to be a 3 to 4k NZD PC, BEFORE adding water cooling bits. I would not be surprised to see my next PC hit upwards of 6k to 7k NZD before it is finished. I would be a fool NOT to be concerned about what goes into it.

Regarding PCIe gen 4, Intel would be wise at this point to start enabling more lanes of PCIe on their newer CPUs, and while they are at it, make them PCIe gen 4. Data centers thrive on fast storage, the faster the better, and AI thrives on fast GPUs as well - and both need more PCIe lanes. For this next build I will be using possibly 48 lanes if I decide to go with two GPUs, and 32 lanes if not, and those lanes are just for the GPU(s) and SSDs. Intel just doesn't make such boards in the consumer space, let alone with PCIe gen 4. And they wonder why they are losing the data center race...
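Those lane counts pencil out if the SSD array is four x4 NVMe drives (an assumption on my part; the thread doesn't say how many drives are in the RAID, but it matches the 48/32 figures):

```python
# Lane budget for the hypothetical build: x16 per GPU plus an
# assumed four-drive x4 NVMe array (4 * 4 = 16 lanes for storage).
GPU_LANES = 16
SSD_LANES = 4 * 4

def total_lanes(num_gpus: int) -> int:
    """PCIe lanes consumed by the GPUs and the SSD array alone."""
    return num_gpus * GPU_LANES + SSD_LANES

print(total_lanes(2))  # 48 lanes with two GPUs
print(total_lanes(1))  # 32 lanes with one
```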
 
PCIe 4.0 is coming to Intel Rocket Lake.

DCs and consumers are completely different beasts. Consumers don't need smoking-fast storage, and in most use cases wouldn't notice the difference between a SATA SSD and an NVMe-based one. Consumers don't do AI either... that is DC-type stuff, not typically a concern here at the consumer level.

EDIT: Part of the reason I think they are 'losing' the DC race is the high core count and power use of the Epyc CPUs in that space. I believe Intel still has a majority there, but its share is falling as time goes on and life-cycle management shows up to update servers. It's always a bit behind the curve.
 
Finally got an answer back from Primochill that is not written in marketing-speak. I can now proceed with some certainty.

As to Intel and the datacenter, they seem to have made the decision about five years or so ago that they had no competition (and they really didn't), so why ramp up the R&D? But now they have TWO major sources of competition, hitting them from two sides: AMD on the high power end, and ARM on the low power (in watts and prices) end. Sucks to be Intel right about now; I hope that they still have that excellent R&D group over in Israel. Perhaps they could also cherry-pick some of Samsung's and TSMC's foundry staff... or just use their foundries...
 

I suggest that you go and read this
https://en.wikipedia.org/wiki/Brass

Brass is very resistant to corrosion. That is why it is used in plumbing fixtures, engine bearings, gears, and by the navy for salt water systems. It is much more corrosion resistant than plain copper or "soft iron", as you say. I don't blame you for wanting to be cautious in your venture into watercooling a potentially expensive computer, but there is no need to reinvent the wheel is all I was saying. These are long-established and proven technologies. It is not like the olden days of aquarium pumps and heater cores from cars.

As far as Intel, I completely agree. They became too complacent in their position. They are a large enough company to weather their situation though. Hopefully they remove their head from their hind quarters and hit the ground running.
 