Liquid Cooling- What do I need to know? (and other advice needed)

OK, I’m going to be building a new gaming rig in the next month or so, and am considering doing a liquid cooling system. Umm…what do I need to know? Is installation any more difficult than a fan? Is there upkeep (do you ever have to replace the liquid?) Is it quieter than a fan-cooled system? Which systems are the best? Should I just forget it and go with an ultra-quiet fan (which one?)?

Thanks to Brian Rubin’s thread below, I think I’ve narrowed down my choices a bit.

Case: Not sure, depends on what all I have to fit.
Motherboard: ASUS A8N32-SLI Deluxe
CPU: Bouncing between AMD FX-60 and AMD’s X2 4800+
Memory: 2 Gigs of somethingorother
Graphics: either dual GeForce 7800 GT or a single 512meg 7800 GTX
Sound Card: some X-Fi card

Finally, the ASUS board: I understand it has 3 PCI slots, 2 PCIe slots, and 1 short PCIe slot. Are these positioned so that all the slots can actually be used? Specifically, if I go with the dual graphics cards and the sound card, will I have no room left for things like a video capture card (I’d like to do some video editing on this machine) or a TV tuner card (would also like to watch Passions…or…umm…yeah…)?

Thanks for all the advice that I just know will be flooding in…
-AM Urbanek

The guy at my local screwdriver shop was telling me that liquid cooling is just about the same noise levels as a (quieter) fan pc, because of the pumps and such.

I just recently built, used, and sold a watercooled PC using those components (X2 3800+ @ 2.6, 7800 GTX 512, same motherboard).

With that setup (assuming you water-cool both the video card and CPU), you would do well to get a double 120mm radiator. The GTX 512 and CPU combined to saturate the single-fan radiator pretty rapidly, forcing the single fan to spin faster than you’d want for quiet use. I switched to a double-fan setup (using Innovatek’s larger radiator option, though Swiftech and others offer them too) and that worked very well.

Do not get the Thermaltake Bigwater 2 water-cooling kit (especially the GPU block option). While the kit is of very high quality overall, the retention mechanisms are poorly designed, and you could easily damage your GPU core trying to get the block tight enough not to twist. (Their pump and drive-bay reservoir are both very good, though.)

Innovatek is probably still my favorite all-inclusive kit.

WC in general is a pain in the ass when filling and (especially) when changing / modifying / disassembling your cooling loop. Something like a Koolance Exos might be good to consider for ease of installation and acceptable performance.

The pump makes a different noise from the fan. Less disturbing, imho.

If you’re going to show off your watercooling innards, don’t buy Clearflex tubing. Buy Tygon tubing. Clearflex is half the price of Tygon, but it always turns cloudy and makes your loop look gross.

Don’t. Water cooling is expensive and dangerous. It’s also unnecessary. Water cooling is the raid0 of hardware modifications. Foolish.

Hey so,

seven years later, is liquid cooling still a terrible idea? Every single build I’ve seen online lately has been using it. Is there a good guide anywhere?

if you are building in a SFF case and want the build to run at minimal noise levels (not the Apple version, which is silent at idle and a blow dryer under load), watercooling is pretty much the only option.

I’ve heard things like that are really easy and effective and make it a much more viable option, but I haven’t tried it myself.

I do find myself considering overclocking lately. I never bothered in the past, but single-core performance has plateaued so much that taking a chip from 3.5 to 5 GHz or whatnot is actually a much larger leap than any new tick or tock from Intel. Or multiple ticks. If you had an original i7 overclocked from 2.6 to 4.5 GHz, like many people did, it’s STILL faster than anything Intel sells now for single-core work.

I’m assuming you refer to a high-powered beast of a gaming PC here? I haven’t looked into this recently, but the last time I helped a friend put together a small PC for HTPC use, it was small, air-cooled, and quiet. All we did to get it quiet was use a passively cooled video card, a decently quiet CPU fan, and a quiet PSU. I would think that’s still possible in a SFF PC if one only had moderate performance needs.

I personally think that’s a pointless endeavour, since an Android box can do pretty much anything an Atom/APU HTPC can do, but maybe the ease of use is a factor (as in, PCs have a bigger selection of media software that works out of the box). Still, you’d be paying a triple premium over anything Android for that.

What’s your purpose with liquid cooling? Is it about case noise or about overclocking?

A lot of the times I see it brought up, it’s being used in a superfluous way that’s the equivalent of ricing a PC, because people associate it with the high-powered machines they heard about a decade ago.

Essentially there is a new way to do liquid cooling: “closed loop” CPU coolers. They come pre-assembled; all you need to do is attach the block to the CPU, mount the radiator somewhere on the case, and then attach a fan or fans to the radiator. The tubing and the liquid inside are already there, and the end user doesn’t need to worry about them. They make a pretty enticing alternative to air coolers, though they aren’t a slam-dunk better. They cost more, for one, and the pump creates a new failure point. On the other hand, liquid does conduct heat better than air, and unlike high-end air coolers they put a lot less stress on one’s motherboard.

The only closed-loop kit I’d consider buying would be the Swiftech H220. They disclose detailed info on everything they include in the loop, unlike Corsair, Asetek, etc., who give you some cheaply built aluminium radiator thing with who knows what inside.

could be the case, but a lot of the times? You should hang around more mature people maybe?

Here’s a comparison between a few overclocked Core i7s and some Xeons. The Xeons lost out by a huge margin:

Do you know how many Core i7s can fit in the price of one Xeon? Well, one, if we count the 3960X, but then you’d have enough left over for RAM and a motherboard!

About the only test the Xeons won was a heavily multithreaded simulation benchmark, and only because two Xeons were used. At $1440 per CPU, I’d rather just set up a render farm of overclocked, watercooled Core i7s!

full thing here:

Well, in fairness, Xeons are not designed to compete with the i7. They are server products: often the same architecture, but they run cooler at lower voltages, typically have more cores (up to 10), more threads, more cache, support multi-socket configurations, and support ECC RAM, and much more of it. All things that are highly desirable for symmetric multiprocessing, virtualisation, and crunching large datasets in datacentres that need to be space-, power-, and cooling-efficient.

As to your render farm, did you miss the rendering tests in the same article demonstrating that the Xeon outperforms on 2-pass x264? That’s obviously down to the additional cores and threads.

Now put them in 2-way half-height blades in a suitable blade chassis and the result is vastly superior rendering power in a much smaller footprint with lower power consumption. Whatever up-front capital cost you may have saved will be burnt in power consumption in no time. Oh, and guess what: you’ll get support from the vendor if/when you have a failure, which you won’t get with your overclocked i7.

They are not apples for apples comparisons.

Edit - Whoops, I did just notice the render tests are with dual Xeon configs, but my point remains valid.

You mean the test where two Xeons did 95.89 fps while one Core i7 pushed 59.07 fps?
Let’s see:
95.89 ÷ 2 = 47.945
So a single Xeon theoretically gives me 47.945 fps while a Core i7 gives me 59.07 fps. Yeah, I’d rather have a render farm of Core i7s.
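The same back-of-envelope math can be laid out as a quick script. The fps numbers and the $1440 Xeon price come from the posts above; the ~$1000 figure for the overclocked i7 (roughly a 3960X) is my own assumption for illustration:

```python
# Per-CPU throughput and rough price/performance, using the fps numbers
# from the thread. The ~$1000 i7 price is an assumption, not thread data.
dual_xeon_fps = 95.89   # dual-Xeon result from the article
i7_fps = 59.07          # single overclocked Core i7 result

xeon_fps_each = dual_xeon_fps / 2      # naive per-socket split
xeon_price = 1440.0                    # per-CPU price quoted in the thread
i7_price = 1000.0                      # assumed

print(f"per-Xeon fps:       {xeon_fps_each:.3f}")
print(f"Xeon fps per $100:  {100 * xeon_fps_each / xeon_price:.2f}")
print(f"i7 fps per $100:    {100 * i7_fps / i7_price:.2f}")
```

On these numbers the i7 wins on fps per dollar even before you account for motherboard and RAM costs, which is the whole render-farm argument in one line.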

To elaborate: I’m not concerned with the efficiency of 10,000-node server clusters (even there, Google engineers have shown that cheap laptop-grade hardware in large quantities, with individual Li-ion battery units, makes for a much more powerful and reliable system than a few hundred Xeons backed by a million-dollar UPS unit). I’m only interested in the Xeon as a workstation machine.
As a workstation CPU it just doesn’t make sense if you are on a budget and fund your own equipment. If someone else is footing the bill, sure, get the expensive Mac Pro or Dell with the nice warranty. For those doing computational work on the side (remember, some people interested in computational power might not necessarily be programmers or animators, e.g. financial and stock-market analysts, radiologists, dentists, anyone crunching large data or reading 3D scans), an overclocked Core i7 is quite attractive, since it’s actually much more powerful than a single-Xeon machine.

And when I mention render farms, I actually mean two or three PCs networked together in my living room, not the kind of sophisticated operation you might be accustomed to in your day job ;) In that scenario power consumption is a non-issue.
(As in, 2 or 3 Core i7s are cheaper than one dual-Xeon workstation and much more powerful.)

Could not agree with you more! My point was only that i7 comparison to Xeon is not fair for a single user, desktop (or even these days workstation) style workload, it is not what they are designed or priced for.

When they hear someone talking about overclocking or watercooling, the usual excuse from old-fashioned greybeards is “if you want more power, get a Xeon” or “get a Mac Pro,” when real-world testing has shown otherwise.

Ah, I never hear that. Has to be a throwback from the days when server kit was legitimately more powerful than desktop stuff. It has not really been that way for some time. Enterprise kit advantages now lie in areas other than pure speeds and feeds.

Now, just for shits and giggles, let’s look at your i7 render farm vs Xeon from a TCO perspective. I threw this together quickly just for fun, to double-check that my train of thought was more or less accurate. Say you were a small business providing outsourced rendering services. This is based on a fully loaded HP c10000 blade chassis (16 x 2-way = 32 CPUs). Since the i7 renders 30% faster per CPU, there are 30% fewer of them (22 CPUs). By end of year one, costs reach parity. Beyond that, the Xeon represents significant operational savings. Note the caveats: there are, um, more than a few, but they seem reasonable. I am sure you’ll point out something obvious I have missed.
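The structure of that spreadsheet is simple enough to sketch in a few lines. Every number below is an illustrative placeholder of my own, not the figures from the actual spreadsheet, so where (or whether) the two curves cross depends entirely on the wattage, electricity rate, and per-CPU prices you plug in:

```python
# Toy TCO sketch of the i7-farm vs Xeon-blade comparison described above.
# All inputs are illustrative assumptions, not the poster's spreadsheet data.
def tco(capex_per_cpu, n_cpus, watts_per_cpu, years, usd_per_kwh=0.12):
    """Capital cost plus electricity for 24/7 operation over `years`."""
    hours = years * 365 * 24
    energy_kwh = n_cpus * watts_per_cpu / 1000 * hours
    return capex_per_cpu * n_cpus + energy_kwh * usd_per_kwh

for years in (1, 3):
    i7 = tco(capex_per_cpu=1500, n_cpus=22, watts_per_cpu=200, years=years)
    xeon = tco(capex_per_cpu=2400, n_cpus=32, watts_per_cpu=95, years=years)
    print(f"year {years}: i7 farm ${i7:,.0f} vs Xeon blades ${xeon:,.0f}")
```

A fuller version would also fold in cooling overhead (PUE), rack space, and support contracts, which is where the blade chassis claws back ground against the cheap farm.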

Feel free to discuss. I can send you the spreadsheet if you feel like a tinker.

Edit - No sooner had I posted that than I noticed an error in the power calc for the i7 (I originally calculated equal CPU quantities, so I forgot to modify the kW usage when I changed it to 22). Amended now.

Okay, I guess I will get a closed loop system for my Prodigy.