Archived from groups: alt.comp.hardware.overclocking,alt.comp.hardware.overclocking.amd
Water Wetter reduces surface tension (it's a surfactant) and has
anti-corrosion additives, but as I recall it isn't that great at stopping
green things growing in the cooling system. This could result in the
spread of bugs to the machine itself 🙂
More seriously:
o Water cooling on a Dell is pointless as indicated
o Water cooling has a use with several machines, re heat & noise relocation
Notice I say heat & noise relocation - it just moves it somewhere else.
o There are quiet air movement solutions for use within PCs
o A Dell has a reasonably good thermal system design from the factory
Some mid-range Dells use a cheaper JAC/JMC fan solution; better
higher-end Dells use quieter NMB fan solutions. The problem stems
partly from the fan quality, and partly from the fan specification chosen.
o The JAC/JMC solutions can be extremely high airflow
---- the temp v speed curve isn't adjusted to the fan they integrate
---- with the result that the machine in high ambient can be very noisy
---- a similar problem exists with some Apple G4s & other machines
o The NMB solution is actually of lower airflow & quieter bearings
---- the temp v speed curve is better suited to the fan they use
---- with the result that the machine in similar ambient is quieter
If you scream at Dell loud enough they should offer the NMB fan.
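The curve mismatch above can be sketched numerically - a rough illustration
in Python, where all temperature thresholds and RPM figures are made up for
the example and are not from any actual Dell firmware:

```python
def fan_rpm(temp_c, t_min, t_max, rpm_min, rpm_max):
    """Linear temp-to-speed ramp between t_min and t_max (hypothetical values)."""
    if temp_c <= t_min:
        return rpm_min
    if temp_c >= t_max:
        return rpm_max
    frac = (temp_c - t_min) / (t_max - t_min)
    return rpm_min + frac * (rpm_max - rpm_min)

# A curve ramped too early for a high-airflow fan is already near full
# (noisy) speed at a warm-but-ordinary case temperature; a curve better
# matched to a quieter, lower-airflow fan stays well down its range.
aggressive = fan_rpm(35, t_min=25, t_max=40, rpm_min=1800, rpm_max=4500)
matched    = fan_rpm(35, t_min=30, t_max=50, rpm_min=1200, rpm_max=2800)
print(aggressive, matched)  # -> 3600.0 1600.0
```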
Water cooling does have some valid applications:
o If you have several PCs in a room, the heat input is very considerable
---- that is quite irrespective of the noise a number may make
o For 10x 1U Dual-CPU PCs it isn't inconceivable to suffer 3.5kW+
---- in a 28-33°C summer, even before room solar-gain, that is ugly
o Relocating that heat (& to some extent noise) is an ideal solution
---- colleague required just that for a 16x 1U solution at >5kW heat
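The arithmetic behind those figures is simple - a sketch, with the ~350 W
per-node figure my assumption (it's roughly what a dual-CPU 1U of that era
dissipates, and is consistent with the 3.5kW+ figure above):

```python
# Rack heat load: nodes x watts each, expressed in kW of heat to dispose of.
def rack_heat_kw(nodes, watts_per_node):
    return nodes * watts_per_node / 1000.0

print(rack_heat_kw(10, 350))  # -> 3.5  (the 10x 1U case above)
print(rack_heat_kw(16, 350))  # -> 5.6  (close to the >5kW 16x 1U case)
```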
Water cooling for overclocking has benefits - it has displaced peltier
solutions inasmuch as it doesn't double the thermal load to remove.
You can use very large fans with modern heatsinks - eg, 120mm on a
CPU isn't impossible (and with Prescott it can be useful 🙂), so the
advantage of large fans on radiators is offset somewhat. That said, we still have
graphics card thermal output growth outpacing CPUs quite notably.
Present quiet graphics card solutions aren't that well thought out - the
space is limited, and many cards use low-profile fans which tend to
recirculate their own now heated air & so reduce cooling efficiency.
o Most CPU coolers can recirculate 40-70% of their own air
o Graphics card coolers can recirculate 60-80% of their own air
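A simplified mixing model (my assumption, not a measurement) shows why that
recirculation hurts: if a fraction r of the cooler's intake is its own exhaust,
and the exhaust runs dT warmer than the intake, the steady state
T = (1 - r)*T_case + r*(T + dT) solves to T = T_case + r*dT/(1 - r):

```python
# Effective intake temperature of a cooler recirculating a fraction of its
# own exhaust (simplified steady-state mixing model, illustrative figures).
def effective_intake(t_case, recirc_frac, exhaust_rise):
    return t_case + recirc_frac * exhaust_rise / (1.0 - recirc_frac)

# At 70% recirculation and an assumed 10°C exhaust rise, the heatsink
# effectively sees case-ambient plus ~23°C.
print(round(effective_intake(35.0, 0.7, 10.0), 1))  # -> 58.3
```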
So getting heated air out of the case becomes more important - the CPU
& Graphics card do not see room-ambient, they see case-ambient. Dells
tend to use minimal heatsinks and a rear-mounted CPU fan with duct. So
if the machine runs hot (and it will with a hot graphics card), that large
ducted fan will run somewhat fast - depending on model, very noisily.
Some of the Dells use a single fan of 57-63dB(A) at maximum operating temp,
so keeping those machines quiet requires careful choice of internal components,
or at least keeping the room ambient near the machine as low as possible.
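For context, sound levels sum logarithmically - the standard incoherent-source
formula is L = 10*log10(sum of 10^(Li/10)). The 57-63dB(A) figures are from
above; the two-fan scenario is hypothetical:

```python
import math

# Combined sound level of several incoherent sources, in dB(A).
def combined_db(levels):
    return 10.0 * math.log10(sum(10 ** (l / 10.0) for l in levels))

# Two equal 60dB(A) sources sum to only +3dB - which is why one loud
# ducted fan dominates a machine's noise, and why quieting the loudest
# single fan matters most.
print(round(combined_db([60.0, 60.0]), 1))  # -> 63.0
```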
Water cooling may eventually make a strong comeback for industrial use:
o Water cooling is unremarkable in high-end power supplies
---- its use is coping with thermal density, similar to Mainframe & hot CPUs
o ATX-blade servers offer a short-term thermal solution
---- blade-servers offer similar, but are proprietary locked-in solutions
o Several water-cooled solutions are proposed for colo & rack infrastructure
---- slowly water is creeping closer to the CPU it seems
---- initially - local chilled-water air-drop outlets near rack-inlets
-------- this copes with the 42x 1U with dual-Xeon/Opteron thermal density problem
-------- underfloor ducted air-velocity is otherwise too high to guarantee rack temps
---- later - direct chilled-water to heat-exchanger within each rack itself
-------- this is getting cool air directly to the load
-------- and minimising the number of connections or service hassle re getting feet wet
🙂
---- eventually - Intel & others propose various chilled water backplane solutions
-------- few water connections again, but a copper backplane "bus" of chilled connectors
-------- heatpipes on the CPUs pump heat to a copper connector on the rear of the case
-------- sliding the case into the rack also interfaces with the heat removal system
Cooligy have proposed solutions for "direct" chip water cooling, beyond current heatpipes.
Thus water cooling is about heat removal - it still has to be got rid of somewhere else.
For racks the problem is servicing - racks comprise lots of computers which comprise
multiple single points of failure, thus servicing is actually quite regularly required.
Around 25-45% of all servicing itself results in further downtime or servicing required
to correct errors - not that uncommon for someone to unplug the wrong machine or take
others down; human error. So adding water solutions into the mix is something many will
resist relatively strongly.
IBM proposed a "water cooled data storage cube" some time ago, which could equally be
applied to compute nodes frankly, since the density issue is beneficial although cost
remains an issue.
o The water cooling was more related to bulk heat disposal
---- using a chilled water plant which has high redundancy, often available at many IT sites
o The real objective of the project was about data-management
---- removing data-management from teams of DP staff into smarter software which self-managed
The trick with water cooling will be not getting feet wet.
o Water leakage in your bedroom PC is one thing
---- the epoxy will eventually not like being puddled in it, but PSU aside the voltages are ELV
---- PCs will happily keep running, as both military & industry prove regularly
o Water leakage on a *suspended floor* is a different matter since it "has to go somewhere"
---- Yes, contingency has long been worked out for that
---- However in multi-racks, multi-PCs you have risk-increase, revenue-density-increase
---- So an offlined rack due to a water problem can take out several other machines - and revenue
---- Particularly, you could have someone else's machine taking out your e-Commerce server
So from a Service Level Agreement perspective the water solutions are being tip-toed around.
In some form they will come:
o Yes, Prescott cooks - but future Workstation CPUs will be based around P-M architecture
---- that stops the CPU thermal arms race, but graphics cards will run unabated a bit longer
o Server side cooks - and is likely to continue to do so
---- server CPUs are necessarily switching to sudden eco-low-power systems
---- compute power can mean revenue in many applications, and databases have RAM/HD heat too
---- dual-core CPUs can off-load processing alternately to reduce the thermal density per unit time
For one thing, rack thermal outputs increasingly require suspended-floor void airflow
that is not easily solved (even with 48" underfloor voids) - or even practical in terms
of the air speed required. Many systems would require 60-80mph airspeed under voids and
still have inconsistent rack temps beyond the usual low/mid/top variation experienced.
So water cooling isn't going away.
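A back-of-envelope check of that air-speed claim, from the sensible-heat
equation v = P / (rho * cp * A * dT) - all the specific figures here (heat
load, free duct area, allowed temperature rise) are my illustrative
assumptions, not measurements from any particular site:

```python
RHO_AIR = 1.2    # air density, kg/m^3 (approx, at room conditions)
CP_AIR = 1005.0  # specific heat of air, J/(kg*K)

# Air velocity needed to carry power_w of heat through a duct of
# cross-section area_m2 with an allowed air temperature rise delta_t_k.
def duct_velocity_ms(power_w, area_m2, delta_t_k):
    return power_w / (RHO_AIR * CP_AIR * area_m2 * delta_t_k)

# ~150kW of racks fed through 0.5 m^2 of free void at a 10K rise already
# needs ~25 m/s (~56 mph) of underfloor air - in the 60-80mph ballpark.
print(round(duct_velocity_ms(150_000, 0.5, 10.0), 1))  # -> 24.9
```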
For consumers, some graphics card cooling innovation would be welcome.
BTX isn't a great solution here - but it's a step in the right direction, albeit somewhat
less than ideal for some of the very high thermal output graphics cards (if passive
solutions are proposed).
Hard drives are progressing from 3.5" to 2.5" which will help thermal output somewhat,
with 15.3k-rpm 3.5" drives actually using 2.5" platters inside anyway. Laptops will move
to 1.8".
So data-warehousing to the desktop PC will see other areas thermally improve over time.
Better modelling is already employed - modern Dells are well CFD/FEA thermally modelled
with Flotherm and other 3D modelling s/w to better manage heat density, noise & retain
reliability.
Desktop P-M boards could expand beyond the industrial application area, which for SOHO
PCs & home servers may prove a seller - Athlon Mobile offers a lot of low noise/heat
now tho.
--
Dorothy Bradbury
www.dorothybradbury.co.uk for quiet Panaflo & NMB fans