The generation of heat is a critical factor in a data centre. When temperatures grow too high, equipment can overheat and fail, costing thousands of dollars for every minute of downtime.
That puts a lot of pressure on the efficiency and reliability of the cooling system. An estimated 40% of a centre's energy goes to cooling, as much as to the processing of data itself, while data centres account for around 1% of global electricity consumption.
Compressors, circulation pumps and condenser fans are generally the key pieces of equipment in the cooling process. However, data centres employ several types of system, and the technology is evolving rapidly as demand grows and the pressure to maximise energy efficiency rises with it.
High-pressure mist systems are the most established. High-pressure nozzles mist incoming air; the mist then evaporates, dropping the temperature by up to 30°F.
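How large a drop is achievable depends on how dry the incoming air is. A standard model for direct evaporative cooling says the outlet temperature approaches the air's wet-bulb temperature, scaled by the cooler's effectiveness. The sketch below uses illustrative desert-air figures, not measurements from any specific facility:

```python
def evap_outlet_temp_f(dry_bulb_f, wet_bulb_f, effectiveness=0.9):
    """Direct evaporative cooling: the outlet temperature approaches the
    wet-bulb temperature, scaled by the cooler's effectiveness (0-1)."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Illustrative hot, dry desert air: 105F dry bulb, 70F wet bulb
print(evap_outlet_temp_f(105.0, 70.0))  # 73.5 -> a drop of 31.5F
```

The drier the air (the larger the gap between dry-bulb and wet-bulb temperature), the more cooling evaporation can deliver, which is why this approach suits arid regions.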
For example, Facebook used an evaporative system when it built its 147,000 sq ft data centre in Prineville, Oregon for the Open Compute Project. The facility was designed to operate at 80.5°F – often below the outside air temperature of the desert region.
A MeeFog system consisting of 56 7.5hp positive displacement fog pump units with variable frequency drives was installed. Two pumps serve each of the data centre’s 28 air handling units, one on active duty, the other on standby, with automatic switchover. These pumps direct water to an array of impaction-pin nozzles where it is converted into a fine fog designed to rapidly evaporate, bringing the air down to the desired temperature and providing the required level of humidity.
Tests showed that for every 100 watts going to the computing equipment, only six watts were consumed by cooling, lighting and power distribution.
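That ratio corresponds to a Power Usage Effectiveness (PUE) of about 1.06, where PUE is the industry's standard efficiency metric: total facility power divided by the power delivered to IT equipment. A quick sketch of the calculation:

```python
def pue(it_power_w, overhead_power_w):
    """Power Usage Effectiveness: total facility power / IT power.
    An ideal facility has a PUE of 1.0 (no cooling/lighting overhead)."""
    return (it_power_w + overhead_power_w) / it_power_w

# The figures quoted above: 100W to computing, 6W to everything else
print(pue(100, 6))  # 1.06
```

The lower the PUE, the smaller the share of electricity spent on anything other than computation.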
Google also uses an advanced evaporative system at its Belgian data centre. Sustainability is key for the company too, with the system drawing grey water from a nearby industrial canal.
Immersion cooling is a newer technology that submerges hardware in special dielectric liquids, such as mineral oil, which conduct heat but not electricity. A cubic foot of such liquid can cool as effectively as 1,200 cubic feet of air while requiring significantly less energy input.
Infrastructure management provider 4D Data Centres is using immersion cooling technology known as a 'pod' with its client, cloud services specialist PeaSoup, which is colocated at 4D's data centre near London's Gatwick airport. The deployment will enable PeaSoup to provide a high-performance computing cloud service, the Eco-Cloud.
For the installation, the 'pod' uses a biodegradable dielectric fluid and heat exchangers to cool the IT equipment. The fluid is kept cool by intercoolers and an internal water-fed heat exchanger, which extracts heat from the fluid and transfers it into chilled water; that water is then pumped away and cooled down again in adiabatic cooling towers.
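The chilled-water side of a loop like this can be sized with the standard sensible-heat relation Q = ṁ·cp·ΔT. The figures below are illustrative round numbers, not 4D's actual design parameters:

```python
def water_flow_l_per_s(heat_load_kw, delta_t_c, cp_kj_per_kg_c=4.186):
    """Flow of water (in L/s, since 1 L of water is ~1 kg) needed to carry
    a given heat load at a given supply/return temperature difference,
    from Q = m * cp * dT."""
    return heat_load_kw / (cp_kj_per_kg_c * delta_t_c)

# Illustrative: move 100 kW of server heat with a 6C temperature rise
print(water_flow_l_per_s(100, 6))  # roughly 3.98 L/s
```

A wider temperature difference between supply and return water reduces the required flow, and with it the pumping energy.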
Jack Bedell-Pearce, CEO at 4D Data Centres, said: “Aside from reducing risks of overheating, immersion cooling’s efficiency means it is able to cool high-density computer systems without increased power consumption. It also represents a greener alternative to other cooling methods which will use more power to cool the same amount of processing capability.”
Microsoft is also using immersion technology at its data centre in Quincy, Washington. A steel holding tank packed with computer servers also contains a liquid designed to boil at 122°F, which is 90°F lower than the boiling point of water.
The boiling, generated by the work the servers are doing, carries heat away from the computer processors enabling them to operate continuously at full power without risk of failure due to overheating.
Inside the tank, the vapor rising from the boiling fluid contacts a cooled condenser in the tank lid, which causes the vapor to change to liquid and rain back onto the immersed servers, creating a closed loop cooling system.
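In such a two-phase system, heat leaves the servers as the latent heat of vaporization of the fluid, so the heat load fixes the rate at which liquid boils off and is condensed back. A rough sketch, assuming an illustrative latent heat of 100 kJ/kg (a round number, not a published figure for Microsoft's fluid):

```python
def boiloff_rate_kg_per_s(heat_load_w, latent_heat_j_per_kg):
    """In two-phase immersion cooling, heat is removed as latent heat of
    vaporization; the condenser returns the same mass flow as liquid,
    closing the loop."""
    return heat_load_w / latent_heat_j_per_kg

# Illustrative: a 1 kW server and an assumed latent heat of 100 kJ/kg
print(boiloff_rate_kg_per_s(1000, 100_000))  # 0.01 kg/s boiled and condensed
```

Because boiling holds the fluid at its saturation temperature, the chips sit at a stable, predictable temperature regardless of short-term load swings.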
Microsoft has seen demand for faster computer processors continue to accelerate with the advent of high-performance applications such as artificial intelligence. Consequently, the computing industry has turned to chip architectures that can handle more electric power. That power generates more heat, ramping up the need for advanced cooling.
Christian Belady, engineer and vice president of Microsoft’s data centre advanced development group, said: “Air cooling is not enough. That’s what’s driving us to immersion cooling, where we can directly boil off the surfaces of the chip.”
Microsoft found two-phase immersion cooling reduced power consumption for any given server by up to 15%. The system also enables increased flexibility for the efficient management of cloud resources. For example, software that manages cloud resources can allocate sudden spikes in data centre computing demand to the servers in the liquid cooled tanks.
“For instance, we know that with Teams when you get to 1 o’clock or 2 o’clock, there is a huge spike because people are joining meetings at the same time. Immersion cooling gives us more flexibility to deal with these burst-y workloads,” explained Marcus Fontoura, a technical fellow and corporate vice president at Microsoft.
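One way to picture that flexibility is a placement policy that fills ordinary air-cooled capacity first and spills burst demand into the liquid-cooled tanks, which tolerate sustained full-power operation. This is a hypothetical toy policy for illustration, not Microsoft's actual scheduler:

```python
def place_workload(load_units, air_capacity, tank_capacity):
    """Toy placement policy (hypothetical): route baseline load to
    air-cooled servers, spill burst demand into liquid-cooled tanks,
    and queue anything beyond total capacity."""
    to_air = min(load_units, air_capacity)
    to_tank = min(load_units - to_air, tank_capacity)
    overflow = load_units - to_air - to_tank
    return {"air": to_air, "tank": to_tank, "queued": overflow}

# A lunchtime spike of 130 units against 100 air-cooled and 50 tank units
print(place_workload(130, 100, 50))
# {'air': 100, 'tank': 30, 'queued': 0}
```

The tanks effectively act as headroom: they absorb the spike without forcing the rest of the facility's cooling to be sized for the worst case.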
Microsoft is also developing Project Natick, which is exploring the potential of underwater data centres that are quick to deploy and can operate for years on the seabed, sealed inside submarine-like tubes without any onsite maintenance by people.
Instead of an engineered fluid, the underwater data centre is filled with dry nitrogen. The servers are cooled with fans and a heat-exchange plumbing system that pumps seawater through the sealed tube.
A key finding from Project Natick is that the servers on the seafloor experienced one-eighth the failure rate of replica servers in a land data centre. Preliminary analysis indicates that the absence of humidity and of oxygen's corrosive effects was primarily responsible for the superior performance of the servers underwater.