November 23, 2023
We explore the growth of data and what can be done to reduce the environmental impact of storing it in data centres. One possible solution is liquid cooling, which has already proved successful at improving the sustainability of smaller data centres.
As numbers are floated by me that I no longer recognise, 10 to the power of another two-digit number, I wonder who names them and whether there is a naming convention we weren't told about at school. So far, we are at yottabytes, as if terabytes weren't enough!
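It turns out there is a convention: the SI prefixes, each a factor of 1,000 larger than the last. A quick sketch of the ladder from kilobytes up to yottabytes:

```python
# SI prefixes used for data sizes, each 1,000x (10^3) the previous one
# under the decimal convention.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]

for i, name in enumerate(prefixes, start=1):
    print(f"1 {name}byte = 10^{3 * i} bytes")
```

So a yottabyte is 10^24 bytes, and there are already prefixes beyond it waiting in the wings.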
At the moment we can't attribute many carbon emissions specifically to the growth of data, but as data centre numbers increase, the energy needed to power them increases too. That power most frequently comes directly from the grid, and while its impact varies from year to year, the majority source is currently fossil fuels (40% in 2022). The impact of data centres on the grid has become more apparent: in 2022, several housing schemes had to be shelved because the grid could not support further housing, a direct result of the number of data centres nearby.
Why not build data centres in cold places if they need continual cooling? Well, they also need to be connected to a large power supply and able to transfer data in and out of their servers to act as the cloud. Locating in the Highlands of Scotland is therefore out of the question; practically speaking, motorway networks near high data-consumption industries (e.g. fintech) are the most likely build locations.
The power consumption of a single data centre is equivalent to thousands of homes, but without question they are a necessity in everyday life. The growing number of devices, each with more processing power and data storage, plus the migration from 4G to 5G, means that the internet would not function without them. But what is the solution for reducing their impact on the environment?
Typically, a data centre consists of rows of mainframe-style computers or racks of servers in a darkened warehouse. Each of these servers or mainframes generates heat, and the more data they process, the hotter the chips get. Data centres are frequently air-cooled by extracting the hot air and pumping it outside, replacing it with cooler air from air-conditioning units, which generally use HFCs. In addition, data centres have to have back-up power supplies (UPS) and further back-up power such as generators, often run on diesel.
A data centre might have a 25-30MW power capacity and consume around 30GWh of energy per year (and we all know that 1.21GW is a bolt of lightning…or enough to power the flux capacitor in 1985). Getting back to reality, 30GWh can power around 10,000 homes for a year. It's not an exact science of course, the amount of energy consumed depends on many variables, but you get the picture.
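As a back-of-the-envelope check (assuming an average household uses roughly 3MWh of electricity a year, a commonly quoted ballpark figure):

```python
# Rough comparison of one data centre's annual energy use with household consumption.
data_centre_gwh_per_year = 30   # annual consumption quoted above
home_mwh_per_year = 3           # assumed average household electricity use

# Convert GWh to MWh (x1,000) and divide by per-home usage.
homes_powered = (data_centre_gwh_per_year * 1_000) / home_mwh_per_year
print(f"~{homes_powered:,.0f} homes")  # ~10,000 homes
```

Change the assumed household figure and the answer shifts accordingly, but the order of magnitude, tens of thousands of homes at most, holds.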
Unless controls or regulations are implemented, this growth in energy consumption will continue. You will not be surprised to learn that there are solutions out there. Heat can be captured and re-used, diverted into projects such as greenhouses, leisure centres and housing schemes. This means further investment, and while there is little impetus to spend more than is necessary on a data centre, such projects can reap rewards.
Liquid cooling of microchips has existed for many years on a smaller scale. Often co-located in air-cooled data centres, servers are stripped back and placed in a dielectric fluid, within a sealed unit where they are kept at a constant temperature. The fluid can be cycled through a heat exchanger and used for external projects, or simply recycled back into the units.
The benefits of liquid cooling are well documented, yet the technology is still being developed. Less energy is consumed, hardware longevity is improved, fewer faults occur, and the cooling system works well with heat exchangers, in some situations generating enough heat to maintain the constant temperature of a swimming pool, saving you, the end user, money in the long run. Testing is still underway on the different types of hardware that can be submerged, but there are clearly many future possibilities. As a sustainable alternative to air-cooled data centres, liquid cooling is a huge step in the right direction for the environment.