Why Overheating Data Centres Are a Big Deal for the Future of Computing and AI

Hot climates, hotter data centres

Modern computing relies on vast data centres to power everything from cloud gaming and AI to everyday web services. But there is a growing problem: many of these facilities are being built in places that are simply too hot for the hardware to run efficiently.

There are around 9,000 data centres currently in operation worldwide. According to recent analysis, the ideal temperature range for a data centre sits between 18 and 27 degrees Celsius, or roughly 64 to 81 degrees Fahrenheit. This is the sweet spot where servers and network equipment can run reliably without demanding extreme amounts of cooling.

The reality looks very different. Out of 8,808 tracked data centres, about 7,000 are located in regions where the climate often falls outside that optimal range. Even more concerning, around 600 of them operate in areas that regularly go beyond 27 degrees. That means thousands of server racks are constantly fighting against the ambient heat just to stay functional.
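
Taken together, those counts mean that roughly four out of five tracked facilities sit outside the recommended 18 to 27 degree band, with about 7 percent in the outright hot zone. The short sketch below simply re-derives those shares from the figures quoted above; the counts are the article's, the percentages are basic arithmetic.

```python
# Back-of-envelope check of the figures quoted above. The counts come from
# the article; the percentages are simple derived ratios.
tracked = 8808          # data centres in the analysed dataset
outside_range = 7000    # sites whose climate often leaves the 18-27 C band
above_27c = 600         # sites that regularly exceed 27 C

print(f"Outside the ideal band: {outside_range / tracked:.0%}")  # about 79%, roughly 4 in 5
print(f"Regularly above 27 C:   {above_27c / tracked:.0%}")      # about 7%
```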

Keeping all that hardware cool already costs a lot of energy. Data centres collectively consumed around 415 terawatt hours of electricity in 2024, roughly 1.5 percent of the entire planet’s electricity use. When those facilities sit in hot climates, their cooling systems have to work much harder, increasing this power demand even further and putting more stress on local electrical grids.
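
One common way to express that extra cooling burden is power usage effectiveness, or PUE: the ratio of a facility's total power draw to the power its IT equipment actually consumes. The sketch below compares a hypothetical facility in a mild climate with the same IT load in a hot one; both PUE values are illustrative assumptions, not figures from the article.

```python
# Illustrative comparison of cooling overhead using power usage effectiveness
# (PUE = total facility power / IT equipment power). Both PUE values are
# assumptions for this sketch, not figures from the article.
it_load_mw = 50.0     # hypothetical IT load of one facility, in megawatts

for climate, pue in [("temperate", 1.2), ("hot", 1.6)]:
    total_mw = it_load_mw * pue
    overhead_mw = total_mw - it_load_mw
    print(f"{climate:>9}: {total_mw:.0f} MW total, {overhead_mw:.0f} MW of it overhead")
```

With the same IT load, the assumed hot-climate facility burns three times as much power on overhead, most of it cooling.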

This is not just a comfort issue. Overheating can shorten component lifespan, force thermal throttling, and cause outright outages. For anyone relying on cloud gaming, AI workloads, or online applications, that translates into higher latency, instability, and increased costs down the line.

Cooling tech is changing fast

Most data centres today still rely on air cooling. Massive air conditioning systems push chilled air through server aisles, pulling heat away from densely packed CPUs, GPUs, memory, and storage. That method works, but it is inefficient in hot and humid environments.

Researchers and industry experts are now pushing an aggressive shift toward more efficient cooling technologies. In Singapore, which hosts about 1.4 gigawatts of data centre capacity and where daytime temperatures regularly sit around 33 degrees Celsius, engineers are testing new approaches specifically designed for tropical conditions.

One of the main upgrades being explored is direct-to-chip cooling. Instead of just blowing cold air over racks, this method pumps liquid through cold plates attached directly to hot components such as processors and accelerators. Because liquids carry heat far more effectively than air, they can remove thermal energy faster and with less total power use.
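
The physics behind that advantage is straightforward: water has a far higher heat capacity and density than air, so a small flow of liquid can carry away heat that would otherwise require an enormous volume of chilled air. The sketch below uses standard textbook material properties and an assumed 10 degree coolant temperature rise; the specific flow numbers are illustrative, not taken from the article.

```python
# How much air vs. water it takes to carry away 1 kW of chip heat, assuming
# the coolant warms by 10 C as it passes through: Q = mass_flow * c_p * dT.
# Material properties are standard textbook values near room temperature.
heat_w = 1000.0    # heat to remove, in watts (roughly one hot accelerator)
delta_t = 10.0     # assumed coolant temperature rise, in kelvin

coolants = {
    "air":   (1005.0, 1.2),    # specific heat J/(kg*K), density kg/m^3
    "water": (4186.0, 998.0),
}

for name, (cp, rho) in coolants.items():
    mass_flow = heat_w / (cp * delta_t)            # kg/s needed
    volume_flow_lpm = mass_flow / rho * 1000 * 60  # litres per minute
    print(f"{name:>5}: {mass_flow:.3f} kg/s, about {volume_flow_lpm:.1f} L/min")
```

Under those assumptions, removing the same kilowatt of heat takes around 5,000 litres of air per minute but only about 1.4 litres of water per minute.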

Alongside that, immersion cooling is gaining traction. In this setup, entire server boards are submerged in special non-conductive (dielectric) fluids. Heat moves straight into the liquid, which is then pumped through heat exchangers, moving thermal energy out far more efficiently than air-based systems can. These two techniques combined could cut data centre energy use by as much as 40 percent in some designs.
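
As a hedged illustration of where a figure in that range can come from, rather than the exact analysis behind the article's number: if an air-cooled site in a hot climate runs at a PUE of around 1.7 and direct liquid or immersion cooling brings it down toward 1.1 (both values assumed, as in the sketch above), total energy for the same IT work falls by (1.7 minus 1.1) divided by 1.7, roughly 35 percent, which lands in the same ballpark as that 40 percent estimate.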

Experts expect that direct-to-chip cooling and immersion cooling will not remain niche technologies for long. Within about five years, they are likely to become standard features in many new-build data centres, especially those focused on power-hungry AI and GPU clusters.

There is also growing interest in using natural water sources for cooling at scale. Large seawater-cooled facilities are already being explored. One example is an offshore data and AI complex near Hainan Island in China, often compared in power to tens of thousands of high-end gaming PCs. Concepts like this move heat exchange into the ocean environment, which can help reduce reliance on traditional chillers and compressors on land.

The challenge is that these innovations mostly benefit new builds. Thousands of existing data centres risk being left behind unless they undergo expensive retrofits. Many older facilities were never designed for liquid cooling, high-density GPU racks, or alternative heat rejection systems. As AI and cloud demand ramps up, these legacy sites may become increasingly inefficient or even obsolete.

The power problem behind the heat

Cooling is only half of the equation. All of this hardware needs electricity in the first place, and AI is pushing power requirements into new territory. Recent projections suggest that AI-related power demand could quadruple over the next few years.

To keep up, some companies are experimenting with unconventional power sources. One AI data centre is already running on a hydrogen fuel cell, offering zero water usage and zero direct emissions. It shows that off-grid, cleaner power solutions are possible in practice, not just in theory. But scaling this model is difficult because hydrogen infrastructure is nowhere near as mature as traditional gas or electricity networks.

Renewable energy such as solar and wind can help, but it comes with its own constraints. Data centres need reliable, round-the-clock power. To make intermittent renewables work for such a consistent, heavy load, operators would need massive and costly battery storage systems. That is technically feasible but often not economically attractive for huge facilities.
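
A rough sizing exercise shows why. Every number below, the facility load, the hours of backup needed, and the installed battery cost, is an assumption chosen purely for illustration.

```python
# Rough sizing of the battery storage needed to ride a data centre through a
# period of little solar or wind output. Every input here is an assumption
# chosen for illustration, not a figure from the article.
facility_load_mw = 100.0    # hypothetical steady facility load
backup_hours = 12.0         # assumed gap to bridge, e.g. overnight with low wind
cost_per_kwh_usd = 300.0    # assumed installed cost of grid-scale batteries

energy_needed_mwh = facility_load_mw * backup_hours
capital_cost_usd = energy_needed_mwh * 1000 * cost_per_kwh_usd

print(f"Storage needed: {energy_needed_mwh:,.0f} MWh")                      # 1,200 MWh
print(f"Battery capital cost: ~${capital_cost_usd / 1e6:,.0f} million")     # ~$360 million
```

And that only bridges the gap: the solar or wind capacity feeding those batteries would still have to be oversized enough to recharge them between lulls.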

As a result, the industry is also looking closely at nuclear power as a stable long-term option for running cloud and AI infrastructure. There are already plans to restart some older nuclear plants specifically to support major tech companies and their data centre clusters. Nuclear offers dense, steady output with low carbon emissions during operation, but it brings political, safety, and cost debates that are not easy to resolve.

When you add together hotter climates, rising cooling needs, explosive AI growth, and the difficulty of scaling clean power, the numbers become hard to balance. New cooling methods, smarter site selection, and alternative power sources will all be needed if the infrastructure behind gaming, cloud PCs, and AI is going to keep up without overheating both the hardware and the planet.

Original article and image: https://www.pcgamer.com/software/ai/80-percent-of-the-worlds-data-centres-have-been-built-in-places-either-too-hot-or-too-cold-for-the-hardware-inside/
