A technician works at an Amazon Web Services AI data centre in New Carlisle, Indiana, US. — Reuters
WASHINGTON, United States: The expansion of data centres to power the AI boom has more people wondering: what exactly is in a data centre?
AFP got a chance to take a look at what is inside.
Concrete warehouse
Data centres are the physical infrastructure that makes our digital lives possible, yet most people have never seen one up close, let alone understood how they operate.
Roughly 12,000 data centres are in operation worldwide, about half of them in the US, according to Cloudscene, a data centre directory.
At its most basic, a data centre is a concrete warehouse filled with thousands of computer servers working in tandem. Traditional facilities span one or two floors divided into vast rooms, though newer ones rise higher.
A facility may serve a single company or be shared by several clients.
The servers sit in standardised 19in (48cm) racks – essentially metal closets lined up in rows.
A large data centre can house tens of thousands of servers running simultaneously, generating enormous heat and consuming significant energy for both power and cooling.
High-speed networking equipment – switches, routers, and fibre optic cables – connects everything, moving terabytes of data per second.
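To give a sense of scale, here is a rough back-of-the-envelope sketch of a large facility's power draw. The server count and per-server wattage are illustrative assumptions rather than figures from this article; only the roughly 40 percent cooling share echoes the estimate cited later in the piece.

```python
# Back-of-the-envelope power draw for a large data centre.
# The server count and per-server wattage are illustrative assumptions;
# the 40% cooling share follows the rough estimate cited later in the article.

servers = 50_000            # assumed number of servers in a "large" facility
watts_per_server = 500      # assumed average draw per server, in watts
cooling_share = 0.40        # cooling assumed to take ~40% of total energy

it_load_mw = servers * watts_per_server / 1_000_000
total_mw = it_load_mw / (1 - cooling_share)   # simplification: everything else is IT load

print(f"IT load: ~{it_load_mw:.0f} MW; total with cooling: ~{total_mw:.0f} MW")
# IT load: ~25 MW; total with cooling: ~42 MW
```

Under these assumptions, a single large facility draws on the order of tens of megawatts – comparable to a small town.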
Stay close
Having a data centre close to end users improves speed, which is critical for things like trading and gaming where immediacy is paramount.
Ashburn, Virginia, which has the highest concentration of data centres in the world, offers ideal conditions as it is located only about 30 miles (48km) from the US capital, Washington.
However, building in densely populated areas costs more and faces local resistance. Companies increasingly turn to rural locations where land is cheaper and zoning less restrictive.
But distance adds to loading times – that brief delay when a page loads or a feed refreshes.
To balance cost and performance, operators typically house core infrastructure – such as the training of AI models – in affordable rural regions, while keeping equipment that handles time-sensitive requests closer to urban centres.
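As a rough illustration of why distance matters, the sketch below estimates round-trip time over fibre. It assumes signals travel at about two-thirds the speed of light in glass and ignores routing, queuing and processing delays, so real-world figures are higher.

```python
# Rough round-trip latency over fibre, assuming signals travel at about
# two-thirds the speed of light and ignoring routing and processing delays.

SPEED_IN_FIBRE_KM_S = 200_000  # ~2/3 of the speed of light, a common rule of thumb

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / SPEED_IN_FIBRE_KM_S * 1_000

for km in (50, 500, 5_000):
    print(f"{km:>5} km -> ~{round_trip_ms(km):.1f} ms round trip")
# 50 km -> ~0.5 ms, 500 km -> ~5.0 ms, 5000 km -> ~50.0 ms
```

Even before any server does any work, physics alone adds tens of milliseconds once a facility sits thousands of kilometres away.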
Stay cool
Inside these bunker-like buildings, a single server rack generates as much heat as several household ovens running nonstop. Cooling consumes roughly 40 percent of a data centre's total energy.
The most advanced chips – the GPUs (graphics processing units) used for AI – can exceed 90°C, which degrades performance and risks permanent damage during extended operation. They are also much heavier than lower-performing chips.
Traditional facilities use computer room air conditioners that blast heat out through rooftop-mounted vents – but this approach is not enough for GPUs, which mainly rely on water for cooling.
Modern facilities are beginning to deploy "free cooling" that uses outside air when temperatures allow, and different water-based approaches: liquid cooling systems that pump coolant directly to components or evaporative cooling that works like perspiration on skin.
Today, massive amounts of water are still required for direct and indirect cooling in data centres. In 2014, US data centres used 21.2 billion litres of water; by 2023 that figure had risen to 66 billion litres, according to federal estimates.
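For a sense of the trend, the short calculation below works out the growth implied by those federal estimates; the compound annual rate is a simple derivation from the two figures, not a number reported in the article.

```python
# Growth in US data centre water use implied by the federal estimates above:
# 21.2 billion litres in 2014, 66 billion litres in 2023.

start, end = 21.2, 66.0   # billions of litres
years = 2023 - 2014

growth_factor = end / start
annual_rate = growth_factor ** (1 / years) - 1

print(f"Roughly {growth_factor:.1f}x overall, ~{annual_rate:.0%} per year")
# Roughly 3.1x overall, ~13% per year
```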
Where's the power?
Power supply – and the high-voltage transmission lines needed to deliver it – is critical for a data centre, and demand is only growing as facilities run power-hungry GPUs.
"One of the biggest challenges for a lot of our customers is they buy the chips and then they don't know where to go," Chris Sharp, Chief Technology Officer at Digital Realty, which operates data centres around the world, told AFP.
Tech giants, caught up in the AI arms race, have spent tens of billions of dollars in a matter of months on building structures suited to GPUs.
Operators rely on the existing power grid but are increasingly seeking to secure their own resources – known as "behind-the-meter" generation – for greater security and to limit rate increases for all users.
Solar panels or gas turbines are sometimes installed, and many are also awaiting the arrival of the first small modular reactors (SMRs), a nuclear energy technology currently under development.
Most data centres have to run 24/7, and every critical system has a backup in case of power outages – typically massive battery banks or diesel generators.
The best facilities guarantee power 99.995% of the time. – AFP
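As a closing illustration, the arithmetic below converts that 99.995 percent guarantee into allowed downtime, using a 365-day year for simplicity.

```python
# What a 99.995% availability guarantee allows, expressed as downtime per year.

availability = 0.99995
minutes_per_year = 365 * 24 * 60

downtime_minutes = (1 - availability) * minutes_per_year
print(f"Allowed downtime: about {downtime_minutes:.0f} minutes per year")
# Allowed downtime: about 26 minutes per year
```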
