The AI boom is forcing Europe to choose between power-hungry data centers and its environmental goals
The artificial intelligence boom is driving an environmentally conscious shift in the way data centers operate, as European developers face pressure to lower the water temperatures of their power-hungry facilities to accommodate the more powerful chips from companies like the tech giant Nvidia.
Research from Goldman Sachs shows that AI is expected to drive a 160% increase in demand for data centers by 2030 – an increase that could come at the expense of Europe's decarbonization goals, as the specialized chips used by AI firms are expected to drive up the energy usage of the data centers that house them.
High-performance chips – also called graphics processing units, or GPUs – are essential for training and deploying the large language models that underpin modern AI. These GPUs draw significant power and produce more heat, which ultimately requires colder water to keep the chips reliably cool.
Colder water is needed to cool the latest generation of chips, according to Andrey Korolenko, chief product and infrastructure officer at Nebius, who referred specifically to Nvidia's Blackwell GB200 chips.
“It’s extremely dense, and from a cooling perspective you need other solutions,” he said.
Michael Winterson, chairman of the European Data Center Association (EUDCA), warned that falling water temperatures “will ultimately fundamentally return us to an untenable situation that we found ourselves in 25 years ago.”
“The problem we have with the chip manufacturers is [that] AI is now a space race being waged by the American market, where land rights, energy access and sustainability are relatively low on the list and where market dominance is key,” Winterson told CNBC.
Major equipment suppliers in Europe say U.S. chip designers are asking them to lower their water temperatures to accommodate hotter AI chips, according to Herbert Radlinger, managing director of NDC-GARBE.
“This is shocking news because originally all engineers expected that they would choose liquid cooling to achieve higher temperatures,” he told CNBC, referring to liquid cooling technology that is said to be more efficient than the more traditional air cooling method.
“Evolution discussion”
Energy efficiency is high on the European Commission's agenda as it aims to meet its target of reducing energy consumption by 11.7% by 2030. The EU predicted in 2018 that data center energy consumption could rise by 28% by 2030, but the emergence of AI is expected to double or triple that figure in some countries.
Winterson said lowering water temperatures was “fundamentally incompatible” with the EU's recently introduced Energy Efficiency Directive, which set up a special database for data centers of a certain size to publicly report on their electricity consumption. The EUDCA has been lobbying Brussels to take these sustainability concerns into account.
The energy management company Schneider Electric is in frequent contact with the EU on this issue. Much of the recent discussion has focused on various ways to procure “mainstream power” for AI data centers and the potential for greater collaboration with utilities, said Steven Carlini, vice president and chief advocate for AI and data centers at Schneider Electric.
European Commission energy officials have also met with Nvidia to discuss data centers' energy consumption, power usage effectiveness and chip efficiency.
CNBC has reached out to Nvidia and the commission for comment.
“Cooling is the second largest energy consumer in the data center after IT load,” Carlini told CNBC in emailed comments. “Energy consumption will increase, but PUE (Power Usage Effectiveness) may not increase at lower water temperatures, even though chillers will have to work harder.”
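For readers unfamiliar with the metric Carlini mentions, PUE is defined as a facility's total energy draw divided by the energy consumed by its IT equipment alone, so an ideal score is 1.0. The following sketch uses entirely hypothetical figures (not numbers from any operator quoted here) to show why PUE can hold roughly steady even as chillers work harder:

```python
def pue(it_load_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT load.

    Overhead is split into cooling (the second-largest consumer after
    IT load, per Carlini) and everything else (power distribution,
    lighting, etc.). All inputs in kilowatts.
    """
    total_kw = it_load_kw + cooling_kw + other_kw
    return total_kw / it_load_kw

# Hypothetical 1 MW IT load: baseline vs. harder-working chillers.
baseline = pue(it_load_kw=1000, cooling_kw=300, other_kw=100)         # 1.40
harder_chillers = pue(it_load_kw=1000, cooling_kw=380, other_kw=100)  # 1.48
print(f"baseline: {baseline:.2f}, harder-working chillers: {harder_chillers:.2f}")
```

If IT load grows alongside cooling overhead (denser racks drawing more power), the ratio can stay flat even though absolute consumption rises, which is the distinction Carlini draws.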
Schneider Electric customers using Nvidia's Blackwell GB200 superchip require water temperatures of 20 to 24 degrees Celsius, or between 68 and 75 degrees Fahrenheit, Carlini said.
He added that this compares with temperatures of around 32 degrees Celsius for liquid cooling, or the roughly 30 degrees Celsius that Meta has suggested for the water supplied to its hardware.
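A back-of-envelope heat-transfer calculation shows why those supply temperatures matter. The heat a water loop removes is Q = ṁ·c·ΔT, so for a fixed return temperature, a colder supply widens ΔT and cuts the flow rate (and pumping work) needed per kilowatt of chip heat. The rack size and return temperature below are illustrative assumptions, not figures from the article:

```python
C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def flow_required(heat_kw: float, supply_c: float, return_c: float) -> float:
    """Mass flow (kg/s) of water needed to absorb heat_kw of chip heat,
    given supply and return water temperatures in Celsius.

    Rearranges Q = m_dot * c_p * delta_T for m_dot.
    """
    delta_t = return_c - supply_c
    return (heat_kw * 1000.0) / (C_P_WATER * delta_t)

# Hypothetical 100 kW rack with an assumed 40 C return temperature:
# a 22 C supply (inside the GB200 range cited above) vs. a 32 C supply.
cold = flow_required(100, supply_c=22, return_c=40)  # ~1.33 kg/s
warm = flow_required(100, supply_c=32, return_c=40)  # ~2.99 kg/s
print(f"22 C supply: {cold:.2f} kg/s; 32 C supply: {warm:.2f} kg/s")
```

The trade-off the article describes runs the other way too: chilling the supply water to 20-24 C in a warm climate means the chillers themselves consume more energy, which is what pushes against the EU's efficiency goals.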
Ferhan Gunen, vice president of data center operations for the U.K. at Equinix, told CNBC that there are a number of concerns about AI that Equinix has discussed with its customers.
“They want to increase the density of their servers, meaning they want higher power chips or more servers,” she said, adding that the shift is not “clear cut.”
“It's really an evolution discussion more than anything else,” Gunen said.
Nvidia declined to comment on the cooling requirements of its chips. The company announced a new platform for its Blackwell GPUs earlier this year, saying the architecture would enable companies to run real-time generative AI on large language models at up to 25 times lower cost and energy consumption than the previous generation.
Liquid cooling will require “reconfiguration,” Gunen explained, adding that new data centers are already equipped with this technology. “Yes, higher density means higher power consumption and also higher cooling requirements. But then technology changes, so you do it differently. So there’s a balance in all of this,” she said.

Race for efficiency
Nebius, which has about $2 billion in cash on its balance sheet after its spinoff from Russia's Yandex, has said it will be one of the first companies to bring Nvidia's Blackwell platform to customers in 2025. The company has also announced plans to invest more than $1 billion in AI infrastructure in Europe by the middle of next year.
Nebius' Korolenko said liquid cooling carries a “first step” cost, with operating expenses expected to improve over time.
“There is a lot of pressure on delivery, but at the same time when you scale you want to have the ability to choose, be economical and not sacrifice too much. Energy efficiency is important for operating costs. This is always a high priority,” Korolenko said.
Even before a boom in demand for AI applications hit the market, the data center industry in Europe was struggling to keep up with the growing digital sector.
Sicco Boomsma, managing director in ING's TMT team, said market participants are “very sensitive to electricity,” and that while Europe's focus is on infrastructure, U.S. operators have concentrated their expansion on European sites where electricity is available.
“There are a huge number of data center operators from the U.S. coming together to ensure that their data center infrastructure is in line with the various EU goals, from carbon neutrality and efficiency, to water use, to biodiversity conservation.”
“It's a kind of race where they want to show that their knowledge leads to super-efficient infrastructure,” he said.