Data Centers to Double Their Energy Consumption, Driven by AI and Cloud Computing

March 11, 2025

Electricity consumption in data centers poses one of the biggest decarbonization challenges of our time. Demand for Generative Artificial Intelligence (Gen AI) and cloud computing applications is surging, and these technologies require an unprecedented amount of power to run. For example, some industry observers have suggested that a single Large Language Model (LLM) interaction can consume roughly as much electricity as leaving a low-brightness LED lightbulb on for an hour. Although data center electricity needs account for just 2% to 4% of total power demand in advanced economies today, the growth of digital technologies will amplify their impact on the grid by 2030.
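To put that comparison in rough numbers, the sketch below converts an assumed LED wattage into watt-hours; the 3 W draw is an illustrative assumption, not an ABI Research figure.

```python
# Back-of-the-envelope check of the "one LLM interaction ~ one LED-hour" comparison.
# The LED wattage is an assumption chosen for illustration only.
led_power_w = 3                     # assumed draw of a low-brightness LED, in watts
hours_on = 1                        # left on for one hour
energy_wh = led_power_w * hours_on  # energy used, in watt-hours

print(f"{led_power_w} W LED for {hours_on} h = {energy_wh} Wh")  # ~3 Wh
```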

As more and larger data centers are constructed around the world to scale digital transformation projects, grid instability will only be exacerbated. ABI Research predicts the number of public data centers worldwide will quadruple by 2030, driving up data center energy demand significantly. With each new data center built comes a larger carbon footprint and the urgent need for thermal management solutions.

 

How Much Energy Do Data Centers Consume?

ABI Research’s latest analysis shows power consumption of data centers will more than double from 683 TWh in 2024 to 1,479 TWh by 2030. This represents a Compound Annual Growth Rate (CAGR) of 14%. Unsurprisingly, the technologically advanced U.S., Chinese, and European markets account for over half of global data center energy usage.
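The 14% figure follows directly from the two forecast endpoints; a minimal sketch of the calculation, using the values above:

```python
# Compound Annual Growth Rate (CAGR) implied by the 2024 and 2030 forecast values above.
start_twh = 683      # global data center consumption in 2024, TWh
end_twh = 1_479      # forecast consumption in 2030, TWh
years = 2030 - 2024  # six compounding periods

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # ~13.7%, i.e. roughly 14% per year
```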

 


Chart: Data Center Energy Consumption by Region (Source: ABI Research)


 

The shifting digital habits of consumers and businesses are among the main factors influencing data center energy demand. Every Google search, crypto mining task, and ChatGPT query adds to the energy footprint of data centers. Enterprises are also investing rapidly in LLMs, whose training and inferencing are typically processed within data center infrastructure. As more computing power is needed to run these applications, Information Technology (IT) equipment generates more heat. Cooling that equipment currently drives 37% of data center power requirements, while computing itself accounts for another 42% of electricity use.

 


Chart: Data Center Energy Demand by Load, 2025 (Source: ABI Research)


 

Hyperscale and AI Data Centers Raise Serious Energy Concerns

Hyperscale and Artificial Intelligence (AI) data centers are the backbone of cloud-based services, AI innovation, and High-Performance Computing (HPC). Hyperscale data centers—operated by major cloud providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud—are growing rapidly. North America and Europe will continue to be data center hubs, but hyperscalers are showing significant interest in the Asia-Pacific and Middle East regions. For example, AWS recently committed to a US$6.2 billion investment to build data centers in Malaysia, and Google announced a US$1 billion investment to expand its data center and cloud infrastructure to Thailand. Meanwhile, Saudi Arabia’s grand aspirations to become a manufacturing hub will attract hyperscalers to the region. Investments like these will be replicated throughout the decade, making energy consumption an increasingly glaring issue.

 


Chart: Data Center Energy Consumption by Type (Source: ABI Research)


 

Hyperscalers operate large-scale facilities that consume massive amounts of electricity to power and cool server racks. ABI analysts project that energy consumption by hyperscale facilities will jump from 200 TWh in 2023 to 381 TWh by 2030, driving data center energy costs higher.

The demand for energy-efficient solutions has never been more urgent. To meet this challenge, hyperscalers are focusing on renewable energy and heat dissipation solutions. For example, data center operators are increasingly partnering with nuclear power plants to offset their carbon footprint. Heat reuse/recycling is also gaining traction, with many Western European data centers diverting the heat they generate to local homes and offices. Such initiatives are critical if the top 20 hyperscalers are to raise the share of low-carbon energy they consume from 88% in 2023 to 100% by 2030. However, harnessing clean energy is just one aspect of creating a green data center.

Cooling systems are another essential element, as they mitigate the heat produced by high-performance workloads. Most data centers still rely on traditional air-cooled systems, but this is changing as hybrid cooling technologies, such as adiabatic chillers and liquid cooling systems, gain traction. By 2030, these advanced cooling systems are expected to make up more than 55% of the market. They are crucial for managing the higher thermal loads driven by Gen AI, Machine Learning (ML), and other demanding applications.

 


Chart: Number of Data Centers Connected to Heat Reuse (Source: ABI Research)


 

Dedicated AI data centers are also growing quickly. Although still in their early stages, their number is expected to roughly double from 604 in 2024 to 1,204 by 2030. These technologically advanced facilities, often part of hyperscale data center campuses, are built for the intense computing requirements of AI workloads. Estimates suggest that AI applications already account for 10% to 20% of data center power draw, placing additional strain on the electric grid and IT infrastructure.

Both hyperscale and AI data centers must continue to innovate. The growing need for data storage and processing power, driven by AI and high-performance applications, requires data centers to balance efficiency with environmental responsibility. The future of the industry will depend on how effectively it can manage energy consumption, while continuing to scale and support the next generation of computing.

 

Confronting the Data Center Energy Challenge with Cooling Technologies

Mitigating the environmental impact of data center expansion requires operators to deploy thermal management solutions, such as air and liquid cooling. Companies like Google, Intel, Iceotope, Microsoft, NVIDIA, and Schneider Electric have been testing these cooling techniques to improve Power Usage Effectiveness (PUE) scores. For example, Google employs a combination of water-based and air-based cooling technologies in its data centers, selected on the basis of sustainability assessments of local watersheds to ensure environmental responsibility. In tandem with renewables, this approach has helped Google achieve better-than-average PUE scores across its fleet of data centers.
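For readers unfamiliar with the metric, PUE is simply the ratio of total facility energy to the energy that actually reaches the IT equipment, so 1.0 is the theoretical ideal. A minimal sketch, using illustrative figures rather than measured values:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 is the theoretical ideal, where every kilowatt-hour reaches the IT load;
    cooling and power-distribution overhead push the score higher.
    """
    return total_facility_kwh / it_equipment_kwh

# Illustrative example: a facility drawing 1,500 MWh to deliver 1,000 MWh to servers.
print(pue(1_500_000, 1_000_000))  # 1.5 -- lower is better
```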

To take a deep dive into the most promising data center cooling technologies that are proven to reduce power usage, download ABI Research’s whitepaper, Accelerating Data Center Efficiency with Advanced Cooling.


Tags: Smart Buildings, AI & Machine Learning