Apr 30, 2026
The rapidly increasing global use of Artificial Intelligence (AI) is raising concerns about its substantial electricity consumption. Efforts are now focused on advanced cooling systems, more efficient computer chips, and smarter programming to reduce the AI industry's massive power requirements.
According to the International Energy Agency (IEA), AI is entirely dependent on data centers, which could consume 3% of the world's total electricity by 2030, nearly double current usage. Experts from US consulting firm McKinsey describe a race to build enough data centers to keep pace with AI's rapid growth, while also warning that the world is heading towards an electricity deficit.
A McKinsey report states that data centers collectively will account for about 8% of the total electricity demand in the United States in 2025, and this could rise to 18% of total domestic demand by 2030. According to the IEA's 'Energy and AI' report, global data center electricity demand will reach 945 terawatt-hours (TWh) by 2030, which is more than Japan's current total electricity consumption. Last year, data centers were responsible for about 1.5% (415 TWh) of global total electricity consumption, with the US accounting for 45% and China 25%.
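The figures quoted above can be cross-checked with simple arithmetic; the sketch below only restates numbers already given in the article (415 TWh at ~1.5% of global use, and the IEA's 945 TWh projection for 2030).

```python
# Sanity-check of the IEA figures quoted above (all inputs from the article).
last_year_dc_twh = 415      # data-center electricity demand last year, TWh
last_year_share = 0.015     # ~1.5% of global electricity consumption

# Implied total global electricity consumption last year, in TWh
global_twh = last_year_dc_twh / last_year_share
print(f"Implied global consumption: {global_twh:,.0f} TWh")  # ~27,667 TWh

projected_2030_twh = 945    # IEA projection for data centers in 2030, TWh
growth = projected_2030_twh / last_year_dc_twh
print(f"Growth factor by 2030: {growth:.2f}x")  # ~2.28x
```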
Mosharraf Chowdhury, a computer science professor at the University of Michigan, explains that there are several ways to solve this problem: companies can either generate more electricity, a slow process that AI giants are already pursuing worldwide, or find ways to use less energy for the same computing power.
Chowdhury believes this challenge can be met with "clever" solutions at every level, from hardware to AI software. For example, his lab has developed algorithms that calculate precisely how much electricity each AI chip needs, potentially reducing energy consumption by 20-30%.
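The idea of computing a per-chip power budget can be illustrated with a toy sketch. This is not the lab's actual algorithm; the power-throughput measurements below are hypothetical, and the sketch simply picks the lowest power cap that still meets a throughput target.

```python
# Illustrative only: hypothetical measurements of how relative training
# throughput varies with a GPU's power cap (watts).
measured = {
    400: 1.00,
    300: 0.97,
    250: 0.90,
    200: 0.75,
}

def pick_power_cap(min_throughput: float) -> int:
    """Return the lowest power cap that still meets the throughput target."""
    feasible = [cap for cap, tput in measured.items() if tput >= min_throughput]
    return min(feasible)

cap = pick_power_cap(0.95)     # tolerate at most a 5% slowdown
saving = 1 - cap / 400
print(f"cap={cap} W, power saving vs. full power: {saving:.0%}")
```

With these made-up numbers, accepting a 3% slowdown cuts the power cap by 25%, in the same ballpark as the 20-30% savings the article cites; the real algorithms work per chip and per workload.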
Gareth Williams of the consulting firm Arup states that twenty years ago, operating a data center – including cooling systems and other infrastructure – required roughly the same amount of power as running the servers themselves. Today, operations consume only 10% of the energy used by the servers.
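Williams' comparison maps onto the data-center industry's standard efficiency metric, power usage effectiveness (PUE): total facility power divided by the power delivered to the IT equipment itself. A quick illustration, using his two scenarios:

```python
def pue(it_power_kw: float, overhead_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT (server) power."""
    return (it_power_kw + overhead_power_kw) / it_power_kw

# Twenty years ago: cooling and other infrastructure drew roughly as much
# power as the servers themselves.
print(pue(1000, 1000))   # 2.0

# Today, per Williams: overhead is only ~10% of server power.
print(pue(1000, 100))    # 1.1
```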
This has been largely possible due to a focus on energy efficiency. Many data centers now use AI-powered sensors to control temperatures in specific zones, instead of uniformly cooling the entire building. According to McKinsey's Pankaj Sachdeva, this helps them optimize water and electricity usage in real time.
For many, liquid cooling will be a game-changer. It replaces the roar of energy-hungry air conditioners with a coolant that flows directly through the servers. Williams notes, "All the big players are working on it." This is crucial because modern AI chips from companies like Nvidia consume 100 times more power than servers from two decades ago. Some modern AI GPUs can consume up to 3.7 megawatt-hours (MWh) per year.
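The quoted annual figure of 3.7 MWh per GPU can be converted into an average continuous power draw, which makes the cooling burden concrete:

```python
# Convert the article's annual energy figure into an average power draw.
hours_per_year = 365 * 24            # 8,760 hours
annual_mwh = 3.7                     # per the article, for some modern AI GPUs
avg_watts = annual_mwh * 1_000_000 / hours_per_year
print(f"~{avg_watts:.0f} W average draw, sustained year-round")  # ~422 W
```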
Amazon's market-leading cloud computing business, AWS, announced last week that it has developed its own liquid cooling method for the Nvidia GPUs in its servers, which will help it avoid reconfiguring existing data centers. Dave Brown, Vice President of Compute and Machine Learning Services at AWS, stated in a YouTube video, "We wouldn't have enough liquid-cooling capacity for our scale."
A hopeful sign for McKinsey's Sachdeva is that each new generation of computer chips is more energy-efficient than the last. Research by E Ding of Purdue University has shown that AI chips can be made to last longer without losing performance. However, Ding adds, "It is difficult to persuade semiconductor companies to earn less money by encouraging customers to use the same equipment for longer."
While greater efficiency in chips and energy use might make AI cheaper, it will not reduce total energy consumption. Ding predicts, "Energy usage will continue to increase," despite all efforts to limit it. "But perhaps not as quickly."
In the United States, energy is now considered crucial for maintaining the country's competitive advantage over China in AI.
In January, Chinese startup DeepSeek unveiled an AI model that performed comparably to top US systems using less powerful chips, and consequently consumed less energy. DeepSeek's engineers achieved this by programming their GPUs more precisely and by skipping an energy-intensive training step previously deemed essential.
There are concerns that China is far ahead of the United States in terms of available energy sources, including renewable and nuclear power. China has set a goal to power over 80% of its data centers with clean energy by 2030.
Source: The Daily Star