Data centers account for around four percent of total electricity consumption in Europe and North America. With the rapid expansion of artificial intelligence, their energy demand is expected to rise sharply in the coming years. This makes the question of how flexibly they can use power all the more urgent.
Data centers need power—around the clock, and in significant amounts. Globally, they account for roughly 2 percent of total electricity consumption. In Germany, the figure was 3.7 percent in 2022, according to the Borderstep Institute. In the U.S., it reached 4.4 percent in 2023, according to the Department of Energy. And the trend is upward.
Artificial intelligence is expected to play a key role in this development. On the one hand, the industry hopes that self-learning systems will help optimize data centers’ own energy use. For now, however, the prevailing belief is that the IT sector’s growing computing power will drive electricity demand sharply upward. Analysts are racing to outdo each other with bold predictions.
AI boom could cause data centers’ electricity use to skyrocket
According to a recent study by the Boston Consulting Group, the power demand of data centers worldwide is expected to grow by 16 percent annually between 2023 and 2028, doubling within the next four years to 125 gigawatts (GW). To put that into perspective: the average weekday peak load in France and Germany combined is around 150 GW. Goldman Sachs Research comes to a very similar conclusion.
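The compounding math behind that headline can be sanity-checked in a few lines. Note that the 2023 baseline of roughly 60 GW used here is inferred from the 125 GW endpoint, not taken from the study itself:

```python
# Sanity check: 16 percent annual growth compounded over 2023-2028.
base_gw = 60          # implied 2023 level if 2028 lands near 125 GW (assumption)
rate = 0.16
years = 5

growth_factor = (1 + rate) ** years
projected = base_gw * growth_factor
print(f"Growth factor over {years} years: {growth_factor:.2f}x")
print(f"Projected load: {projected:.0f} GW")
```

A 16 percent annual rate compounds to a factor of about 2.1 over five years, which matches the study's "doubling" framing.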
Now, the curve could steepen even more in the United States. In January, newly elected President Donald Trump announced a $500 billion investment in AI infrastructure in the U.S. over the coming years. That same amount can be found in a McKinsey study published in September 2024. The analysts project that power consumption by U.S. data centers will quadruple between 2023 and 2030—from 148 to 606 terawatt-hours (TWh). That would exceed the current global electricity demand of all data centers and rival some forecasts for Germany’s total power consumption in 2030.
Most of the new builds are expected to be hyperscale data centers—facilities packed with servers, storage units, and above all, processors optimized for AI workloads. The added computing power also shows up in electricity demand: a typical AI processor like Nvidia’s A100 consumes about five times more power than a conventional CPU. Nvidia promotes the chip as being four times more energy-efficient for the same computing output. But in practice, that efficiency gain is unlikely to reduce power consumption—instead, it’s expected to fuel even greater use of AI.
There are also signs that it’s not just processors but also the machine-learning models themselves that could become significantly more energy-efficient. Early this year, Chinese company DeepSeek shook up the AI world. It developed a chatbot operating at the level of OpenAI’s ChatGPT, Microsoft’s Copilot, and Google’s Gemini—yet it consumes far less energy than its U.S. counterparts.
What’s moving faster, the expansion of AI infrastructure or gains in chip efficiency?
Altogether, it is by no means certain that the expansion of AI capacity will inevitably lead to a massively higher power demand from data centers within just a few years. That’s a point emphasized by Michael Terrell, Google’s Senior Director of Clean Energy and Carbon Reduction. He’s responsible for ensuring that the tech giant can fully run on renewable energy. Speaking on the podcast Redefining Energy, Terrell referenced a Science article noting that, while global computing power increased by 550 percent between 2010 and 2018, data center electricity consumption rose by just six percent over the same period.
“Data center operators are highly motivated to run these facilities as efficiently as possible,” Terrell said, expressing skepticism toward forecasts of runaway energy demand. He offered an example: “Google’s fourth-generation TPU (short for Tensor Processing Unit, used for training AI models) is now three times more energy-efficient than our version 3.”
For a company at the heart of internet capitalism, the focus certainly isn’t solely on energy efficiency—but also, perhaps primarily, on cost efficiency. Yet in the world of data centers, the two are closely tied.
“Electricity is by far the largest ongoing expense for datacenter operators,” notes a report by the International Data Corporation (IDC), “accounting for 46 percent of total spending for enterprise datacenters and 60 percent for service provider datacenters.”
How much can data centers contribute to stabilizing power grids?
High energy costs don’t change the fact that data centers operate most profitably when running at full capacity.
According to estimates by Goldman Sachs Research, data centers currently operate at approximately 85 to 90 percent of their capacity. In theory, that leaves some room to maneuver—potentially helping to avoid peak loads and increase the share of renewable energy in the power mix.
This would also have a positive impact on the power grids themselves, which regularly come under stress during peak demand hours. Even a small degree of flexibility could significantly ease the burden, according to researchers at Duke University in North Carolina. In a recent paper, they explain that U.S. power grids could accommodate new facilities with a combined consumption of 100 GW with relative ease—if electricity load were curtailed by just 0.5 percent over the course of a year.
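The scale of the Duke University finding is easy to reproduce with rough numbers. The figures below are illustrative back-of-the-envelope values, not taken from the paper:

```python
# Rough scale of the curtailment argument: shaving 0.5 percent of annual
# load from 100 GW of new data-center capacity (illustrative numbers,
# assuming the fleet ran at full load around the clock).
new_capacity_gw = 100
hours_per_year = 8760
curtailment_share = 0.005

annual_twh = new_capacity_gw * hours_per_year / 1000   # GWh -> TWh
curtailed_twh = annual_twh * curtailment_share
equivalent_full_load_hours = hours_per_year * curtailment_share

print(f"Annual consumption: {annual_twh:.0f} TWh")
print(f"Curtailed energy:   {curtailed_twh:.1f} TWh")
print(f"Equivalent to about {equivalent_full_load_hours:.0f} full-load hours per year")
```

In other words, 0.5 percent of annual energy corresponds to roughly 44 full-load hours per year, a small window spread across the worst peak periods.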
How can data center power consumption be made more flexible?
If data centers are to power down during peak demand periods and adapt to the availability of renewable energy, the key question is timing: when exactly can they reduce their load? And how could that work in practice?
Shifting Compute Loads
Google expert Michael Terrell talked about shifting computing loads—both in time and across geographic locations. Some tasks, he explained, are less sensitive to latency than others, such as data backups or processing YouTube videos and photos. “We found that we could actually delay those jobs or shift those jobs and not affect the services that we are offering to customers,” Terrell said. “We started shifting our compute loads, scheduling our compute jobs to align with the times of the day when the grids were the cleanest.”
Training AI models could also be one of those tasks with more flexible timing. At least in theory, the potential impact is significant: training is estimated to account for nearly two-thirds of the total energy consumption associated with AI computing.
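As a rough illustration of the temporal-shifting idea, the sketch below greedily assigns deferrable jobs to the hours with the lowest grid carbon intensity. The function name, the job sizes, and the intensity values are all invented for the example and are not taken from Google's actual scheduling systems:

```python
# Minimal sketch of temporal load shifting: deferrable jobs (e.g. backups,
# batch video processing) are placed into the hours with the lowest grid
# carbon intensity. All values are illustrative.
def schedule_deferrable(jobs_mwh, carbon_by_hour):
    """Greedily assign each job's energy to the cleanest remaining hour."""
    order = sorted(range(len(carbon_by_hour)), key=carbon_by_hour.__getitem__)
    plan = {}
    for job, hour in zip(jobs_mwh, order):
        plan[hour] = plan.get(hour, 0) + job
    return plan

# Example day with a sunny midday dip in carbon intensity (gCO2/kWh per slot).
carbon = [400, 380, 350, 120, 100, 110, 300, 420]
jobs = [5, 3, 2]                       # deferrable jobs, MWh each
print(schedule_deferrable(jobs, carbon))   # -> {4: 5, 5: 3, 3: 2}
```

The three jobs land in the three cleanest slots (hours 4, 5, and 3). A real scheduler would also respect deadlines, capacity limits, and latency constraints, but the core idea of sorting work into clean hours is the same.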
Temporal flexibility also has a geographical counterpart: computing power can be shifted from California to the U.S. East Coast, for instance, during nights when winds are stronger over the Atlantic than over the Pacific.
“Now we’ve developed the capability to actually shift those loads from location to location,” Terrell said. Still, he sees room for improvement—particularly in weather forecasting. Looking ahead, he envisions a system in which workloads are shifted around the globe to wherever the sun happens to be shining. At the same time, Google is also exploring energy sources like geothermal power and long-duration storage technologies such as hydrogen.
Bitcoin and other cryptocurrency mining centers can also shift computing loads—and doing so has actually become more attractive recently. That’s because the energy required to mine a single Bitcoin rose significantly over the past year. In early February 2025, it hit an all-time high of more than 1,300 megawatt-hours per BTC. Even at a Bitcoin price of €80,000, the cost per megawatt-hour would have to stay below €61.50 for mining to remain profitable. This threshold is regularly exceeded—even in countries like Norway and Denmark, which have become popular locations for mining centers due to relatively low electricity prices. Still, one thing is clear: the higher the price of Bitcoin climbs, the less incentive there is to shift workloads.
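The break-even figure quoted above follows directly from dividing the coin price by the energy needed to mine it. Hardware and other operating costs are ignored in this simplified view:

```python
# Break-even electricity price for Bitcoin mining, using the figures above.
# Simplification: electricity is treated as the only cost.
btc_price_eur = 80_000
energy_per_btc_mwh = 1_300   # early-February 2025 all-time high

breakeven_eur_per_mwh = btc_price_eur / energy_per_btc_mwh
print(f"Break-even: {breakeven_eur_per_mwh:.2f} EUR/MWh")  # ~61.54
```

Whenever the average procurement price exceeds roughly 61.50 EUR/MWh at this coin price, mining runs at a loss, which is exactly the incentive to shift load toward cheap hours or cheap locations.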

Source: Shehabi (2016) via AGCI
A publicly funded research project by the U.S. National Renewable Energy Laboratory (NREL) is exploring the use of underground thermal energy storage (UTES) systems to reduce power consumption in data centers during peak demand periods. As early as 2020, researchers at Cornell University found that chilled water storage systems could help lower the operating costs of data centers. Their calculations showed that smart load shifting over a two-day period could reduce electricity costs by 8.8 percent.
Expanding Battery Storage
To ensure uninterrupted power supply, guarantee 24/7 operations, and prevent data loss, data centers are already equipped with backup generators—and increasingly, with battery storage systems.
Expanding electricity storage would be another way to make data centers more flexible in how they draw power from the grid—allowing them to better align consumption with the availability of renewable energy and, in turn, lower electricity prices. For instance, batteries could be charged with inexpensive solar power during midday hours. That stored energy could then supply the data center later in the afternoon, when electricity typically becomes more expensive—a pattern that, in Germany and most other countries, usually corresponds to a drop in the share of renewables in the grid.
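A toy calculation shows why this midday-to-evening arbitrage can pay off. All prices, the battery size, and the round-trip efficiency below are illustrative assumptions, not market data:

```python
# Toy arbitrage calculation: charge a battery on cheap midday solar power,
# discharge it into the expensive early evening. Illustrative values only.
capacity_mwh = 10
efficiency = 0.90            # assumed round-trip efficiency
midday_price = 40            # EUR/MWh, cheap solar hours
evening_price = 120          # EUR/MWh, evening peak

cost_to_charge = capacity_mwh * midday_price
value_discharged = capacity_mwh * efficiency * evening_price
print(f"Daily spread captured: {value_discharged - cost_to_charge:.0f} EUR")
```

Even after round-trip losses, a wide enough midday-to-evening spread leaves a clear margin per cycle; the economics hinge on how often such spreads occur and on the battery's cycle cost.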
Behind-the-Meter Power Generation
Another option is to equip data centers with their own behind-the-meter power plants. This means, for instance, installing rooftop solar panels that supply the data center directly with electricity as a first priority.
Any surplus power could be used to charge a directly connected battery storage system. Once the battery is fully charged, excess energy could be fed into the grid.
Large data centers, some of which reach a power consumption of 100 MW or more, are set to be outfitted with on-site gas-fired power plants. Google, meanwhile, plans to build so-called SMRs (Small Modular Reactors)—compact, flexible nuclear power units—to help bridge gaps in renewable energy supply in the coming years. Whether these reactors will be connected via the grid or behind the meter remains undecided.
Such an approach naturally requires substantial additional investment. Proponents, however, argue that it can actually make it easier to get new data centers approved—since it reduces the strain on local power grids. That’s the case made by project developer Sheldon Kimber in the Catalyst podcast. The logic: these systems can help relieve pressure on the grid, especially during peak daytime hours, because the required electricity is generated and consumed behind the meter—outside the public grid.
What’s the situation for data centers in Germany?
Germany has the largest data center capacity in Europe, followed by the United Kingdom and France. For operators in Germany, the question of how to align their facilities with sustainable energy sources is particularly pressing. Since the beginning of 2024, they’ve been subject to new requirements under the Energy Efficiency Act, which mandates that at least 50 percent of their electricity—at least on paper—must come from renewable sources. By 2027, that share must rise to 100 percent.
That doesn’t mean every kilowatt-hour must physically come from a solar or wind farm—which would barely be feasible in practice. But operators are required to prove that the electricity they consume has been matched by an equivalent amount of sustainable generation.
In the latest study “Data Centers in Germany” by the German IT industry association Bitkom, 78 companies were asked what steps they’ve taken to reduce emissions at their data centers. Three-quarters of them reported having signed green power contracts. While this is certainly the most convenient option, it’s not necessarily the best value.

Power Procurement for Data Centers
Let’s summarize the starting point for power procurement in a data center: high electricity demand meets a growing potential to make consumption more flexible. As energy traders, we recommend a dynamic supply strategy—one that combines multiple modules to effectively address this specific situation.
The first building block should involve locking in electricity prices for a portion of the data center’s consumption. For example, around 70 percent of the load can be covered through the purchase of solar and wind profiles on the PPA market, with prices fixed for a period of one to three years.
The next building block is covering the remaining 30 percent of total electricity needs through short-term power trading—known as spot market exposure. This portion is not price-fixed; instead, it represents a flexible cost component. But how can a data center hedge the inherent risks of this flexibility without giving up the opportunities that come with it—especially considering that spot market prices can move in either direction?
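How the two modules combine into an effective price can be sketched with a simple blended-cost calculation. The annual load, PPA price, and spot scenarios below are illustrative assumptions, not a quote:

```python
# Sketch of the blended procurement cost under the two-module strategy:
# 70 percent fixed via PPA, 30 percent exposed to the spot market.
# All prices and volumes are illustrative assumptions.
annual_load_mwh = 200_000
ppa_share, ppa_price = 0.70, 75.0     # EUR/MWh, fixed for 1-3 years
spot_share = 1 - ppa_share

def blended_cost(avg_spot_price_eur_mwh):
    """Average cost per MWh across the fixed and spot-exposed portions."""
    fixed = annual_load_mwh * ppa_share * ppa_price
    variable = annual_load_mwh * spot_share * avg_spot_price_eur_mwh
    return (fixed + variable) / annual_load_mwh

for spot in (40, 75, 110):            # cheap, neutral, expensive spot year
    print(f"avg spot {spot:>3} EUR/MWh -> blended {blended_cost(spot):.1f} EUR/MWh")
```

The fixed 70 percent dampens exposure in both directions: an expensive spot year raises the blended price by only 30 percent of the spot move, while a cheap one still passes part of the savings through.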
We optimize your power procurement
Our expertise in optimizing and trading consumption profiles, combined with the longest trading experience on German short-term markets and our PPA platform PowerMatch, offers you dynamic power procurement.
This presents two options that, ideally, can be combined. First, a data center can leverage its flexibility in power consumption—buying electricity only during low-price periods and aligning usage accordingly. Second, a virtual battery—essentially a PPA with a battery operator—can be used to hedge the non-fixed portion of electricity demand. If this backup power isn’t needed—because the center’s own flexibility allows it to procure residual volumes cheaply on the spot market—it can instead be sold at a profit during high-price periods.
Flexibility Pays Off
Data centers are most likely to benefit from price advantages when they align their electricity consumption with the availability of renewable energy. That’s because power—regardless of the generation source—is typically cheapest during those times. In other words: data centers that manage to shift most of their demand to sunny midday hours or windy nights will see significantly lower electricity costs than those operating at full load on sunless winter afternoons.
How do German data centers plan to make their electricity consumption more flexible?
In an expert survey conducted for the Bitkom study, 52 percent of respondents saw the expansion of renewables and greater flexibility as an opportunity for the industry. When asked which technologies hold the greatest potential for making grid electricity consumption more flexible, most of the 110 experts pointed to thermal energy storage. Just over 40 percent expect hydrogen and synthetic fuels (e-fuels) to play a role as energy sources for backup generators. Two out of five cited lithium-ion batteries, while just over half as many mentioned sodium-based batteries. Interestingly, the possibility of making computing loads more flexible was not mentioned at all.
