17 February 2026

Data centers as new network flexibility factories

The future of energy and the future of artificial intelligence intersect in the same infrastructure: a network that thinks, learns, and self-regulates. Because intelligence, in the end, is not just in the data we process, but in the way we decide to power it.

Data centers have moved out of computer architecture textbooks and onto the energy transition agenda. In the United States, their consumption has risen in less than a decade from about 58 to 176 TWh: more than 4 percent of national electricity demand, with scenarios pointing to a possible jump to 6-12 percent by 2030. This is not a passing wave; it is a strengthening current. The engine has a name that is now ubiquitous: artificial intelligence.

The promise of AI has an energy-hungry downside. The heart of a data center (servers, GPUs, switches) never sleeps: it stays on around the clock. For grids, this means a constant base load, a regular and predictable rhythm, capable of providing stability but also imposing relentless demand. What changes over the course of the day are the support systems that keep that infrastructure alive: cooling, ventilation, lighting. These are what carve the hourly curve, with peaks that chase the hottest hours and press on the grid just when energy costs the most.

The key word is flexibility

Then AI shuffles the cards once more. GPUs work in cycles that alternate training and inference; the load stops being flat and becomes pulsed. It is more complex to manage, but also potentially more elastic. Here the United States serves as a laboratory, and flexibility is the key word. The Rocky Mountain Institute distinguishes three levers that data centers can pull. Temporal flexibility shifts what is not urgent (a compute batch, a cooling cycle) to hours when power is most abundant. Spatial flexibility redistributes load across locations to follow local renewable availability or grid capacity. And storage and generation flexibility integrates advanced UPS systems, batteries, and microgrids to cut demand at critical moments and provide grid services.
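To make the temporal lever concrete, here is one minimal way to express the idea in Python: given a day-ahead hourly price forecast, slide a deferrable job to the cheapest contiguous window. The prices and the three-hour job are invented for illustration, not real market data, and the greedy search is a sketch, not any operator's actual scheduler.

HOURLY_PRICES = [42, 38, 31, 27, 25, 26, 34, 55, 70, 64,  # EUR/MWh, hours 0-9
                 58, 52, 49, 47, 50, 61, 78, 92, 85, 73,  # hours 10-19
                 60, 51, 46, 43]                           # hours 20-23

def cheapest_window(prices: list[float], duration_h: int) -> tuple[int, float]:
    """Return (start_hour, energy_cost_per_mw) of the cheapest contiguous window."""
    best_start, best_cost = 0, float("inf")
    for start in range(len(prices) - duration_h + 1):
        cost = sum(prices[start:start + duration_h])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

start, cost = cheapest_window(HOURLY_PRICES, duration_h=3)
print(f"Run the 3-hour training batch at hour {start}: {cost} EUR per MW drawn")
# -> Run the 3-hour training batch at hour 3: 78 EUR per MW drawn

The same search generalizes from price to carbon intensity, a distinction that turns out to matter, as we will see below.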

In this scheme, data centers stop being just users and become assets. The Electric Power Research Institute sees them as grid assets; the Department of Energy talks about grid-interactive architectures, where IT systems talk to the power system in real time. It’s not theory: an MIT/CEEPR study estimates that well-integrated flexibility can reduce overall system costs by about 3 to 4 percent. Small numbers, some will say; but in an industry where every percentage point is worth billions, the impact is far from negligible.
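What "grid-interactive" means in practice can be sketched in a few lines. The control loop below is a toy illustration of the idea; the grid_signal feed and the set_power_cap hook are hypothetical placeholders, not real APIs, and the megawatt figures are invented.

import random
import time

NOMINAL_MW = 100.0        # illustrative facility load
CRITICAL_FLOOR_MW = 60.0  # always-on IT load that must never be shed

def grid_signal() -> float:
    """Hypothetical real-time grid stress signal, normalized to 0..1."""
    return random.random()  # stand-in for operator telemetry

def set_power_cap(mw: float) -> None:
    """Hypothetical hook into the facility's workload orchestrator."""
    print(f"Capping facility draw at {mw:.1f} MW")

for _ in range(3):  # a real loop would run continuously
    stress = grid_signal()
    # Shed only the flexible share of load, in proportion to grid stress.
    flexible_mw = NOMINAL_MW - CRITICAL_FLOOR_MW
    set_power_cap(CRITICAL_FLOOR_MW + flexible_mw * (1.0 - stress))
    time.sleep(300)  # re-evaluate every five minutes

The point is architectural: the cap is negotiated between the IT layer and the power system in real time, rather than imposed after the fact by a breaker.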

There is a caveat, however, that cannot be dodged. Flexibility is a tool, not an end in itself. In markets still fueled by coal or gas, shifting loads can lower costs while simultaneously increasing emissions, because the cheapest hours are not necessarily the cleanest. This is the paradox of the digital transition: the power of cloud and AI requires governance that directs flexibility where it creates real environmental value, starting with a progressively decarbonized generation mix.
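A back-of-the-envelope comparison shows how the paradox arises. The prices and marginal emission factors below are invented for illustration: in a coal-heavy system, the cheapest night-time hour can carry far higher marginal emissions than a pricier, solar-rich midday hour.

hours = {
    # label: (price in EUR/MWh, marginal CO2 in kg/MWh) -- invented numbers
    "03:00": (25, 900),  # cheap night power, coal on the margin
    "13:00": (47, 250),  # pricier midday, solar-heavy mix
}
job_mwh = 10  # an illustrative 10 MWh deferrable batch

for label, (price, co2) in hours.items():
    print(f"{label}: cost {price * job_mwh} EUR, "
          f"emissions {co2 * job_mwh / 1000:.1f} t CO2")
# 03:00: cost 250 EUR, emissions 9.0 t CO2
# 13:00: cost 470 EUR, emissions 2.5 t CO2

A price-minimizing scheduler picks 03:00; a carbon-aware one pays 220 EUR more and avoids 6.5 tonnes of CO2. Which choice wins depends entirely on what governance asks the scheduler to optimize.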

Doubling in sight

The global picture confirms the trajectory. The International Energy Agency projects that data center consumption could double by 2030, exceeding 900 TWh. What we see today in the United States is a foretaste: Europe, Canada, China, India, and Southeast Asia are moving in the same direction. Everywhere, the growth of cloud, AI, and edge computing will raise similar issues: grid balance, available capacity, price stability, climate footprint. The difference will lie in the quality of integration: whether countries can turn flexibility into a structural response, or leave it on the sidelines as a tactical exception.

In Europe, the Data Centre Code of Conduct and new local flexibility markets open the door to active participation by these actors. In Italy, the evolution of MACSE and of demand response mechanisms can drive the same process, stitching the energy and digital transitions into a single industrial supply chain. The regulatory push toward 24/7 carbon-free energy models and smarter use of distributed storage suggest a clear path: data centers not as a problem, but as part of the solution.

Meanwhile, other regions are moving. In Canada, the race for AI data centers is saturating urban grids and forcing accelerated infrastructure build-out. In China, regional computing zones generate intermittent loads that must be managed. In India, public cloud expansion is forcing quick choices on capacity and reliability. And in Europe, regulatory pressure is translating into experiments that aim to make flexibility models replicable.

Bringing two revolutions into dialogue

The point of convergence is clear: the future of energy and the future of artificial intelligence intersect in the same infrastructure, a network that thinks, learns, and self-regulates. Because intelligence, in the end, is not just in the data we process, but in the way we decide to power it. If flexibility is put at the service of a cleaner, more reliable, more cost-effective grid, data centers will become the new balancing factories of the electricity system. If it remains a set of isolated arrangements, we will have missed the chance to bring into dialogue two revolutions, the digital and the energy revolution, that together can change the way our cities, our industries, and our daily lives work.

Reviewed and language edited by Stefano Cisternino