Nvidia-backed trial shows AI data centers can flexibly adjust power use in near real time, with global implications for energy consumption — suggests hyperscalers can reduce consumption as necessary, ensuring grid isn’t overloaded during peak demand
Source: Tom’s Hardware

Study Overview
A U.K. study conducted by the National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute (EPRI) found that AI data centers do not need to operate at peak demand continuously. According to Bloomberg, the report suggests that hyperscalers can quickly adjust their power consumption, preventing grid overload during peak periods while also taking advantage of excess renewable energy when demand drops.
If AI technology companies adopt this flexible approach, they could bring more capacity online sooner, mitigating the current bottleneck caused by limited grid power—a factor that has been hampering the deployment of additional AI GPUs.
Flexibility Findings
- The tested system is highly adaptable and requires minimal advance scheduling.
- Data-center operators were able to cut power usage to around 66% of normal within a minute.
- One data center demonstrated the ability to run at 10% capacity for 10 hours.
These results show that AI workloads can be throttled without extensive planning, offering a practical tool for grid management.
Implications for Hyperscalers
Many hyperscalers prefer to run at full power to maximize the utilization of expensive GPUs, but insisting on constant peak consumption forces them to:
- Wait for grid upgrades, which can take years.
- Deploy onsite generators, a costly solution unless they have deep pockets (e.g., OpenAI’s gas‑turbine deployment).
- Navigate regulatory and permitting challenges associated with on‑site power generation.
By embracing demand flexibility, hyperscalers could avoid these hurdles and accelerate the rollout of AI infrastructure.
The “Great British Kettle Surge”
The United Kingdom experiences a phenomenon known as the “Great British Kettle Surge,” where power providers brace the grid for sudden spikes during TV event breaks (e.g., the World Cup) as millions of households boil kettles simultaneously.
Instead of ramping up generation from already strained power plants, grid operators could ask AI data centers, now among the largest electricity consumers, to temporarily reduce demand. Although regulatory support and industry coordination are still needed, this approach offers a short-term way to keep the grid stable while allowing data centers to come online more quickly.
