Nvidia-backed trial shows AI data centers can flexibly adjust power use in near real time, with global implications for energy consumption — suggests hyperscalers can reduce consumption as necessary, ensuring grid isn’t overloaded during peak demand
This is a win-win short-term solution for both AI hyperscalers and grid operators.
A U.K. study run by the National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute (EPRI) has shown that AI data centers do not need to run at peak demand at all times. According to Bloomberg, the report suggests that hyperscalers can quickly adjust their power consumption as needed, ensuring that the grid is not overloaded when other consumers increase demand, while soaking up excess renewable energy when demand falls. If AI companies agree to such an arrangement, it could allow them to come online much sooner, especially as the lack of available grid power has been hampering the deployment of more AI GPUs.
The system tested is very flexible and requires little advance scheduling. In testing, a data center operator took less than a minute to reduce its power draw to around 66% of normal, and one data center even ran at just 10% capacity for 10 hours.
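To make the idea concrete, here is a minimal sketch of how a scheduler might curtail workloads in response to a grid signal. Everything here is illustrative: the function names, the normalized `grid_stress` signal, and the stress thresholds are assumptions, not details from the trial; only the 66% and 10% power levels come from the reported results.

```python
def target_power_fraction(grid_stress: float) -> float:
    """Map a hypothetical normalized grid-stress signal (0.0 = slack,
    1.0 = emergency) to the fraction of full power the site may draw.

    The 0.66 and 0.10 levels mirror the figures reported in the trial;
    the thresholds themselves are made up for illustration.
    """
    if grid_stress >= 0.9:   # severe stress: deep curtailment
        return 0.10
    if grid_stress >= 0.5:   # moderate stress: partial curtailment
        return 0.66
    return 1.00              # normal operation: full power


def curtail(jobs: list[dict], full_power_mw: float, grid_stress: float) -> list[dict]:
    """Keep the highest-priority jobs running under the power cap.

    Each job is a dict with 'name', 'power_mw', and 'priority'
    (higher = more important). Jobs that don't fit are paused.
    Returns the jobs left running.
    """
    cap = full_power_mw * target_power_fraction(grid_stress)
    running, used = [], 0.0
    for job in sorted(jobs, key=lambda j: -j["priority"]):
        if used + job["power_mw"] <= cap:
            running.append(job)
            used += job["power_mw"]
    return running
```

For example, with a 100 MW site under moderate stress (cap 66 MW), a 30 MW inference job and a 10 MW batch job at higher priority would keep running, while a 60 MW training job would be paused until the grid signal clears.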
This isn’t an ideal scenario for many hyperscalers, which want as much power as they can get to maximize the utilization of their expensive GPUs. However, insisting on full power at all times means either waiting for the local grid to catch up (which can take years, if not decades) or deploying their own onsite generators (which can get expensive, unless they have deep pockets like OpenAI, and which come with their own set of challenges).
This would be a win-win if data centers and electricity grid operators could agree to run such a system: the former could connect their infrastructure to power sooner, while the latter could maximize grid utilization even during off-peak hours. In fact, many utility companies already operate systems that monitor spikes in demand and boost supply as needed.
There is a phenomenon called “The Great British Kettle Surge,” wherein U.K. power providers prepare the electricity grid for the expected surge during breaks in popular TV broadcasts such as the World Cup. Millions of households boil their kettles simultaneously during these breaks, producing a sudden spike in power demand that lasts a few minutes.
Now, instead of pouring in more electricity from power plants, which are already in short supply, grid operators could simply ask AI data centers, which are among the biggest power consumers, to reduce their demand. It might take some time, and perhaps a push from regulators, before this can be implemented, but it’s a good short-term solution that would allow data centers to come online as soon as possible without breaking the grid.

edzieba Industrial-scale demand/load scheduling is nothing new, and has been going on for many decades in other industrial sectors.
The main 'news' here is the recognition that, unlike most hyperscaler HPC and DC workloads, 'AI' training workloads can be scaled up and down pretty trivially, so they can be slotted into existing scheduling architecture.