Google agrees to pause AI workloads when power demand spikes
48 points | 10 hours ago | 9 comments | theregister.com
acc_297
9 hours ago
This is a common agreement to have with industrial power users. I know in Quebec during the coldest days in winter industrial users are required to scale back.

I would hope there aren't too many large utility jurisdictions which would curtail citizen consumers in favour of industrial users in the event of a demand surge.

On a related note: it's worrying to me how quickly we've accepted that we're going to boost electricity consumption massively before achieving anything close to the carbon-intensity reduction targets that would mitigate the worst climate change effects. It's all driven by a market force that cannot be effectively regulated on a global scale, because multinational tech firms can shop around for the next data centre location with near-total freedom. And with advances in over-the-top fibre networks and the like, a tonne of AI demand can be met by a compute cluster on the other side of the world (especially during model training), so the externalities of the computing infrastructure can theoretically be dumped entirely somewhere far away from the paying customer.

everdrive
8 hours ago
It's extremely important to our future that people can vomit out fake movie trailers, and that CEOs can hire fewer people. This is just a cost we need to bear. For sure, any problems introduced by _this_ complex technology will just be solved by future and even-more-complex technologies.
jeffbee
9 hours ago
But electricity consumption is a minority of energy consumption. It was always true that decarbonization was going to massively increase the use of electricity.
schiffern
1 hour ago

  >But electricity consumption is a minority of energy consumption.
Everything is a minority of energy consumption.

https://commons.wikimedia.org/wiki/File:US_energy_consumptio...

If your standard is "I won't do anything unless it's the majority of energy consumption," you're really just saying don't do anything period.

  >It was always true that decarbonization was going to massively increase the use of electricity.
That was the plan (and it could have worked too), but what actually happened is the new decarbonized energy is supplementing fossil fuel instead of replacing it.
acc_297
8 hours ago
Yes, but the task becomes that much harder: we are scaling up natural-gas generation not to phase out coal but simply to meet demand that wouldn't exist without the fierce competition to build the biggest LLM. Any feasible plan made five years ago that might have worked to transition a large industry from fuel-burning energy sources to electricity generation (renewable or otherwise) is made 10x harder by this rapid rollout of datacentre capacity.
jeffbee
8 hours ago
I'm not sure if that's true or not. As the article indicates, training for AI is naturally demand-responsive: it can shift in time, and it can move around the world to minimize carbon footprint. See the PDF I just love to share: "Revamped Happy CO2e Paper" https://arxiv.org/pdf/2204.05149
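The "shift in time, move around the world" idea is easy to sketch. Below is a toy carbon-aware scheduler, not Google's actual system; the regions, hourly carbon-intensity forecasts, and units are all invented for illustration. It places one deferrable training job in the lowest-carbon region-hour slot:

```python
# Toy carbon-aware scheduler (hypothetical data, not Google's system):
# pick the (region, hour) slot with the lowest forecast grid carbon
# intensity, in gCO2e/kWh, for a deferrable training job.

FORECASTS = {  # hypothetical hourly forecasts, gCO2e/kWh
    "us-central": [450, 430, 410, 380, 520, 560],
    "europe-north": [120, 110, 100, 140, 160, 150],
    "asia-east": [600, 580, 570, 590, 610, 620],
}

def best_slot(forecasts: dict[str, list[int]]) -> tuple[str, int, int]:
    """Return (region, hour, intensity) for the cleanest available slot."""
    region, hour = min(
        ((r, h) for r, hours in forecasts.items() for h in range(len(hours))),
        key=lambda rh: forecasts[rh[0]][rh[1]],
    )
    return region, hour, forecasts[region][hour]

region, hour, intensity = best_slot(FORECASTS)
print(region, hour, intensity)  # europe-north 2 100
```

A real scheduler would also weigh data-residency rules, network egress, and hardware availability, but the objective function is this simple.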
acc_297
7 hours ago
Agree on training. But that Google paper was written when the only image model available for broad public consumption was DALL-E 2 and video models were more than a year away. It gets a mention in a more recent 2024 paper [1], which goes into detail about how inference, rather than training, creates the difficult-to-manage energy load that grids struggle to meet. If consumer interests and demands drive what companies offer in inference capability, then it's fair to worry that the impact on sustainability goals will be an afterthought.

[1] https://dl.acm.org/doi/pdf/10.1145/3630106.3658542

jeffbee
7 hours ago
The second author of that paper is the person who got turfed out of Google for refusing to use actual energy consumption and insisting on their flagrantly wrong estimate of inference energy costs. See, e.g., the rebuttal by Dean: https://x.com/JeffDean/status/1843493504347189746
acc_297
6 hours ago
The number was correct to a reasonable degree under the assumptions the author stated in the paper that tweet references: the estimates were obtained from consumer-grade hardware and the carbon intensity of the average kilowatt-hour produced in the United States, not from a hyperscale datacentre run using "ML best practices", though this distinction is left out of various lay-media citations. The number also did not pertain to inference; it was associated with training a particular pre-2019 model.
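The shape of the dispute is simple arithmetic: an emissions estimate is energy consumed times the carbon intensity assumed for the supplying grid, so swapping consumer-grade hardware and US-average intensity for hyperscale hardware on a cleaner grid changes the answer by multiples. A toy sketch, with all numbers hypothetical and taken from neither paper:

```python
# Illustrative only: how the grid-intensity assumption drives a training
# emissions estimate. All figures below are hypothetical.

def training_emissions_kg(energy_kwh: int, intensity_g_per_kwh: int) -> float:
    """Emissions = energy consumed x carbon intensity of the supplying grid."""
    return energy_kwh * intensity_g_per_kwh / 1000  # grams -> kilograms

ENERGY_KWH = 100_000   # hypothetical training run
US_AVERAGE_G = 400     # gCO2e/kWh, rough US grid average (illustrative)
CLEAN_DC_G = 80        # gCO2e/kWh, hypothetical low-carbon datacentre

print(training_emissions_kg(ENERGY_KWH, US_AVERAGE_G))  # 40000.0
print(training_emissions_kg(ENERGY_KWH, CLEAN_DC_G))    # 8000.0
```

Same model, same energy, a 5x swing in reported emissions purely from the intensity assumption, which is why the two sides can both be "right" under their own framing.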
kerblang
8 hours ago
> a process Google calls “demand response”

A process everybody else calls "demand response" as well. It's a longstanding practice between power utilities and big customers.

For the little people, in some hot cities you can agree to use a utility-supplied thermostat that automatically curtails your AC during peak load and get a discount in exchange.
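That residential scheme boils down to a one-line control rule: when the utility declares a peak event, nudge the cooling setpoint up to shed AC load. A toy sketch, with the setpoint and offset values invented:

```python
# Toy model of a utility demand-response thermostat (hypothetical values):
# during a declared peak event, raise the cooling setpoint a couple of
# degrees so the AC runs less; otherwise leave it alone.

def effective_setpoint(base_setpoint_c: float, peak_event: bool,
                       offset_c: float = 2.0) -> float:
    """Return the cooling setpoint the thermostat will actually hold."""
    return base_setpoint_c + offset_c if peak_event else base_setpoint_c

print(effective_setpoint(22.0, peak_event=False))  # 22.0
print(effective_setpoint(22.0, peak_event=True))   # 24.0
```

Real programs also cap how many event-hours per season the utility may claim, which is part of what the discount pays for.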

Pet_Ant
9 hours ago
If AI training becomes something that only happens with surplus power and takes the place of the Bitcoin mining nonsense, that could really be a net positive.
im3w1l
8 hours ago
Maybe it could happen partially, but cryptomining has the advantage that it can adapt to the power available and barely needs any IO, so if there is a single oil well in the middle of nowhere you could use the natural gas for mining instead of flaring it.

I think AI wants large centralized data centers with really good internal networks.
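The asymmetry is easy to see in a sketch: a mining operation just scales rig count to whatever power a stranded generator offers at the moment, with essentially no bandwidth requirement. The figures here are invented:

```python
# Toy illustration of power-adaptive mining (all figures invented):
# run as many rigs as the fluctuating local supply can feed right now.

RIG_WATTS = 3_000  # hypothetical draw per mining rig

def rigs_supported(available_watts: int, rig_watts: int = RIG_WATTS) -> int:
    """Number of rigs a variable power supply can run at this instant."""
    return available_watts // rig_watts

# A gas-well generator's output swings; the mine simply follows it.
for watts in (10_000, 4_500, 0):
    print(rigs_supported(watts))  # 3, then 1, then 0
```

A distributed training job can't follow supply this way without stalling its synchronization steps, which is why it gravitates to big interconnected sites.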

iamgopal
8 hours ago
Is there any research in this area? Crypto, and particularly Bitcoin mining, has massive capacity for computation, albeit at a lower memory scale. If an AI memory model could be encoded into the blockchain, we could get some benefit from Bitcoin mining.
tossandthrow
8 hours ago
If you find this interesting, there is a small body of research on the intersection of federated learning and blockchains, but definitely nothing interesting yet.
HocusLocus
5 hours ago
AI [companies] are starting to publicly frame AI concerns around grid power demand. They think this broadcasts them as "good corporate stewards" among people concerned about grid energy demand and 'climate change' and 'AI carbonara' in general.

In North America, the US has committed to natural gas turbines and petroleum forever. Not because of any good or evil stooges in politics, but because it rejected the energy density of nuclear power, for whatever reasons, decades ago. Nuclear was then, and is now, the ONLY other comparable energy source on the table.

It will backfire. AI people (like Musk, who sensibly went big with natural gas turbines in Memphis) are being targeted and chased because of those turbines. And AI is NOT the (direct, local) job creator it was promised to be.

What it's actually doing is conflating the two. People will react to turbine fumes from AI plants today, to a greater extent than they would even react to a natural gas electric power plant on the same property.

So AI plants will be chased away from cities, relocate near existing power plants, then they will be attacked again by people forcing them to buy 'carbon credits' directly. Most cannot relocate near a nuclear plant because there will be no nuclear growth in the short term and AI lives in the short term.

So AI [plants] will be chased away from people, and then into orbit.

Is everyone okay with that?

exiguus
4 hours ago
I think it's the same in Europe (with a few exceptions) – people don't want nuclear power anymore (because of the high construction and maintenance costs and the waste). In addition to renewable energies, gas turbine power plants are also being used (because of their low construction costs, they can be ramped up quickly, and they can be converted to hydrogen later).

In my opinion, AI companies should be required to generate a sufficient percentage of their energy themselves from renewable sources.

HocusLocus
3 hours ago
Then AIs will be competing with humans for a thin gruel of energy, where even recently familiar things like overnight street lighting and heated schools in winter will be dim memories. The 'sufficient' goalpost will slide around for a while.

In the end AI still gets chased into orbit. It needs to launch itself into orbit and bypass years of bad road.

codemac
9 hours ago
Everyone has similar agreements with local power grids, and pretty much all DC operators respond to demand reduction calls from the grid.

It's extremely rare for a DC to put an entire community's grid at risk, and they are usually working closely with the upstream power providers during any storm, increased demand, etc.

I think there is a lot of hand-wringing from folks who have never worked in DC operations.

im3w1l
9 hours ago
Am I right to read this as power now being a bigger part of training cost than chip depreciation?
jeffbee
9 hours ago
Google has published multiple times that, to a fair approximation, costs track energy.
skybrian
8 hours ago
Interesting. Any recommended reading on this?
mrkramer
7 hours ago
No I want my YT vids and AI summaries! /s