Like Harold Wilson and his ill-defined ‘white heat of technology’, Keir Starmer has latched on to artificial intelligence as the saviour which is finally going to jolt Britain’s sluggish economy into growth. He once even suggested it would help fill potholes. A year ago he launched his AI Opportunities Action Plan, which is supposed to give the industry a huge boost through the designation of ‘AI Growth Zones’.
But there is a big hole in Starmer’s plans. How are we going to power an industry that has become as voracious in its energy needs as the steel, shipbuilding and other heavy industries which it might one day replace?
The high energy consumption of AI might not seem obvious to anyone playing around with ChatGPT. It all seems so clean and modern. Indeed, until a decade ago, energy consumption wasn’t really a big factor in the evolution of tech. Computing power increased dramatically, and the internet grew ever bigger, yet energy efficiency was able to keep up with it, with the result that the total energy consumption of the sector was largely flat. At the turn of the century, you may have had a desktop computer with a large and noisy fan trying to keep the components cool. Now you probably have a quiet, slim laptop which can do far more processing while generating far less heat.
In the past few years, that has all begun to change dramatically as the sophistication of AI models runs far ahead of efforts to reduce their power needs. Last year, a study published in the MIT Technology Review illustrated the explosion in energy use. It estimated the electricity consumed by Meta’s AI model Llama 3, introduced in July 2024, when asked to come up with a travel itinerary for a trip to Istanbul: enough to ride an e-bike for six feet. Meta also introduced a larger version of the model. To do the same job – but better, one hopes – it consumed enough electricity to ride the same bike 400 feet.
MIT also looked at a Chinese AI model called CogVideoX. Producing a five-second video with the original version, introduced in August 2024, consumed one e-bike-mile’s worth of electricity. Three months later, a newer, improved version was consuming 38 e-bike-miles’ worth for the same task.
The demands that AI is about to place on the grid will be extraordinary. The Trump administration’s Stargate Project plans to invest $500 billion over the next four years building ten data centres in the US which could consume up to 50 gigawatts of power between them. That is nearly twice the average rate at which Britain consumed electricity last year. If we really are going to be a major global player in AI, we are going to have to produce an awful lot more power from somewhere – and to do so reliably at all times of day, not just when the weather is sunny and windy.
The trouble is that Britain’s capacity for generating electricity is going backwards. Clean or unclean, our power stations combined are producing only three-quarters as much as they were in 2000. To put it crudely, for every extra gigawatt-hour of energy we have produced from wind and solar, we have cut two gigawatt-hours of output from our coal, gas and nuclear power stations.
Why haven’t the lights gone out? Firstly because demand for electricity has fallen: industry is consuming 28 per cent less electricity than in 2000, largely on account of the decline of heavy industry, while domestic consumption has fallen by 16 per cent. A big factor in this is more efficient appliances and in particular LED light bulbs, which use only a tenth of the power of a traditional incandescent bulb. The fall in demand for electricity has started to flatten, however, as electric cars, heat pumps and AI place new pressures on the grid.
Even so, British power generation has plunged far faster than demand, the gap being made up by electricity imported via subsea cables. In 2024, we imported 43.7 terawatt-hours of electricity in this way – around a seventh of the total supplied. The idea of subsea cables was supposed to be that the flow would be in both directions: we would export as well as import it, to take advantage of different patterns of demand. Yet with the exception of 2022, when the near-Continent was short of gas due to the Ukraine war and a number of French nuclear power stations were offline, we have ended up importing far more than we are exporting – four times as much in 2024.
What is the chance of Britain’s electricity system responding to increased demand from AI? The Future Energy Scenarios report published by the National Energy System Operator (NESO – the new quango which oversees the electricity grid) predicts that annual electricity consumption in Britain will more than double, from 290 terawatt-hours in 2024 to between 638 and 947 terawatt-hours in 2050, depending on how the government chooses to go about meeting its net-zero target. It puts this down to the electrification of road transport and heating.
It doesn’t even mention AI, whose demand for power is growing far more quickly. It envisages massive expansion in wind and solar but also in energy storage capacity – from 37 gigawatt-hours’ worth in 2024 to between 150 and 200 gigawatt-hours’ worth in 2050. A report by the Royal Society in 2023 thought this to be a gross underestimate, however, arguing that if we are going to have a renewables-heavy grid, energy storage would need to grow 1,000-fold. Battery storage is not coming on nearly fast enough to cope with this. The vast majority of the energy storage we have at present is in the form of four pumped-storage hydro reservoirs in Scotland and Wales – facilities which require very specific mountainous topography.
The Prime Minister may see AI as a quick fix for sluggish economic growth. But there is little to suggest that his government’s energy policy is even nearly up to the task of supporting it. Maybe he should ask ChatGPT for a solution.
Ross Clark’s substack, Ross on Why, is launched this week.