The short answer is yes — and it already does in many areas.
While Artificial Intelligence (AI) is often promoted as a tool for efficiency and sustainability, the sheer energy required to train, operate and maintain AI systems is now threatening to outweigh the savings they generate.
As Britain and the rest of the world rush to adopt AI — in finance, healthcare, logistics, and consumer technology — its environmental footprint is growing faster than the efficiency gains can offset it.
Why AI Uses So Much Energy
The Training Burden
Training a large AI system requires enormous computational power. Each time companies like Google, Meta, or OpenAI build or retrain models, they use hundreds of thousands of high‑powered processors running for weeks at a time.
According to data cited by the University of Cambridge Energy Policy Research Group (2025), training a single advanced AI model like GPT‑4 produces up to 500 tonnes of CO₂ emissions — around the same output as 60 average Britons in a year.
These supposedly “one‑off” training runs are followed by continuous updates and retraining on new data, consuming more power with each cycle.
Data Centres: The Digital Factories of AI
AI doesn’t live in the cloud — it lives in data centres, massive warehouse‑sized facilities loaded with servers, processors, coolant systems and backup generators.
The UK already hosts more than 500 data centres, primarily concentrated around London, Slough and Manchester.
The National Grid ESO (2025) reports that data centres now account for 2%–3% of national electricity demand, and this share could double by 2030, largely due to AI workloads.
Cooling systems alone can consume as much energy as the computing itself — a grim irony when “smart” technology is sold as “green.”
The Myth of Net Energy Savings
Small Local Gains, Massive Global Costs
AI systems deliver small, visible improvements — slightly lower fuel usage in logistics, shorter travel times in public transport, efficient heating in “smart homes.”
However, these benefits rely on AI processing data constantly, which shifts the energy load from households to remote servers.
So, while your household smart thermostat might save a few kilowatt‑hours a week, the data centre analysing millions of similar thermostats requires gigawatt‑hours to run globally.
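A back-of-envelope sketch makes the scale mismatch concrete. All figures below are illustrative assumptions (number of homes, per-home savings, fleet power draw), not measured data:

```python
# Rough comparison of aggregate household savings vs. the server
# fleet analysing those devices. All numbers are hypothetical.

homes = 10_000_000              # assumed smart-thermostat households
saving_kwh_per_home_week = 2    # assumed saving per home per week
weeks = 52

# Total household savings, converted from kWh to GWh:
household_savings_gwh = homes * saving_kwh_per_home_week * weeks / 1e6
print(f"Household savings: {household_savings_gwh:,.0f} GWh/year")

# A hypothetical data-centre fleet drawing 400 MW continuously:
fleet_draw_mw = 400
fleet_use_gwh = fleet_draw_mw * 24 * 365 / 1000
print(f"Fleet consumption: {fleet_use_gwh:,.0f} GWh/year")
```

Under these assumed figures, the fleet's annual consumption (about 3,500 GWh) comfortably exceeds the roughly 1,000 GWh saved in homes; different assumptions shift the numbers, but the point is that the comparison must be made at fleet scale, not per device.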
The Jevons Paradox
This is the classic efficiency trap: the more efficiently a technology uses a resource, the cheaper and easier that resource becomes to consume — and total usage then skyrockets.
AI improves efficiency per task, but creates billions more tasks.
Voice assistants, image tools, automation systems — all “save time” by multiplying demand for computational power. The net result: far greater total energy consumption.
In short: the more AI we build to “save energy,” the more energy we end up burning.

How This Affects the UK Consumer
1. Higher Energy Costs
The cost of AI’s electricity use doesn’t simply disappear; it passes down the market chain to consumers.
Tech companies running AI need vast power resources — and in the UK, where wholesale electricity markets are already strained, rising data‑centre demand translates into higher prices.
According to Ofgem modelling in 2025, the UK’s data‑centre power demand could raise overall electricity costs by 2–3% by the end of this decade. That looks small on paper, but it is significant when spread across millions of households already facing price volatility.
Businesses deploying AI-powered tools for logistics, finance or energy systems will also pass on these costs to consumers in prices for goods and services.
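To put that 2–3% range in pounds, here is a quick sketch against an assumed typical annual household bill (the £1,700 figure is illustrative, not from the source):

```python
# Rough per-household cost of a 2-3% electricity price rise.
# The bill figure is an assumption for illustration only.

typical_annual_bill_gbp = 1700   # assumed typical annual household bill

for rise in (0.02, 0.03):
    extra = typical_annual_bill_gbp * rise
    print(f"{rise:.0%} rise -> about £{extra:.0f} extra per year")
```

On that assumption, the projected rise works out to roughly £34–£51 per household per year before any knock-on costs from businesses passing on their own higher bills.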
2. Subscription‑Based “AI Premiums”
Many industries are quietly transferring AI energy costs to users through subscription renewals, “premium services,” or “AI‑enabled” versions of ordinary software.
This means customers end up paying twice: once through utility bills (for the energy powering the network infrastructure) and again through digital usage fees disguised as new features or smart upgrades.
3. AI’s Hidden Carbon Cost
Energy prices may stabilise, but carbon intensity poses another consumer burden.
As the British government strengthens environmental rules, companies may face carbon taxes tied to AI processing.
Businesses will not absorb these costs — they will pass them on through higher prices or “green levies.”
So the consumer pays not just in money, but also as an involuntary participant in carbon offsetting, often without awareness or consent.
The Paradox of “Green AI”
Efficiency Technology That Consumes Itself
AI manufacturers promote energy‑efficient chips, smarter cooling systems, and renewable‑powered data centres.
Yet these improvements rarely match the growth rate of demand. Every layer of optimisation becomes an excuse to deploy more AI models, from driver monitoring to AI cameras in supermarkets.
By the time technology saves 20% of energy per operation, deployment rates may have risen by 200%.
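The arithmetic of that trade-off is simple. A minimal sketch, using the illustrative figures above (a 20% saving per operation against a 200% rise in deployment):

```python
# Net energy effect of per-task efficiency gains vs. deployment growth.
# Figures mirror the illustrative ones in the text.

def net_energy_factor(efficiency_saving: float, deployment_growth: float) -> float:
    """Return the total-energy multiplier after an efficiency saving
    per operation and a growth in the number of operations."""
    energy_per_task = 1.0 - efficiency_saving   # e.g. 0.8 after a 20% saving
    task_count = 1.0 + deployment_growth        # e.g. 3.0 after 200% growth
    return energy_per_task * task_count

# 20% saving per operation, deployment up 200% (three times as many tasks):
factor = net_energy_factor(0.20, 2.00)
print(f"Total energy multiplier: {factor:.1f}x")  # 0.8 * 3.0 = 2.4x
```

Even with every operation 20% leaner, total consumption in this scenario is 2.4 times higher — the Jevons Paradox in one multiplication.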
Renewables Are Not a Free Pass
Even as the UK expands wind and solar generation, energy diverted to feed AI infrastructure reduces what’s available for homes and public transport.
The Energy Systems Catapult (2025) warns that “digital demand will compete with domestic use,” meaning power grids in dense regions like the South East might face pressure to prioritise commercial data traffic over household consumption during peak hours.
Cynically put: if AI becomes too “hungry,” your lights might flicker not from lack of renewable power — but because too many machines are busy analysing your spending habits.
What’s Driving the Excessive Energy Use
Corporate Competition
Major firms — Google, Amazon, OpenAI and Meta — are locked in an “AI arms race.” Each company trains bigger, faster, more complex models in the pursuit of leadership.
This redundant competition means each firm re‑trains similar models, duplicating data and energy expenditure globally.
The Alan Turing Institute (2025) called this the “redundancy crisis of intelligence” — multiple firms spending hundreds of megawatt‑hours chasing the same outcomes for profit.
Unregulated Expansion
Unlike heavy manufacturing, the digital sector faces no statutory energy quota.
While British steel mills must account for every tonne of CO₂, AI developers face only voluntary reporting.
This regulatory gap allows data companies to operate massive energy‑intensive systems without direct national oversight.

The Real‑World Consequences for Britain
Energy Infrastructure Strain
AI’s hunger for constant energy will challenge the National Grid, especially as electric vehicles, heat pumps and smart housing accelerate demand.
If AI continues to expand unchecked, by 2035 the UK could face peak load pressures equivalent to adding several new cities’ worth of consumption annually.
This would require costly infrastructure upgrades — all eventually funded by taxpayers or through household energy tariffs.
Rising Data Inequality
AI costs also create a digital divide. Big corporations afford high‑performance AI systems; smaller businesses face skyrocketing cloud‑service fees.
The Federation of Small Businesses (FSB) cautions that UK SMEs risk being priced out of automation entirely — widening the productivity gap between multinational tech users and local companies.
Environmental Impact
AI’s reliance on rare‑earth materials for chips, and the cooling water used in large data centres, adds a further environmental toll. Recent data‑centre operations in England consume millions of litres of water daily, prompting criticism from sustainability advocates.
This means the “smart future” may be environmentally dumber than it looks — technologically elegant but ecologically unsustainable.
Conclusion: AI as an Energy Glutton
AI promised efficiency; it’s rapidly becoming one of the largest consumers of global electricity.
In the UK, gains will appear modest — smarter logistics, optimised power grids, automated control systems — but the background energy footprint will continue to surge as new AI services compete for attention and data.
Consumers will pay the price through higher bills, hidden service fees, and environmental levies.
Ironically, the future Britain faces could be one where everyday appliances are “energy smart,” but the nation as a whole becomes more energy‑poor.
The cynical truth?
AI won’t save energy — it will spend it more efficiently on behalf of those who profit from it. The lights will stay on, but they’ll cost a fortune to keep that way.
References (UK‑Focused)
- University of Cambridge – AI and Energy Policy Research Group Report, 2025
- National Grid ESO – Electricity Demand Outlook 2025–2035
- Ofgem – Energy Market Review: Digital Impact on Prices, 2025
- Energy Systems Catapult – AI and Infrastructure Efficiency Study, 2025
- Alan Turing Institute – The Redundancy Crisis of Intelligence, 2025
- Carbon Trust – Reducing the Carbon Cost of Data Centres in the UK, 2024
Summary
| Factor | Outcome | Consumer Impact |
|---|---|---|
| AI computation and data centres | Massive energy demand increase | Higher electricity prices |
| Efficiency gains | Local or temporary | Quickly offset by global expansion |
| Corporate competition | Duplicated model training | Unnecessary carbon and energy cost |
| Policy gap | No hard energy caps on tech firms | Public burden through infrastructure and tax |
| Cynical verdict | AI uses more than it saves | Efficiency marketed, not delivered |
Final thought:
AI will indeed make individual systems leaner — but only while the nation gets hungrier to power them. The future isn’t a clean digital utopia; it’s a smarter version of the same energy‑heavy economy, dressed up with glowing green logos and a very expensive electricity meter.