AI is quickly becoming a critical input to production, yet today the UK has very little stake in the AI value chain. At the same time, our economy is ~80% services, which likely makes the UK particularly exposed to automation. What unites the best people I’ve met working on UK AI strategy is that, even if they have doubts or disagreements about the precise timelines, they take this combination of trends — accelerating technical progress, high UK exposure, low UK value capture — very seriously.
So while AI scenarios are indeterminate, the directional choice facing the UK is clear. We can either:
1. Rapidly build our stake in the emerging AI value chain, enable economic growth, protect and widen our tax base, and retain geopolitical leverage and national agency; OR
2. Stand by while British labour is automated by foreign capital, lose control over the cost of and access terms for this newly critical economic input, and put our tax base, welfare system and social contract under even greater pressure.
Either by default or by design, we are currently opting for (2).
Sovereignty is often confused or conflated with self-sufficiency, protectionism, or owning the entire AI stack. The goal is not to domestically produce everything: this would lead to low-quality, uncompetitive capabilities. But as automation becomes attractive and AI becomes a critical economic input, imported access to models may be throttled, weaponised or taxed — leaving the UK exposed to supply shocks, higher input costs and rules we have no say over. Without domestic capacity and control, labour automation will amount to offshoring and we could be left streaming our economy from abroad.
We’ve seen a version of this before, as software ate the world and everyone discovered the implications of deprecation. When Microsoft closed its ebook store, we learnt a new phrase: “the books will stop working”. Or as Alex Danco wrote in Everything is Amazing, But Nothing is Ours:
Worlds of scarcity are made out of things. Worlds of abundance are made out of dependencies. That’s the software playbook: find a system made of costly, redundant objects; and rearrange it into a fast, frictionless system made of logical dependencies. The delta in performance is irresistible, and dependencies are a compelling building block: they seem like just a piece of logic, with no cost and no friction. But they absolutely have a cost: the cost is complexity, outsourced agency, and brittleness. The cost of ownership is up front and visible; the cost of access is back-dated and hidden.
National sovereignty is about having the power to durably control our destiny. As a friend said: “we can rely on global markets for the supply of almost every good or service we might ever need. Digital brains might be one of the few exceptions.”
So we must build capabilities that limit our dependencies and generate strategic leverage — by controlling particular niches or chokepoints — in order to secure access to all required capabilities. This requires major reforms to energy, planning, and regulatory bottlenecks to deployment. And we must also urgently reposition the state to secure the social contract, by anticipating and preparing early for the coming changes to our tax base, welfare state, education system and defence capability.
Context
AI is quickly becoming a critical input to the economy: not only are technical benchmarks being surpassed daily, but the top AI companies are generating revenues approximately 4x faster (FT) than the previous generation of leading software companies. This represents a large and accelerating shift in value capture which is likely to continue, as model improvement shows no sign of slowing.
The UK has very little stake in the new AI value chain: For decades, the UK’s economic model has relied on outsourcing core inputs and focusing on value-adding services. Now, we’re experiencing nearly 20 years of wage stagnation just as AI threatens to automate our remaining strength in services. Yet even with Google DeepMind in London, value is largely realised and taxed in the US. We have very little stake in the emerging AI value chain: energy generation, semiconductors, networking, datacentres, models, drones, robotics and applications.
Datacentre capacity is a key litmus test: As AI automates cognitive labour, domestic datacentre capacity is critical. The UK will want at least some AI inference (i.e. deployment), if not all, to run locally within its borders – e.g. due to military or regulatory sensitivities, or where low latency is particularly important (e.g. trading). Yet compute demand already massively outstrips supply, so middle countries with limited domestic capability — like the UK — are at real risk of missing out as leading nations secure their own supply first. Relying on other countries brings risks too: the US has already shown its willingness to use export controls for semiconductors, and tariffs more broadly, to put economic pressure on adversaries and allies alike. Given that you cannot ‘stockpile’ intelligence in the way you can for energy, if the UK wants control over its own economic production — a core feature of any sovereign state — then it must build domestic supply to avoid losing control over both the costs of, and access to, this newly critical economic input.1
The UK does have a large and growing base of datacentres built for cloud computing, some of which will be useful for AI.2 But a good test of AI seriousness is to look at the pipeline of supercomputing facilities specifically designed for AI training and/or inference. On that measure, only a handful of small facilities are live or planned in the UK, leaving us far behind not just the US, but comparable nations like France.
Where the UK needs to increase ambition
The UK is not yet taking the risk of losing this race — and the resulting risk of economic and geopolitical irrelevance — sufficiently seriously. It should change course in 3 ways:
1. Fixing energy and planning to accelerate datacentre build-out:
AI datacentres require lots of electricity: in just 3 years, they are likely to draw ~50GW of continuous power in the US alone, roughly 1.5x the average electricity demand of the entire UK economy.
Yet the UK’s energy policy today is not focused on building abundant capacity or doing so quickly: instead, the goal is 95% grid decarbonisation by 2030 and a fully decarbonised grid by 2035. This is easier to achieve if you assume only a modest increase in electricity demand, rather than planning for significantly more capacity to enable entire new classes of always-on, energy-intensive compute.
To that end, NESO forecasts electricity demand growth of only 11% by 2030, mostly from EVs and heat pumps, while estimating that datacentre electricity demand will increase from 5 TWh today to 22 TWh by 2030. Yet, as Alex Chalmers writes, this is considerably less than what DSIT thinks is necessary: DSIT’s Compute Roadmap forecasts that the UK will need “at least 6GW of AI-capable data centre capacity by 2030” and that “demand could exceed this baseline significantly”. If these datacentres draw 6GW of power continuously, this would amount to 52.6 TWh of demand annually by 2030: more than double the 22 TWh that NESO is planning for.
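For transparency, the arithmetic behind that 52.6 TWh figure (and the earlier ~50GW comparison) is a simple unit conversion. The sketch below assumes the datacentres run at full load around the clock, and the ~290 TWh figure for total annual UK electricity demand is my own rough assumption rather than a number taken from NESO or DSIT:

```python
# Back-of-envelope conversion from continuous power draw (GW) to annual energy (TWh),
# assuming round-the-clock operation at full load (an upper bound on real utilisation).

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def gw_to_twh_per_year(gw: float) -> float:
    """Convert a continuous draw in GW into annual consumption in TWh."""
    return gw * HOURS_PER_YEAR / 1_000  # GW * hours = GWh; 1,000 GWh = 1 TWh

# DSIT's 6GW baseline of AI-capable datacentre capacity by 2030:
print(f"{gw_to_twh_per_year(6):.1f} TWh")   # -> 52.6 TWh, vs the 22 TWh NESO plans for

# The ~50GW projected US draw, against roughly ~290 TWh of total annual UK demand:
print(f"{gw_to_twh_per_year(50):.0f} TWh")  # -> 438 TWh, i.e. roughly 1.5x the UK total
```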
Instead the UK should focus on energy maximalism — i.e. increasing electricity generation as much, and as quickly, as possible — in order for datacentres and other energy-intensive industries to be viable in the UK. This would also be a good way to distribute the benefits of AI: when power becomes a bigger constraint than technical progress, countries that can supply it will capture significant economic value.
The other variable here is cost. Although cheaper electricity wouldn’t hurt, the major cost driver for large datacentre projects is the upfront capex, so developers actually tend to care more about speed and grid capacity (e.g. it currently takes approx. 15 years to get a grid connection) than price. To that end, nuclear power is probably the only way to massively increase electricity generation safely, cleanly, reliably and with minimal land use (which also helps reduce transmission costs). And it could also be much cheaper: while project mismanagement is rife, one of the main reasons that UK nuclear is 4.5x more expensive than in other countries like South Korea is disproportionate regulatory burden.
But electricity prices clearly do matter for wider sovereignty concerns: the UK has the world’s highest industrial electricity prices, averaging 4x US industrial electricity costs in 2023, as Jonno Evans writes. This gap plays a significant role in where companies invest, hire and build new projects, particularly affecting energy-intensive strategic industries like chemicals, steel, batteries and manufacturing.
Unfortunately, it looks like one of the main drivers of cost increases is our focus on grid decarbonisation, due to both increased wholesale prices of renewables and increased transmission capacity requirements (e.g. to transport electricity from generators in the North to users in the South). As someone who cares a lot about the climate, I am increasingly concerned that we are overly focused on decarbonising electricity, which represents only ~20% of total UK energy use, rather than prioritising cheap electrification of dirtier industries like transport and heating to reduce overall emissions. Maybe electricity is clean enough for now?

2. Clearing regulatory bottlenecks to enable AI training and diffusion:
As countries adopt AI and reap the productivity gains at different rates, those differences will compound into large gaps in economic power. So it’s important to start compounding early.
As such, clearing barriers to deployment is critical for the UK to accelerate economic growth and remain a relevant power with influence over AI’s future trajectory. In practice, there are 2 particularly important dimensions to this:
Copyright restrictions mean it is currently illegal to train a competitive AI model in the UK, yet these restrictions do not even protect the UK’s creative industries, given that more liberal jurisdictions for model training exist elsewhere. The UK’s current policy is therefore a lose-lose. As Julia Willemyns has argued, the UK should accelerate a Japan-style commercial text and data mining exemption so that competitive models can be trained in the UK, in turn enabling us to support domestic value capture.
Elsewhere, regulatory bottlenecks hold back rapid, confident deployment of AI in health, transport, finance and many other sectors. We will need to clear these bottlenecks via derogations or non-enforcement agreements, by calling in regulatory decisions, or by issuing new licences to firms where regulators are moving too slowly. So far, the Regulatory Innovation Office seems to lack the powers required.
3. Repositioning the state to secure the social contract
Britain is caught between a fractured national spirit and the onrushing upheaval of this new world. This is fertile ground for zero-sum, destructive forces who tear down more than they build. To avoid presiding over a decline akin to that of the 1970s and 80s, this government must get ahead of these changes and prepare.
While there are lots of uncertainties in AI, it is almost certain that the UK economy will undergo a major re-ordering much more quickly than most policymakers expect. This is likely to include significant labour market automation and major changes to the tax base. The government should prepare for this shift by:
Planning for major shifts in the tax base: given the likelihood of increasing labour automation (starting with young people and recent graduates), payroll taxes like income tax and national insurance may shrink as a share of government revenues, while corporation taxes may grow in importance. Yet this also emphasises the importance of retaining UK HQs, so that value (and tax) is realised in the UK rather than shifting outside the country. Taxing less mobile sources of value, such as consumption and/or land, should also be considered.
Welfare, university & healthcare reform: The welfare state is already creaking and the tax burden is high, due to demographic pressures, waste, and a failure across the state and NHS to adopt even internet-era technologies. The university system is already falling short for students, researchers and public finances, as Ben Reinhart set out in Unbundling the University. Now, as graduate jobs get automated, we face an overhang of universities failing to prepare young people for employment and saddling both them and the state with the expense. The only way to transform any of these public service at scale — to improve outcomes while reducing costs — is via bold new national institutions organised around modern technologies.
Defence and industrial capacity: Strategic assets in the AI value chain can only be leveraged if you can defend them. For example, in trade disputes with the US, Taiwan and the Netherlands are unable, respectively, to leverage TSMC (which fabricates semiconductors) and ASML (which makes the extreme ultraviolet lithography machines critical to the semiconductor fabrication process) despite them occupying monopoly positions in the supply chain, because neither country controls its own defence. It is therefore critical to fix procurement and rebuild the UK’s defence and industrial base, particularly at a time of heightened geopolitical tensions and unreliable security guarantees.
This piece is intended a) as a personal exercise to develop and untangle my thinking on AI, energy, regulation and UK sovereignty, and b) to write down an argument that I think is often implicit, but rarely championed publicly. There’s a good chance I’ve been imprecise in parts or missed some key detail somewhere, so if I’ve got something wrong please let me know.
1. The combination of open source progress & on-device inference may offset some of these dependencies, but it’s too early to say.
2. Request for reading recommendations: I’d like to learn more about differences in compute capacity for cloud vs AI workloads, and the breakdown of existing UK datacentre capacity relative to those workloads. I also assume existing datacentre capacity is already at high utilisation, so perhaps in a crisis some sensitive AI workloads could be prioritised over other cloud applications, but we’d still be supply-constrained if we wanted to keep the entire economy running. Relatedly, I don’t think ‘sovereign compute’ necessarily requires public ownership: to increase total domestic capacity we will need many more privately owned facilities (e.g. owned by international hyperscalers). Meanwhile, private investment is clearly important for wider economic growth, and being rich is great for sovereignty! But I’d like to read more about when and where this might matter, or to speak to anyone thinking about this.