They Wanted to Leave the State. Instead, They Built Its Eyes.

by Seb Matthews | Mar 17, 2026 | AI, Military, Sovereignty

The Defence Startup Dream

The ideology had a clean pitch. Technology was outpacing government. Nation-states were slow, bureaucratic, and designed for a world that no longer existed. The people who built the internet, the smartphone, and the cloud were not obligated to work within broken systems. They could build better ones. They could exit.

That was the promise. Here is what was delivered: the most capable surveillance and military contracting infrastructure in human history, built by the same people who said they were building alternatives to the state.

The Ideology of Exit

The idea that best captures this worldview borrows a term from the economist Albert Hirschman’s 1970 book Exit, Voice, and Loyalty. Where ‘voice’ means trying to fix broken institutions by working within them, ‘exit’ means building something better and walking away. Silicon Valley’s libertarian wing enthusiastically adopted this framing, and it took several concrete forms.

Peter Thiel, the venture capitalist and PayPal co-founder, funded the Seasteading Institute, which explored building floating sovereign communities in international waters outside any nation’s legal reach. Balaji Srinivasan, a former partner at the venture capital firm Andreessen Horowitz (commonly known as ‘a16z’), published ‘The Network State’ in 2022, a detailed blueprint for communities forming online first, building shared identity and culture, then acquiring physical territory and eventually seeking diplomatic recognition. Charter city movements proposed building entirely new cities with new legal and governance frameworks, often in developing economies, as proof-of-concept for what post-state governance could look like.

The intellectual thread running through all of it was consistent: governments are the bottleneck, and technology is the solution. The people who build the tools have both the right and the capability to build the governance structures that come with them.

This was not fringe thinking. It circulated at the highest levels of the tech investor class, was discussed seriously at conferences, and attracted genuine intellectual energy from people who were not naive. It was also, in retrospect, almost perfectly wrong about the direction things would go.

What Was Actually Built

While the exit ideology was being refined in books and manifestos, something else was happening in the background. The same firms, founders, and networks were building the infrastructure of state power at a scale that governments could not have achieved on their own.

Palantir Technologies, the data analytics company founded in 2003, holds contracts with the United States Department of Defense (the ‘DoD’) worth more than $1 billion. It holds contracts with Immigration and Customs Enforcement (‘ICE’, the US federal agency responsible for immigration law enforcement). Its software has been used to track undocumented immigrants, to support military targeting operations, and for intelligence work whose details are not public. The company’s revenue is growing by over 30% year-on-year. The company is named after the palantíri, the seeing stones from J.R.R. Tolkien’s The Lord of the Rings, one of which Sauron captured and used to corrupt those who looked into the others. The founder has a doctorate in social theory. He has read the books.

The pattern extends well beyond one company. In early 2026, the Pentagon designated Anthropic, an artificial intelligence (‘AI’) research company, as a supply chain risk after Anthropic maintained ethical limits on the use of its technology for lethal autonomous weapons without safety verification. Within hours, a competitor signed a contract with the Pentagon. The speed was the message. The market for military AI does not pause for ethical objections. It accelerates through them.

Andreessen Horowitz, the same firm associated with the network state movement, runs a fund it calls ‘American Dynamism’. The fund’s focus includes military and defence technology. The thesis, which has proven extremely profitable, is that defending the nation is a market opportunity. The AI companies that decline to work with the military are, in this framing, naive: they fail to understand where the leverage in this situation actually sits.

What This Means If You Are Building in Defence Tech

The current wave of defence technology startups is being actively recruited into this story, and it is worth being clear-eyed about the narrative being used to do it.

The ‘American Dynamism’ framing, and its equivalents in the United Kingdom and across Europe, present founders as participants in something heroic. You are not just building a product. You are defending civilisation, enabling sovereign capability, and contributing to national resilience. This language is everywhere right now, in pitch decks, in fund theses, in government procurement strategies.

It is not entirely wrong. The defence and national security sector genuinely needs better technology, and the argument for deeper industry-government collaboration is a real one. There are companies doing important, legitimate work in this space, and the gap between what modern militaries need and what legacy defence primes (‘prime contractors’, the large, established companies that have historically dominated defence procurement) can deliver is wide enough to build serious businesses in it.

But the heroic framing tends to skip a set of harder questions. Who controls the technology once it is deployed at scale? What are the limits on its use, and who enforces them? What happens when a government customer asks for a capability that sits outside what the founder originally intended to build?

The company that maintained red lines on autonomous weapons was designated a supply chain risk. The one that did not got the contract. That sequence is worth sitting with before you sign.

For founders building in this space, the ideology of exit turns out to be beside the point. You are not replacing state power. You are extending it. That is not necessarily the wrong choice, but it should be a conscious one, made with clear eyes, rather than something that happens quietly in the background while you are telling a story about disruption and sovereignty.

The Philosopher Who Narrates

There is a cynical version of this story and a tragic version, and they are not mutually exclusive.

The cynical version is simple. The exit ideology was always the marketing. The military contracts were always the product. The network state manifestos and the Seasteading Institute gave a generation of founders a heroic self-image while they built the most powerful instruments of state control in history. The charter was the cover story.

The tragic version is more interesting. The people who genuinely believed in tech sovereignty may have been right that governments were slow and broken. They were wrong that the solution was to build outside them. What they built got absorbed, acquired, or replicated by the state anyway, often with their enthusiastic participation, because it turned out that scale requires customers, and the biggest customer in the world has a defence budget and is not going anywhere.

Palantir named itself after a corrupting surveillance device from a fantasy novel about the seduction of power. The founder read the books. He named it anyway. Then he built the thing the name describes, sold it to governments, and the stock went up.

He has described himself not as a threat, but as a narrator. A narrator, because what he is describing is already happening, and the description is more useful to him than a warning would be.

That might be the most honest thing said in a sector that has spent fifteen years telling itself a story about freedom. The exit was always an entrance. The seeing stones were always going to end up in the wrong hands. The philosopher read the books and built them anyway.

The question worth asking, if you are a founder in this space right now, is this: which part of the story are you in?

Written by Seb Matthews

Author, speaker, and advisor on leadership under pressure and organisational performance.

