
AI-Techno-Feudalism is transforming the digital world as Big Tech controls AI, data, and infrastructure. Ownership is fading, replaced by access and dependency. As individuals and institutions rely on these systems, power concentrates at the top. Are we becoming digital serfs in a new era of technological control?
The Day the Cloud Went Dark
Imagine waking up one morning to find that your hospital cannot access patient records, your bank cannot process transactions, your university’s research tools have gone offline, and the AI assistant your company depends on for everything from legal drafting to supply chain management has simply stopped responding. No bomb was dropped. No election was stolen. A single cloud platform experienced an outage.
This is not a dystopian fantasy. It is an increasingly plausible scenario that reveals something profound about the world we are building. In 2026, Amazon, Google, Microsoft, and Meta are collectively pouring nearly $700 billion into AI infrastructure. Data centres the size of small towns are being constructed across three continents. Entire national economies are quietly, almost imperceptibly, becoming dependent on the computational decisions of a handful of Silicon Valley boardrooms.
We tend to think of these companies as tools: useful, even indispensable, but ultimately in our service. That framing is wrong, and dangerously so. What is actually emerging is something older and more troubling than a monopoly. It resembles, in its essential structure, feudalism reimagined for the digital age, supercharged by artificial intelligence, and dressed in the language of innovation and progress.
Medieval lords controlled the land, the mills, and the roads. Today’s Digital Lords, a small cluster of technology corporations, control the computing power, the cloud platforms, the data pipelines, and increasingly, the AI models through which humanity accesses, synthesises, and produces knowledge itself. The rest of us (individuals, businesses, universities, and even governments) are becoming tenants in their digital fiefdoms, paying rent in data, attention, and subscription fees for access to infrastructure we neither own nor truly understand.
In this blog, we will examine how this infrastructure capture works, why AI represents an unprecedented form of knowledge ownership, how digital lordship operates in practice, and what, if anything, can still be done about it.
What Is Technocratic Neo-Feudalism?
Before going further, it is worth being precise about what this term actually means, because it is easy to dismiss it as hyperbole.
Technocratic neo-feudalism describes a structural shift away from the competitive capitalism of the twentieth century toward a system of rent extraction and platform dependency, in which a small number of technology companies act as the lords of digital fiefdoms. The economist Yanis Varoufakis has written compellingly about “cloud capital”, the idea that Big Tech has moved beyond selling goods or even services, and now owns the very infrastructure through which economic and social life is mediated. Those who depend on this infrastructure, which is almost everyone, become, in his term, “cloud serfs.”
The historical parallel is clarifying rather than merely rhetorical:
| Medieval Feudalism | Digital Neo-Feudalism |
| --- | --- |
| Control of land and physical infrastructure | Control of cloud, compute, and data centres |
| Serfs’ physical labour extracted as rent | Users’ data, queries, and attention extracted as rent |
| Lords set the rules of the manor | Platforms set terms of service and algorithmic governance |
| Church as ideological legitimiser | “AI safety” narratives as legitimising discourse |
| Vassals dependent on lords for protection | Startups dependent on hyperscalers for survival |
The “technocratic” dimension matters too. In classical feudalism, power was exercised openly and by named individuals. In the digital version, decisions are made by algorithms, optimisation functions, and closed proprietary systems. Accountability is diffuse. Redress is nearly impossible. The serf does not petition the lord; they submit a support ticket.
Big Tech Controls the Infrastructure — Not Just the Tools
The first and most important layer of digital lordship is physical infrastructure, which we rarely pause to think about.
Running a frontier AI model, the kind that can write legal briefs, synthesise scientific literature, or generate working code, requires an extraordinary concentration of computing power. We are talking about data centres occupying hundreds of acres, consuming as much electricity as mid-sized cities, cooled by water systems that strain local resources, and dependent on a supply chain for advanced semiconductors that runs through a handful of fabrication plants, most of them in Taiwan.
Only a few organisations on Earth can afford to build and operate this infrastructure. Amazon Web Services, Microsoft Azure, and Google Cloud, the three dominant hyperscalers, now function less like technology companies and more like digital utilities. Except, crucially, they are private utilities with no meaningful public accountability.
Consider what this means in practice:
- Governments across Europe, Asia, and the Global South are running critical public services on infrastructure they do not own and cannot fully audit.
- Universities and research institutions depend on cloud credits and API access granted at the discretion of private companies.
- Startups and small businesses build their entire operations on platforms where pricing, terms, and access conditions can change unilaterally overnight.
The lock-in is not accidental. Egress fees, the charges companies impose for moving data out of their cloud, make switching prohibitively expensive. Proprietary APIs and tooling create technical dependencies that compound over time. And as AI becomes central to business operations, the switching cost rises further still, because the model you have fine-tuned on your data, integrated into your workflows, and trained your staff to use is hosted on someone else’s servers, governed by someone else’s terms.
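To make the egress-fee dynamic concrete, here is a minimal back-of-the-envelope sketch. The data volume and per-gigabyte price below are illustrative assumptions, not quoted rates from any specific provider:

```python
# Hypothetical switching-cost estimate for leaving a cloud provider.
# Both figures are assumptions chosen only to illustrate the mechanism.

DATA_TB = 500          # assumed data accumulated on the platform, in terabytes
EGRESS_PER_GB = 0.09   # assumed egress fee, USD per gigabyte moved out

egress_cost = DATA_TB * 1_000 * EGRESS_PER_GB
print(f"One-time egress cost to move {DATA_TB} TB: ${egress_cost:,.0f}")
```

Under these assumptions, simply moving the data costs $45,000, before a single proprietary API integration has been re-engineered. That is the toll at the castle gate.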
Data centres are not neutral infrastructure. They are the new castles, and the lords who own them are not renting out rooms. They are renting out the right to exist in the digital economy.
AI as Ownership Over Knowledge
This is the heart of the argument, and it deserves careful attention.
For most of human history, knowledge has been produced in distributed, largely open ways through universities, libraries, journals, conversations, and the slow accumulation of cultural wisdom. Even in the commercial era, the tools of knowledge production, such as books, calculators, and research databases, were owned by their users. You bought the encyclopaedia. You kept it on your shelf. No one charged you per lookup.
What AI has enabled is something qualitatively different: the enclosure of the knowledge commons.
Here is how it works. The major AI companies trained their frontier models on vast amounts of publicly available textbooks, articles, forum posts, scientific papers, code repositories, and much more. This material was produced by humanity collectively over centuries. It was, in the most meaningful sense, a commons. The models trained on this data are not. They are proprietary assets, their weights locked away, their inner workings opaque, their outputs monetised at scale.
When you query a large language model, you are not using a tool you own. You are sharecropping on someone else’s knowledge field. You bring the question; the lord keeps the harvest. And with every query you submit, every correction you make, every piece of your own work that passes through the system, you are contributing to the improvement of a model whose value accrues entirely to its corporate owners.
The economic structure this creates is one of inference as rent:
- You pay per token, per query, and per API call.
- The model improves with your usage, but you do not share in that value.
- The knowledge synthesised by the model, drawn from humanity’s collective output, is returned to you as a metered service.
- Dependency deepens over time as skills atrophy and workflows become integrated.
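The metered structure above can be sketched with simple arithmetic. Every figure here is hypothetical, chosen only to show how per-token charges compound into an ongoing rent:

```python
# Back-of-the-envelope sketch of "inference as rent" under assumed prices.
# None of these numbers come from a real provider's price list.

PRICE_PER_1K_TOKENS = 0.01   # assumed blended price, USD per 1,000 tokens
TOKENS_PER_QUERY = 1_500     # assumed prompt + response size per query
QUERIES_PER_DAY = 200        # assumed daily usage for one small team

daily_rent = QUERIES_PER_DAY * TOKENS_PER_QUERY / 1_000 * PRICE_PER_1K_TOKENS
yearly_rent = daily_rent * 365
print(f"Daily rent:  ${daily_rent:,.2f}")
print(f"Yearly rent: ${yearly_rent:,.2f}")
```

The individual numbers are small; the structure is what matters. The rent never ends, it scales with dependency, and none of the value the usage creates flows back to the tenant.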
The consequences of this extend well beyond economics. When the synthesis and mediation of knowledge are controlled by a small number of private systems, the risks include epistemic homogenisation and a narrowing of the range of ideas, framings, and conclusions that feel natural or accessible. They include the possibility of manipulated outputs aligned with corporate or political interests. And they include a profound deskilling of human cognitive capacity as reliance on AI intermediaries grows.
In drug discovery, climate modelling, legal research, and educational assessment, AI systems are already becoming the primary interface between human questions and human knowledge. If those systems are privately owned, commercially motivated, and algorithmically opaque, we have not just built a new tool. We have handed over the keys to the library, and the librarian works for the lord.
How Digital Lordship Actually Works
The mechanisms of power in this system are worth naming explicitly, because they are often hidden behind the language of service and convenience.
Algorithmic governance is perhaps the most pervasive. The rules of the digital fiefdom are not written in legislation; they are encoded in recommendation systems, content policies, ranking algorithms, and terms of service that shift without notice or democratic input. Platforms decide what information is surfaced, what content is monetised, and what behaviour is permitted, with no meaningful accountability to the people affected.
Data rents and attention rents extend the extraction model beyond simple subscription fees. Every search, every click, every query trains systems whose value flows upward. Users are not customers in any classical sense; they are the raw material of a production process they did not consent to join.
The emerging class structure of the digital economy mirrors feudal hierarchies with uncomfortable precision:
- Lords: The executives, founders, and major shareholders of hyperscaler and frontier AI companies.
- Vassals: Dependent startups, SaaS companies, and enterprises that build on top of Big Tech infrastructure and remain commercially vulnerable to platform decisions.
- Artisans: Open-source developers, independent researchers, and technical communities who contribute labour that often ends up absorbed into proprietary systems.
- Serfs: Everyday users, gig workers, and communities whose data, attention, and economic activity fuel the system while they receive access, not ownership, in return.
The political dimension is equally important. Digital Lords are not passive economic actors. They actively shape the regulatory environment in which they operate, funding think tanks, deploying lobbyists, and, most cleverly, positioning their preferred regulatory frameworks as neutral safety measures. The current “AI safety” discourse, while addressing real concerns, has also conveniently coalesced around approaches that favour large, well-resourced incumbents and erect barriers to open-source alternatives.
Real-World Snapshots
The abstraction becomes vivid when you look at specific cases.
Microsoft and OpenAI represent perhaps the clearest example of digital vassalage in action. OpenAI, nominally a non-profit dedicated to the benefit of humanity, is now operationally dependent on Microsoft’s Azure infrastructure for training and deployment. In exchange, Microsoft gained exclusive access to GPT models, integrating them across Office, Bing, GitHub, and enterprise products, creating a sprawling ecosystem of AI dependency anchored in a single cloud platform.
Google’s ecosystem demonstrates vertical integration at a civilisational scale. Google controls the dominant search engine, the dominant mobile operating system, one of the three major cloud platforms, and, through DeepMind and Google AI, some of the world’s most capable AI research. A user navigating this ecosystem from an Android phone to Gmail to Google Docs to Gemini is, at every step, generating value for the same lord.
Amazon’s AWS underpins a remarkable share of the global internet, including many platforms nominally competing with Amazon’s retail business. Its expansion into AI through Bedrock and its investment in Anthropic further extend its infrastructure reach into the frontier model layer.
To be fair to the critics of this framing, some argue that what we are describing is simply monopoly capitalism in a new sector, and that the feudal analogy obscures more than it reveals. There is something to this. Unlike medieval serfs, users can, in theory, switch platforms, use open-source alternatives, or advocate for regulation. The relationship is not one of legal compulsion.
But the practical barriers to exit are enormous, and growing. The feudal analogy is useful precisely because it captures the structural dependency and rent-extraction dynamics that classical monopoly theory, focused on pricing power in discrete markets, does not fully address.
Consequences and What Comes Next
The stakes of allowing this system to consolidate are considerable.
Inequality deepens as the returns to AI infrastructure ownership accrue to a vanishingly small group of shareholders while the productivity gains are captured by platforms rather than workers. Democratic accountability weakens as more decisions about information, credit, employment, and opportunity are made by algorithmic systems owned by private entities. Epistemic risk grows as knowledge mediation centralises. And geopolitical vulnerability increases as nations discover that their digital infrastructure is controlled by foreign corporations subject to foreign laws.
Yet counter-movements exist, and they matter.
Open-source AI communities and projects, such as Meta’s LLaMA releases, Mistral, and EleutherAI, represent a genuine alternative current. Open weights are not a complete solution: you still need compute to run large models. But they break the lock on the model layer, creating the conditions for genuine competition and scrutiny.
Regulatory initiatives, particularly from the European Union, are beginning to address platform dependency, algorithmic accountability, and data rights in ways that, if sustained and strengthened, could meaningfully constrain digital lordship. The AI Act, the Data Act, and the Digital Markets Act together represent the most serious legislative attempt to date to impose democratic accountability on this system.
Perhaps most importantly, there are growing calls for public compute infrastructure, national or multilateral data centres and AI resources that would give universities, public institutions, and smaller nations access to frontier-level AI without depending on private hyperscalers. This is the digital equivalent of public utilities, and it is an idea whose time is arriving.
The moats, however, are vast. Compute, data, talent, and capital have accumulated in the same handful of organisations to a degree that will not be easily or quickly undone.
The Digital Commons or the Digital Fief?
The central question of the AI era is not whether artificial intelligence is powerful. It clearly is. The question is: who owns it, who controls it, and in whose interest does it operate?
If frontier AI infrastructure remains in private hands, and if the data centres, model weights, inference pipelines, and knowledge synthesis systems of the future are treated as proprietary assets rather than shared resources, then we are building a world of rented cognition. A world in which thinking itself, or at least the tools most people use to think, is a service provided by Digital Lords in exchange for data, fees, and dependency.
The Digital Lords will not voluntarily relinquish this position. No lord in history ever has. But awareness is where resistance begins. Demanding public alternatives to private AI infrastructure, supporting open-source efforts, pressing for genuine algorithmic accountability, and refusing the passive role of the serf are all starting points.
The digital commons is still worth fighting for. The question is whether we will fight for it before the enclosure is complete.
Are we already serfs, or is there still time to reclaim the commons? The answer may depend on what we demand in the next five years.