There is a word in Turkish - “feodal” - borrowed from the European concept of feudalism: the system where land was power, lords owned the means of survival, and peasants worked the fields in exchange for protection they never asked for. That system supposedly ended centuries ago. It didn’t. It changed costumes.
The AI economy is not creating a new economic order. It is accelerating the oldest one. A handful of entities control the means of production - not land, not factories, but compute, data, and models. The peasants have new titles now. They are called “users.”
The Cost of Entry Is the Moat
Training a frontier AI model costs hundreds of millions of dollars. The infrastructure to run it costs billions. The data to feed it was scraped from the collective output of humanity - your writing, your photos, your conversations - and concentrated into private vaults. This is not a market with barriers to entry. This is a market where entry is structurally impossible for all but a few.
Consider what this means. In classical capitalism, at least in theory, anyone with a good idea and some capital could compete. The garage startup myth had some truth to it in the era of software. Two people with laptops could build something that challenged an incumbent. That era is functionally over for foundational AI. You cannot out-clever a company that has spent ten billion dollars on GPU clusters. Cleverness is necessary but insufficient when the game requires resources only five or six organizations on Earth possess.
This is feudal capitalism in its purest form. Not wealth through bloodlines, but wealth through computational monopoly. The lords don’t need swords when they have server farms.
The Labor Question Nobody Wants to Answer
Every economic transition in history displaced workers. The printing press displaced scribes. The loom displaced weavers. The assembly line displaced craftsmen. Each time, the standard response was: “New jobs will be created.” And each time, that was partially true - eventually, after decades of suffering that the optimists conveniently forgot.
AI is different in degree, if not in kind. Previous technologies automated physical tasks or narrow cognitive tasks. AI automates broad cognitive work - writing, analysis, coding, design, customer service, legal review, medical diagnosis. The question is not whether these jobs will be affected. They already are. The question is what replaces them, and who captures the value of the replacement.
Here is where feudal capitalism reveals itself most clearly. When an AI system replaces the work of a hundred customer service agents, the company saves millions. Where does that money go? Not to the displaced workers. Not to the society that educated those workers. Not to the collective whose data trained the model. It goes to shareholders. It goes to the already-wealthy. It concentrates further.
The defenders of this system will say: “But the company took the risk. The company invested.” This is the logic of feudalism with extra steps. The lord took the “risk” of owning the land. The factory owner took the “risk” of owning the machines. Risk-taking matters, but the current arrangement absurdly over-rewards capital and under-rewards the labor and collective knowledge that make the capital productive in the first place.
The Open Source Question
There is one structural force that could resist this concentration: open source AI. If the foundational models are open, if anyone can fine-tune, deploy, and build on them, then the feudal moat shrinks. Power distributes. Innovation happens at the edges, not just in the castles.
This is why the behavior of major AI companies regarding openness matters enormously. Every move toward closing models, restricting weights, or gating access behind API paywalls is a move toward feudalism. Every release of open weights, every permissive license, every shared dataset is a move toward something closer to distributed power.
The trend is mixed. Some companies release open models. Others keep their most capable systems locked behind walls and subscription fees. The rhetoric around “safety” is sometimes genuine concern and sometimes convenient justification for maintaining control. Distinguishing between the two requires watching what companies do with the power their closed systems give them, not what they say about responsibility.
If AI remains predominantly closed and proprietary, the feudal outcome is likely. A small number of corporations will mediate access to the most powerful technology ever created, extracting rent from every interaction. If you want to use intelligence itself, you will pay a subscription to a lord.
The Governance Vacuum
Governments are not equipped for this. Democratic institutions move at the speed of committee meetings and election cycles. AI development moves at the speed of gradient descent and venture capital. By the time legislation catches up to a capability, three new capabilities have emerged.
More fundamentally, the regulatory apparatus is susceptible to capture. The companies being regulated are the same ones funding campaigns, hiring former regulators, and providing the technical expertise that lawmakers lack. This is not conspiracy. This is the documented, observable pattern of every industry that grew faster than its regulators. Tobacco, finance, oil, tech - the playbook is always the same. Fund the research, hire the regulators, write the rules yourself, and call it “partnership.”
The result is governance that protects incumbents while appearing to protect the public. Regulations that require expensive compliance - which only large companies can afford. Safety standards that entrench existing players by making it harder for newcomers to compete. The cycle repeats because the humans running it have the same incentives the system supposedly constrains.
The Merge That Determines Everything
There is a deeper transformation happening beneath the economic surface. Humans are beginning to merge with AI. Not in the science fiction sense - not yet - but in the practical sense that cognitive work increasingly happens in collaboration with AI systems. The person who writes with AI assistance produces more than the person who doesn’t. The developer who codes with AI ships faster. The analyst who reasons with AI sees patterns that unaided cognition misses.
This merge is accelerating. It will likely move from external tools to always-on cognitive extensions to, eventually, direct neural integration. The economic implications of this are staggering and mostly unexamined.
If the merge happens through proprietary systems controlled by a few companies, then feudal capitalism doesn’t just win the economy - it wins cognition itself. Your enhanced thinking would be mediated by a corporate product. Your competitive advantage in the labor market would depend on which subscription you can afford. Intelligence itself becomes a stratified commodity.
If the merge happens through open systems, distributed and accessible, something different becomes possible. Not utopia - that word is for people who haven’t read history. But something less captured. Something where the enhancement of human capability doesn’t automatically concentrate power further.
What History Suggests
History suggests the feudal outcome is more likely. Every previous technological revolution was eventually captured by existing power structures. The internet was supposed to democratize information - it created the largest information monopolies in history. Social media was supposed to give everyone a voice - it created the most sophisticated systems of attention manipulation ever built. Cryptocurrency was supposed to decentralize finance - it recreated Wall Street’s speculative dynamics with less regulation.
The pattern holds because human nature holds. Those with power, resources, and low-friction access to new systems will adapt faster than those without. The lords don’t fight the revolution. They fund it, shape it, and emerge on the other side wearing new clothes.
Breaking this pattern would require something that has never happened in human history: a technological transition where power actually distributed rather than concentrated. The open source movement in AI is the closest thing to a structural force that could make this happen. Whether it will is uncertain.
No Conclusion, Just Observation
The AI economy is not a new chapter. It is the latest page in a very old story. Resources concentrate. Power follows resources. Those without power labor for those with it. The technology changes. The arrangement doesn’t.
The question worth sitting with is not “will AI create jobs or destroy them” - that framing is a distraction. The question is: who will own the intelligence infrastructure that mediates all economic activity, and what does that ownership mean for everyone else?
The feudal lords of the 12th century would recognize today’s arrangement immediately. They would just be confused by the jargon.