
The perfection of digital serfdom

Saturday, Aug 16 2025

Technology, Opinion

Contents

  • Introduction
  • Modern Feudalism
  • With AI
  • Why platforms need AI
  • The myth of a permanent holiday
  • Conclusion

Introduction

One of the things that seem to get the pro-AI camp excited about the future is the idea that we will all be able to delegate our work to AI. It's a noble dream, but the future rarely ever plays out the way we plan it. In this post, I'll explain why I think most progress from generative AI is likely to benefit a very small portion of the population.

Modern Feudalism

I recently read Yanis Varoufakis' book, Technofeudalism, and it has, to a great extent, shaped my argument here. I'll leave a more detailed exploration of technofeudalism to Varoufakis, but the gist, as I understand it, is that we've moved past capitalism into a platform-centric system that more closely resembles feudalism.

The platform revolution made many promises. Chief among them was that we'd be able to create and express ourselves, share our thoughts and ideas, and make money without having to worry about the technical side of reaching an audience. They also promised that we'd be able to bypass "traditional", elitist media, and that this would cultivate a more diverse web. From pictures and blogs to video, live or otherwise, the promise was that we'd be able to do it all on infrastructure that someone else built and maintains, and we'd only need to worry about production. In exchange, they'd take a share of the revenue generated by our content. If that sounds familiar, it's because that's more or less how feudalism worked. The nobles owned the land, and the serfs worked it. The serfs had to give a share of whatever they produced to the landlord as rent.

The problem with this system is that the serfs may imagine themselves to be free. If I were a YouTuber making a living off my content on the platform (or a writer on Substack, etc.), I'd see myself as somewhat independent. And, to be fair, I would be to a certain extent. However, there's a sense in which I'd still be beholden to the platform. YouTube, for example, is notorious for its itchy trigger finger when it comes to demonetisation based on certain keywords, regardless of context. In extreme cases, creators have been booted off platforms where they had built an audience and on which they were reliant for their livelihoods. At the end of the day, the landlord owns the land, and you have the right to work on it and make a living only insofar as the landowner allows it.

With AI

So, it's bad enough that you don't own the land, and your livelihood depends on the landowner allowing you to keep working the land to provide for yourself and your family. Your position is precarious, but in the worst case, you still maintain the option to jump ship to a different platform and keep doing what you have been doing. If your followers are invested enough, they might even follow you.

Now, imagine that the same landowner comes along and offers to provide all the expertise and tools to work the land. I wouldn't blame you for thinking this sounds like a pretty sweet deal. You don't need to worry about land ownership, or the tools needed to extract value from the land. All you need to do is show up and collect. The sceptic in me says this sounds way too good to be true.

AI-generated content

Recently, I have seen posts on LinkedIn that follow the same cookie-cutter template. They are easy to spot. Some of the dead giveaways include:

  • "it's not x, it's y"
  • starting with a quote from a conversation one apparently had

I have also seen another account advertising a tool for generating "authentic LinkedIn posts".

It occurred to me that, at some point, all platforms will just have some kind of "generate" button. Users will only need to describe what they want to post, and something will be generated, supercharged by an algorithm that "knows" exactly what kind of content works best on the platform. In such a world, I can be a content creator who owns neither the content I create nor the platform on which I share it. And, because this will be generic, AI-generated slop, these platforms will become what David Bushell calls "mediocrity as a service".

Why platforms need AI

I can't say I don't understand the platforms' perspective. They probably recognise that they are overly dependent on user-generated content. The network effect helps, but it's not a position any business would want to be in, knowing that a single major scandal shifting public perception could tank the platform. It would start with an exodus of creators; if all the people giving the platform value decided to leave, there would be no Facebook or Instagram.

I believe volume is the other reason. These platforms need users to create more and more content so that they can show more ads and drive user engagement and make more money, etc. That's why platforms encourage regular posting, whether it's through gamification techniques like streaks (e.g. Snapchat), or algorithmically boosting frequent posters (e.g. YouTube). But, a single creator can only post so much before they run out of ideas, burn out, or people just get bored of their content. Additionally, a technical barrier to entry still exists for content creation. The platforms stand to gain a lot from technology that makes it easier for anyone to post regularly, even if the content itself is of inferior quality.

At this point, you may start to wonder why they don't just create a slop pipeline that automatically generates and posts content without any human input, effectively cutting out the creative middle-man. I think the answer lies in credibility. There's a very thin line between a human using AI to generate a ton of content and a machine that automates the creation process. Having a human face sanitises the entire process and, at least, gives users the impression that what they are seeing is the self-expression of someone just like them. Additionally, the former at least gives you some hope of being able to keep up with the rate of production, whereas the latter makes it feel like a hopeless endeavour.

The myth of a permanent holiday

Circling back to my original point, the company line from techno-over-optimists is that one day we'll all be sitting back on a beach somewhere, sipping martinis, and occasionally popping in to check on our "agents" whose labour funds our lifestyles. I just don't buy it.

If I were in their position, i.e. I owned the land (the platform) and the means of extracting value from the land (the ability to generate the content), why would I need to share the spoils with "creators" without whom I could still generate content? I mentioned the need to keep a "human face", but the reality is that Instagram, for example, has already seen an explosion of "AI influencers", and people still follow these accounts. We are all content addicts, and most people really do not care whether they are following an account whose content is partly or fully AI-generated.

Similarly, in a world where anyone can wield AI agents to do anything, what service can anyone sell? Everyone says "new careers will be created that we don't even know of yet", as if we can just wish new jobs into existence. What exactly will my agents be able to create that your agents cannot? Unless you're Google, Meta, or OpenAI, with billions of dollars' worth of hardware and talent at your disposal to train bespoke, advanced models, we'll all probably be on a level playing field with equally powered tools.

I guess the only thing that will be able to differentiate agents will be capital. If we're all just outsourcing everything to AI, there will be a difference between someone who can pay for access to the best models, and someone who relies on the free ones.

At the risk of presenting a false dichotomy, I see two possible outcomes, both of which are worryingly dystopian.

The first is where human labourers and creators are slowly squeezed out because we just can't keep up with the machines. It might take me up to a week to work on and publish a blog post; a Claude-powered slop merchant can produce dozens in a day. And, they can be optimised to use all the right keywords and phrases to game the system and maximise SEO. The grim reality is that human creators wouldn't stand a chance.

The other is that, once we have all been de-skilled, these platforms will turn around and start selling these tools at a steep price. Imagine how much Cursor could charge if no one knew how to code. In fact, Cursor might just as easily become a victim itself if OpenAI or Anthropic, on whose models it relies, decided to make their own AI-powered code editors. The barrier to entry will become financial: can you afford the latest and most powerful model, and can you afford the tokens required to sustain your livelihood? And you'd have no way out of this trap, because any time you spend learning skills to differentiate yourself and make less sloppy content, your rivals will be doubling down on AI and being rewarded by the algorithms for it.

But, all this will depend on the general public being trained to accept and maybe even glorify mediocrity. We are caught in a whirlpool that is slowly sucking us into the deep, dark depths of sameness, and soon the average person will be incapable of distinguishing between mediocre and great. Or, even worse, they'll be so hooked on it they won't even care.

Conclusion

I think the biggest danger that generative AI poses is that it promises to democratise while, in reality, it consolidates. Yes, we will all have the power to generate photo-realistic images and videos, and text and audio, but we'll only be able to use a handful of tools to do so. Behind all these AI startups are a handful of companies that train the AI models, and their outputs are recognisable.

These digital landlords have coughed up the required capital to build, power, and distribute these tools. Generative AI models are very expensive to both train and run. The fantasy that we'll all be able to just sit back and have AI do our work is crushed by the realisation that we're still in the loss leader pricing phase, and none of this is economically sustainable at this price point.

What's more likely is that, eventually, the prices will be jacked up, and we'll all be paying through the nose for access. And, even then, there will be nothing to stop these companies from keeping the best models to themselves, or providing them through proprietary tools.

It's digital serfdom perfected.

If you enjoyed this post, consider sharing it with a friend and subscribing to the RSS feed. You can also come and say hi on Mastodon.