Barnes Tech


I reference enshittification a lot on this site. Apple Maps ads, Perplexity’s surveillance pivot, Google Search getting worse — it keeps coming up because it keeps happening. This week, OpenAI started testing ads in ChatGPT, and one of their own researchers, Zoe Hitzig, resigned over it, warning that the most detailed record of private human thought ever assembled is about to be monetized with advertising. We’re watching the next wave of enshittification happen in real time with AI, so it’s time to explain the concept properly.

Cory Doctorow coined the term. He’s a novelist and activist who has worked with the Electronic Frontier Foundation for over two decades. The American Dialect Society named “enshittification” its 2023 word of the year. It describes the predictable lifecycle of platform decay, and once you see the pattern, you can’t unsee it.

The Three Stages#

Doctorow defines enshittification as a three-stage process of platform decay.

Stage one: Be good to users. A platform starts by offering genuine value to attract users. In 2006, Facebook told everyone to leave MySpace because Rupert Murdoch was spying on them there: “Come to Facebook, we’ll never spy on you, we’ll just show you what you ask to see.” Users pile in, and critically, they get locked in — you love your friends, and you can’t convince all of them to move somewhere else at the same time. As long as you love your friends more than you hate the platform, you’ll stay.

Stage two: Sell out users to business customers. Once users are locked in, the platform makes things worse for them to attract advertisers and business customers. Facebook goes to advertisers and says “remember when we told users we wouldn’t spy on them? Obviously that was a lie. We spy on them from ass to appetite. Give us money and we’ll target ads with exquisite fidelity.” They go to publishers and say “put your content on our platform, we’ll cram it into eyeballs that never asked to see it. Free traffic.” The business customers pile in and get locked in too — they become dependent on the platform.

Stage three: Extract all value. Now the platform makes things worse for the business customers too. Ad targeting fidelity goes down. Ad prices go up. Ad fraud explodes. Procter & Gamble in 2017 zeroed out its $200 million surveillance advertising spend and saw zero drop in sales, because to a first approximation all those ads disappeared down the fraud hole. The platform harvests everything except the bare minimum needed to keep people from leaving. Your feed becomes a homeopathic residue of stuff you actually asked to see, filled with things people paid billions to show you, and they’re getting robbed blind. This is the equilibrium the platform is aiming for. For everyone else, it’s a giant pile of shit.

Why It Happens#

There’s a tempting but wrong explanation for this: “If you’re not paying for the product, you’re the product.” Doctorow pushes back on this hard. The people who do pay — the advertisers, the publishers, the business customers — get shafted too. Everyone gets worse treatment except the platform itself.

The real answer is policy. Four things used to discipline tech companies: competition, regulation, interoperability, and an empowered workforce. All four eroded over the last two decades.

Competition disappeared as the industry consolidated. When you have hundreds of equally sized companies, none of them can conspire to screw users because someone will defect and steal the others’ customers. When you have a handful of giants, they don’t worry about that.

Regulation got captured. Concentrated industries are much better at lobbying than fragmented ones.

Interoperability got outlawed. This is the big one that non-technical people underappreciate. Interoperability means you can make one thing work with another — any light bulb in any socket, anyone’s gas in your tank. With digital technology, you get even more opportunities because you can always write software that undoes what some other software tries to do. An ad blocker, a third-party printer cartridge, a tool to export your data. This is an enormous source of discipline because once someone installs an ad blocker, the revenue from that user falls to zero forever. No one ever uninstalled their ad blocker.

But laws like Section 1201 of the DMCA (1998) made it a felony to bypass a digital lock — five years in prison and a $500,000 fine for a first offense, even if no copyright infringement takes place. Companies started wrapping everything in a thin layer of digital lock. Your printer cartridge has a chip that does a cryptographic handshake to prove it came from HP. Ink is now $10,000 a gallon. Your audiobooks on Audible are wrapped in DRM so you’d forfeit everything if you left the platform. Your tractor, your car, your insulin pump — all locked down.
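To make the “thin layer of digital lock” concrete, here’s a toy sketch of the kind of challenge-response handshake a cartridge chip might perform. Everything here is illustrative — the key, the function names, and the scheme are my own assumptions for demonstration, not HP’s actual protocol — but the shape is the point: a shared factory secret that third-party cartridges don’t have, verified with an HMAC.

```python
import hmac
import hashlib
import os

# Illustrative only: a secret baked into "genuine" chips at the factory.
# A third-party cartridge doesn't have it, so it fails the check.
FACTORY_KEY = b"secret-baked-into-genuine-chips"

def cartridge_respond(key: bytes, challenge: bytes) -> bytes:
    """What the cartridge's chip computes when the printer challenges it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def printer_accepts(cartridge_key: bytes) -> bool:
    """Printer sends a random challenge and verifies the HMAC response."""
    challenge = os.urandom(16)
    response = cartridge_respond(cartridge_key, challenge)
    expected = hmac.new(FACTORY_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)

print(printer_accepts(FACTORY_KEY))         # genuine cartridge passes
print(printer_accepts(b"third-party-key"))  # clone without the key fails
```

The cryptography here is trivial; the legal layer is what matters. Extracting that key so a compatible cartridge could answer the challenge is exactly the “bypassing a digital lock” that Section 1201 turns into a felony, even though refilling a cartridge infringes no copyright.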

Worker power collapsed. A quarter million tech layoffs in a couple of years destroyed the leverage that tech workers once had to push back internally when their bosses wanted to do something harmful to users.

It’s Not Inevitable#

Doctorow is insistent that this didn’t have to happen and doesn’t have to continue. The platforms were genuinely good once — that wasn’t our imagination. The same people running them then are running them now. What changed was the environment, everything that used to keep them in check.

The way out is restoring those four constraints. Of particular interest right now is interoperability. Other countries adopted laws like the DMCA because the US Trade Representative made tariff-free access conditional on it, and those laws primarily benefit American tech giants. Those countries could repeal them, allowing domestic companies to build tools that jailbreak locked-down devices, export data from walled gardens, and create competitive alternatives. As Doctorow puts it: the first country that does it gets a durable advantage, and there’s going to be a race to the top.

The AI Wave#

This brings us to right now. AI companies are burning through cash at staggering rates, and the enshittification playbook is already in motion. Perplexity’s CEO announced their browser will track everything users do online to sell hyper-personalized ads. Google is stuffing ads into AI Overviews. And OpenAI, the company that positioned itself as a mission-driven AI safety lab, just started testing ads in ChatGPT.

Zoe Hitzig, who spent two years as a researcher at OpenAI working on AI models, pricing, and safety, resigned and wrote in the New York Times that OpenAI has the most detailed record of private human thought ever assembled, and is about to monetize it with advertising. People share their medical symptoms, relationship problems, financial anxieties, career fears — things they might not tell another human. The surveillance advertising model applied to that data is a different beast entirely from what we’ve seen with social media.

The pattern is clear. Stage one is already over — AI tools attracted hundreds of millions of users by being genuinely useful. We’re entering stage two now, where the platforms start selling out users to business customers. If history is any guide, stage three won’t be far behind.

As Doctorow says, the real culprits aren’t the ketamine-addled billionaires running these companies. They’re just filling a void created by policy. The policymakers who created the enshittogenic environment that guarantees people who do the worst things in the worst way will make the most money — those are the people who need to change course.

If you want to hear Doctorow explain it himself, here’s a great clip from The Daily Show, and I’d also recommend the full Organized Money podcast episode where he goes deep on all of this.

Barnes Tech Blog · Published February 12, 2026