The Ouroboros Industry

The ancient symbol of a serpent consuming its own tail meant renewal. Eternal return. The cycle that sustains itself. The tech industry turned this idea into a business model, and for the past 30 years Silicon Valley has boomed and busted its way through many cycles of renewal.

The disruption story always had clean edges. Move fast. Automate the old thing. The workers who couldn’t adapt got left behind, and the workers who could build the next thing became the new industry. Progress compounded, and disruption was rewarded with investment, large exits, and billions of dollars in shareholder value. The story had a satisfying arc because the people telling it were always on the right side of it. But the serpent has turned, and the people watching it happen most closely are the ones who built the mouth.

What’s strange about the current conversation on AI and jobs is who’s having it. The loudest voices on mass unemployment are almost entirely coming from inside tech: developers, designers, product managers, analysts, the people most active on the platforms where this debate plays out. They’re not wrong about what they’re experiencing. But they’re not a representative sample of the global workforce either. The warehouse worker isn’t writing threads about logistics AI. The nurse isn’t on LinkedIn debating agentic workflows. The plumber isn’t worried at all.

The people generating the most noise happen to be the ones whose specific jobs sit in the one corner of the economy where AI has a credible near-term case. And those are the people with the platforms, the vocabulary, and the time.

The Fear Factory

A pitch that AI makes developers 40% more productive is a decent efficiency story. A pitch that AI is coming for entire categories of knowledge work, that the economy as currently structured may not survive contact with this technology, makes every CTO and board in the world stop and pay attention. Fear moves budgets in ways that productivity stats don’t. The jobs narrative serves two audiences at once: it validates the anxiety of the workers feeling the pressure, and it markets the tools causing it. Both functions run simultaneously, from the same posts, at no additional cost.

The Frankenstein version of this story would have a monster. A creator who underestimated what they’d made, a moment of genuine reckoning, some horror to give the arc its shape. The actual version is less dramatic and ironically worse. Nobody underestimated anything. The founders who built these tools understood what they were building. The investors who funded them understood what they were funding. The developers who used AI assistants to get faster understood what was going to happen to the entry-level roles. Nobody lost control of the creation. They just happened to be employed by it when the bill came due. The monster didn’t escape the lab. The lab went public at a $29 billion valuation.

In early 2025, Sam Altman posted a note on X thanking software developers. Genuinely, warmly, at length, for building the internet, for writing the code, for creating the ecosystem that made everything OpenAI built possible. He meant it. That’s not the disturbing part.

The disturbing part is that he meant it, and it didn’t register as connected to anything else. Not to the compressed salaries. Not to the junior roles disappearing. Not to the developers reading his post on the same platform and trying to work out what their next move is. The gratitude and the damage exist in the same mind without ever touching. There’s no cynicism in the post. There’s no awareness that a different kind of person might read it and feel something other than warm.

That’s not a blind spot. A blind spot implies something was obscured. This is a particular kind of moral architecture, one where the abstraction layer between decisions and consequences is thick enough that the humans underneath stop being legible. Developers become “the community that built the ecosystem.” The ecosystem becomes an asset class. The asset class restructures the labor market. The labor market is a macro problem, a policy problem, someone else’s problem. At no point in the chain does the loop close.

The ouroboros doesn’t know it’s eating itself.

That’s not a metaphor. That’s a diagnosis.