I opened a website last week to check a single fact. Before I could read anything, there was a cookie consent modal, a newsletter popup, and a sticky header covering the top quarter of the page. The content was there somewhere, underneath all of it.
AI agents don't see any of that. They access the content directly — plain text, structured data, feeds. The cookie banner doesn't exist for them. The tracking script never runs. The popup closes on nobody.
This isn't new. RSS was doing exactly this in 2002. The idea that content should be separable from its interface, readable without loading the whole theater of the page — that was the original bet of the indie web. Subscribe to the feed, get the words, skip the rest. The format never really went away; it just stopped being the main event.
Matt Webb wrote recently about services needing to become headless — releasing CLIs and APIs alongside the visual interface, so agents can act without navigating GUIs built for human hands. The interface becomes a brief brand encounter; the real access happens below it. Zack Kass has been saying something similar: the messy, human-readable web needs a structured layer underneath it. TXT, XML, feeds. Things machines can actually use.
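To make the structured layer concrete, here's a minimal sketch of what an agent actually consumes when a site publishes a feed. The feed contents and URLs below are invented for illustration; the point is that the whole "page" reduces to a few elements a machine can read with the standard library, no browser required.

```python
# A minimal sketch of the structured layer: the same article,
# published as RSS 2.0 instead of a page full of popups.
# Feed contents and URLs here are invented for illustration.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://example.com/</link>
    <item>
      <title>One Fact, No Popups</title>
      <link>https://example.com/one-fact</link>
      <description>The single fact, as plain text.</description>
    </item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Return (title, link, description) for each item in the feed.

    No cookie banner, no tracking script, no modal -- just the words,
    in a structure a machine can use directly.
    """
    root = ET.fromstring(xml_text)
    return [
        (item.findtext("title"),
         item.findtext("link"),
         item.findtext("description"))
        for item in root.iter("item")
    ]

for title, link, _desc in read_feed(FEED):
    print(title, "->", link)
```

That's the entire access path: one HTTP fetch of the feed, one parse, and the content is in hand. Everything the human-facing page layers on top of this is invisible at this level.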
For years, the simple website looked like a bad business decision. No tracking, no ads, no dark patterns — you weren't extracting value from visitors, you were just publishing. The attention economy won, or seemed to.
What AI agents need turns out to be exactly what the indie web already built. Clean markup, real content, no junk between the words and whoever — or whatever — is reading them.
I'm not sure what this means for the sites that optimized the other way. Maybe nothing changes for a while. But the web that agents see is simpler than the one most people built for the last twenty years.
Turns out that was always the better bet.