Sentry co-founder David Cramer wants to kill LLMs.txt before it takes hold. His alternative is HTTP content negotiation, a mechanism codified in the HTTP/1.1 specification (RFC 2616) back in 1999.

The pitch is straightforward. When a server sees `Accept: text/markdown` on an incoming request, it can treat the caller as an AI agent and respond with markdown instead of HTML — no navigation chrome, no JavaScript, no visual scaffolding. One conditional branch. No new file formats. No dedicated endpoints.
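That conditional branch can be sketched with Python's standard library. This is a hypothetical handler to illustrate the mechanism, not Sentry's actual implementation:

```python
# Minimal sketch of the Accept-header branch (illustrative, not Sentry's code).
from http.server import BaseHTTPRequestHandler, HTTPServer

HTML_BODY = b"<html><body><nav>...</nav><h1>Docs</h1></body></html>"
MARKDOWN_BODY = b"# Docs\n\nPlain markdown, no navigation chrome.\n"

def negotiate(accept_header: str) -> tuple[str, bytes]:
    """Pick a content type and body based on the incoming Accept header."""
    if "text/markdown" in (accept_header or ""):
        return "text/markdown; charset=utf-8", MARKDOWN_BODY
    return "text/html; charset=utf-8", HTML_BODY

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ctype, body = negotiate(self.headers.get("Accept", ""))
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
```

In a real deployment the markdown branch would render from the same live content as the HTML branch, which is the point of the pattern.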

Cramer has deployed the pattern across three Sentry properties. On docs.sentry.io, markdown responses strip browser-only elements, and index pages become structured sitemaps designed for agent traversal rather than human reading. That last detail is deliberate: frontier models tend to read only the opening of long files to avoid bloating their context windows, so Sentry front-loads the links agents actually need.

The main sentry.io site takes a harder line. Headless bots hitting the homepage get no HTML at all — just a markdown document pointing to three machine-friendly entry points: the Sentry MCP Server (OAuth-authenticated HTTP streaming), the Sentry CLI, and the REST API. The framing is blunt: don't give agents a login wall, give them a working entry point.

The third case is Warden, Sentry's AI-powered code review agent. A single curl request with the markdown accept header returns a self-bootstrapping document containing everything an LLM needs to configure and run Warden on its own. The agent is built on the Agent Skills standard — an open spec using portable SKILL.md instruction files, originally developed at Anthropic — and runs locally before commits or automatically on pull requests via GitHub Actions, posting inline findings with suggested fixes.
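From the agent's side, the request is nothing special. The sketch below builds the same kind of request in Python; the URL is illustrative, since the article doesn't give the exact endpoint:

```python
# Constructing the agent-style request (URL is hypothetical, not from the article).
import urllib.request

req = urllib.request.Request(
    "https://sentry.io/",  # placeholder endpoint for illustration
    headers={"Accept": "text/markdown"},
)
# urllib.request.urlopen(req).read() would fetch the markdown body.
```

The curl equivalent is just `curl -H "Accept: text/markdown" <url>` — one header, no SDK, no auth dance to read the bootstrap document.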

The argument for content negotiation over LLMs.txt comes down to maintenance. A static file describing your site goes stale; a header branch generates the appropriate response on demand from live content. It also requires no new tooling: the Accept header has been part of HTTP for decades, and `text/markdown` is a registered media type (RFC 7763). Cramer's contribution is less invention than demonstration: the pattern works at production scale, and the header is a workable signal for distinguishing agent traffic from browser traffic.
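One caveat for anyone reproducing this: browsers send composite Accept lists with quality values (e.g. `text/html,application/xml;q=0.9,*/*;q=0.8`), so a naive substring check can misfire. A rough sketch of a q-value-aware check — my own parsing, not code from Sentry — might look like:

```python
def prefers_markdown(accept: str) -> bool:
    """True if the Accept header ranks text/markdown at or above text/html.

    Hypothetical helper: parses media types and their q-values, then compares
    the weight of text/markdown against text/html (or the */* wildcard).
    """
    weights = {}
    for part in accept.split(","):
        fields = part.strip().split(";")
        mtype = fields[0].strip()
        q = 1.0
        for param in fields[1:]:
            name, _, value = param.strip().partition("=")
            if name.strip() == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        weights[mtype] = q
    md = weights.get("text/markdown", 0.0)
    html = max(weights.get("text/html", 0.0), weights.get("*/*", 0.0))
    return md > 0 and md >= html
```

An agent sending `Accept: text/markdown` passes; a stock browser header, which never mentions markdown, does not.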