rtrvr.ai just shipped AI Subroutines, a browser automation tool that records tasks once and replays them as callable LLM tools. Zero token cost on replay. Deterministic output. The whole thing runs inside your browser tab, not in a headless worker or cloud function.

Authentication is the real blocker for web automation. Sites use rotating CSRF tokens, session headers, client-side request signing, and fingerprint-bound parameters. Out-of-process scrapers have to rebuild all of that, and it breaks constantly. rtrvr's Chrome extension patches fetch and XHR in the main world before page scripts load, capturing requests as you perform a task. When the subroutine runs later, it dispatches from the page's own execution context: same origin, same cookies, same TLS session. Auth just works because the browser does what it always does.

But recording network traffic creates its own mess. A minute of browsing fires hundreds of requests: analytics beacons, feature-flag polls, telemetry junk. rtrvr ranks requests using weighted signals such as first-party origin, temporal correlation with DOM events, mutation methods, and response quality. Analytics hosts get a flat penalty. Volatile operation identifiers like GraphQL queryIds are filtered out because they break when sites redeploy. The top five requests survive and get packaged into a reusable subroutine.

Luke Edwards, a former GitHub engineer and prolific Node.js contributor, built rtrvr.ai to bridge LLM reasoning with the authenticated web. The tool ships with preinstalled subroutines for Instagram, X, and LinkedIn, with plans for a community library of AI agents.

Hacker News commenters were quick to note the catch: websites deploy CAPTCHAs and proof-of-work challenges to stop automation. Even if rtrvr sidesteps the session-management problem, sites keep building new walls.
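The main-world capture step described above can be sketched as a wrapper installed over `fetch` before any page script runs. rtrvr hasn't published its implementation, so every name here is illustrative; the demo runs against a stub fetch so no real network is touched.

```javascript
// Illustrative sketch of main-world request recording: wrap a window-like
// object's fetch so each call is logged before being forwarded. Capturing
// XHR traffic would wrap XMLHttpRequest.prototype.open/send the same way.
function patchFetch(win, log) {
  const original = win.fetch;
  win.fetch = function patchedFetch(input, init = {}) {
    log.push({
      url: typeof input === "string" ? input : input.url,
      method: (init.method || "GET").toUpperCase(),
      ts: Date.now(),
    });
    return original.call(win, input, init); // page sees normal behavior
  };
}

// Demo against a stub environment (no real network):
const log = [];
const fakeTab = { fetch: async () => ({ ok: true, status: 200 }) };
patchFetch(fakeTab, log);
fakeTab.fetch("https://example.com/api/items", { method: "post" });
console.log(log[0].method, log[0].url); // POST https://example.com/api/items
```

The wrapper must run before page scripts so that no site code can grab a reference to the unpatched `fetch` first, which is why the extension injects into the main world at document start.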
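Replaying from the page's own execution context, as described above, amounts to re-issuing the captured request with `fetch` from inside the tab. A minimal sketch, not rtrvr's actual code: the `fetchImpl` parameter exists only so the sketch can run outside a browser, and `credentials: "include"` stands in for the point that the browser attaches cookies and session state itself.

```javascript
// Sketch: replay a recorded request from the tab's execution context.
// Because the call originates in the page, the browser supplies cookies,
// session headers, and TLS state as it would for the site's own code.
function replayRequest(recorded, fetchImpl = globalThis.fetch) {
  return fetchImpl(recorded.url, {
    method: recorded.method,
    headers: recorded.headers,
    body: recorded.body,
    credentials: "include", // first-party cookies ride along
  });
}
```

This is the crux of the auth story: nothing about tokens or signing is reconstructed, because the request never leaves the authenticated origin.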
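The ranking pass above can be sketched as a weighted score per captured request. The article names only the signal categories (first-party origin, temporal correlation, mutation methods, response quality, the analytics penalty, volatile-identifier filtering); the weights, host list, and hash heuristic below are illustrative guesses, not rtrvr's values.

```javascript
// Hypothetical signal weights over captured request metadata.
const ANALYTICS_HOSTS = ["google-analytics.com", "segment.io", "mixpanel.com"];
const MUTATION_METHODS = new Set(["POST", "PUT", "PATCH", "DELETE"]);

function scoreRequest(req, domEventTs) {
  let score = 0;
  if (req.firstParty) score += 3;                        // first-party origin
  if (MUTATION_METHODS.has(req.method)) score += 2;      // mutation method
  if (Math.abs(req.ts - domEventTs) < 2000) score += 2;  // fired near a DOM event
  if (req.responseBytes > 512) score += 1;               // non-trivial response
  if (ANALYTICS_HOSTS.some(h => req.host.endsWith(h))) score -= 5; // flat penalty
  return score;
}

// Drop parameters whose names or values look deploy-bound (e.g. GraphQL
// queryIds), since they break when the site redeploys.
function stripVolatileParams(url) {
  const u = new URL(url);
  for (const [key, value] of [...u.searchParams]) {
    if (/query_?id/i.test(key) || /^[0-9a-f]{16,}$/i.test(value)) {
      u.searchParams.delete(key);
    }
  }
  return u.toString();
}

// Keep only the top-n survivors for packaging into a subroutine.
function topRequests(requests, domEventTs, n = 5) {
  return [...requests]
    .sort((a, b) => scoreRequest(b, domEventTs) - scoreRequest(a, domEventTs))
    .slice(0, n)
    .map(r => ({ ...r, url: stripVolatileParams(r.url) }));
}
```

With this shape, an analytics beacon that fires at the same moment as a real mutation still loses: its flat penalty outweighs the temporal-correlation bonus.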