Your $20/month ChatGPT subscription loses money for OpenAI. A lot of money. Shaun Warman estimates frontier AI companies eat a 4-7x loss per heavy user, with real compute costs running $80-150 monthly. Recent changes to the OpenAI-Microsoft deal compound the pressure: the company can no longer count on a steady revenue-sharing stream to subsidize heavy users, pushing it toward enterprise business models. The economics only make sense once you understand what's actually being sold. You're the training data. The $20 subscription is your entry fee.
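The subsidy math is worth checking. A minimal back-of-envelope sketch, using the article's figures ($20/month revenue, $80-150/month compute cost) as assumptions:

```python
SUBSCRIPTION = 20  # monthly revenue per heavy user, USD (article's figure)

def loss_multiple(compute_cost: float, revenue: float = SUBSCRIPTION) -> float:
    """Ratio of what a heavy user costs to what they pay."""
    return compute_cost / revenue

# Article's estimated compute-cost range for a heavy user, USD/month.
low = loss_multiple(80)    # 4.0x
high = loss_multiple(150)  # 7.5x

print(f"Implied loss multiple: {low:.1f}x to {high:.1f}x")
```

The implied 4-7.5x range lines up with Warman's 4-7x estimate, so the cost figures and the loss multiple are internally consistent.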
Every correction you make feeds the RLHF pipeline. Every edit, every follow-up. That human feedback separates great models from good ones in 2026, Warman argues in 'The Apprenticeship.' Base capabilities have become roughly fungible across providers. The moat is the feedback loop, and labs are burning billions in venture capital to keep it flowing.
The arrangement has an expiration date. Warman calls it the 'apprenticeship window' and pegs it at 3-5 years. It's closing because synthetic data is hitting quality parity with human input. Because agentic self-play lets models verify their own outputs. And because throwing more human feedback at these models yields diminishing returns.
When the window closes, the $20 tier vanishes. Prices jump to $80-150/month. The strongest capabilities get gated behind enterprise contracts with five-figure annual minimums. Labs might even stop selling tools and become direct operators, running AI-powered law firms, consulting shops, hedge funds. This evolution aligns with OpenAI's decision to abandon its cloud exclusivity deal and expand its infrastructure options.