OpenAI claims "well more than $13 billion" in annual revenue. Edward Zitron says that's nonsense. In a sprawling report on his Where's Your Ed At newsletter, he alleges OpenAI inflates its revenue with what he calls "stylized math": multiplying a single four-week revenue window by 13 to produce an annual figure. The real number, per CNBC's own sourcing, is closer to $10 billion before Microsoft takes its 20% cut. Zitron's Azure sources put OpenAI's first-half 2025 revenue at roughly $2.27 billion, well below the $3.44 billion The Information reported after accounting for Microsoft's share.
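To see why annualizing a single window flatters a fast-growing business, here's a quick sketch. All figures are hypothetical, purely to illustrate the arithmetic, not OpenAI's actual numbers:

```python
# Illustrative sketch of the "x13" annualization Zitron criticizes.
# Every dollar figure below is made up for illustration.

def annualize_four_week_window(window_revenue: float) -> float:
    """Extrapolate one four-week revenue window to a year (13 x 4 weeks = 52 weeks)."""
    return window_revenue * 13

# A fast-growing business has its *latest* window well above its average window,
# so "latest window x 13" overstates what the trailing twelve months actually earned.
windows = [0.5 + 0.04 * i for i in range(13)]       # 13 four-week windows, in $B, growing
trailing_actual = sum(windows)                       # what the year really brought in
headline = annualize_four_week_window(windows[-1])   # latest window x 13

print(f"actual trailing revenue: ${trailing_actual:.2f}B")
print(f"headline annualized figure: ${headline:.2f}B")
```

With these toy numbers the headline figure comes out roughly 30% above the trailing total; the faster the growth, the bigger the gap.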

The spending promises are wilder. OpenAI told investors its compute target sits around $600 billion by 2030. For scale, Microsoft's quarterly operating expenses total about $42 billion. And even if OpenAI could find the money, the chips don't exist: the global semiconductor industry generated roughly $527 billion in total revenue in 2023. TSMC's CoWoS packaging capacity, essential for NVIDIA's H100 and Blackwell chips, remains strained, with lead times stretching months. ASML's EUV lithography machines cost over $200 million apiece and take up to two years to build. You can't manufacture your way to $600 billion of compute when the supply chain physically can't deliver.
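A back-of-the-envelope check using the figures cited above makes the scale problem concrete:

```python
# Sanity check on the $600B compute target, using the numbers quoted in the text.
openai_compute_target_b = 600   # OpenAI's reported compute target by 2030, in $B
msft_quarterly_opex_b = 42      # Microsoft's approximate quarterly operating expenses, in $B
semis_revenue_2023_b = 527      # global semiconductor industry revenue in 2023, in $B

quarters_of_msft_opex = openai_compute_target_b / msft_quarterly_opex_b
share_of_chip_industry = openai_compute_target_b / semis_revenue_2023_b

print(f"~{quarters_of_msft_opex:.1f} quarters of Microsoft's entire operating expenses")
print(f"~{share_of_chip_industry:.0%} of the chip industry's whole 2023 revenue")
```

The target works out to more than three and a half years of Microsoft's total opex, and exceeds what the entire semiconductor industry took in during 2023.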

Then there's the media problem. Zitron draws an uncomfortable parallel to FTX coverage, arguing that outlets like CNBC laundered Sam Bankman-Fried's reputation before his collapse and are now doing the same for Altman. Reporters promoted Stargate Abilene as "online" when only a fraction of its capacity existed; Oracle has reportedly built just two of the eight planned buildings meant to be operational by year's end. The $300 billion Oracle deal and the $138 billion AWS agreement exist mostly on paper.

The math is hard to ignore. OpenAI likely loses between $4 billion and $10 billion annually to generate its current revenue. Partners like SoftBank, CoreWeave, and Oracle have real exposure if OpenAI can't meet its commitments. And CFO Sarah Friar reportedly opposed rushing toward an IPO. When your own CFO doesn't want to go public, that's worth paying attention to.