Cloudflare is rolling out support for shared compression dictionaries, a technology that can cut bandwidth for incremental web updates by up to 99.5%. The idea is straightforward: instead of re-downloading a full JavaScript bundle every time a developer ships a one-line fix, the browser tells the server what it already has cached. The server then sends only the diff. That 500KB bundle becomes a few kilobytes on the wire.
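The mechanics are easy to demonstrate with zlib's preset-dictionary support, which works on the same principle as the Brotli and Zstandard dictionary modes the browser feature actually negotiates. A minimal sketch with synthetic bundle contents (all names and sizes here are illustrative, not Cloudflare's implementation):

```python
import random
import zlib

# Synthetic stand-ins: v1 is the "bundle" the client already has cached,
# v2 is the next deploy containing one small edit.
random.seed(42)
v1 = bytes(random.randrange(256) for _ in range(20_000))
v2 = v1[:10_000] + b"/* one-line fix */" + v1[10_000:]

# Baseline: compress v2 from scratch.
full = zlib.compress(v2, 9)

# Delta: compress v2 against v1 as a shared dictionary, so matching
# regions become back-references into data the client already holds.
comp = zlib.compressobj(9, zdict=v1)
delta = comp.compress(v2) + comp.flush()

# The client reconstructs v2 from its cached v1 plus the small delta.
decomp = zlib.decompressobj(zdict=v1)
restored = decomp.decompress(delta) + decomp.flush()

print(f"full: {len(full)} bytes, delta: {len(delta)} bytes")
```

The delta is a small fraction of the full compressed transfer because nearly all of v2 can be expressed as references into v1, which is exactly the effect that turns a full re-download into a few kilobytes on the wire.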
The Phase 1 beta opens April 30, 2026.
This matters because automated traffic is overwhelming the web. AI agents accounted for nearly 10% of requests across Cloudflare's network in March 2026, up about 60% year over year; Imbue's 100-agent testing swarm is one example of how that proliferation looks in practice. AI-assisted development also means teams ship faster. More deploys, more requests, heavier pages. As "Why AI Won't Kill Your CMS" argues, the speed gains come with build-maintenance complexity that shouldn't be ignored.
Google tried something similar in 2008 with SDCH but killed it in 2017. The problems were real. Researchers demonstrated compression side-channel attacks like BREACH, where attackers extract secrets by measuring compressed response sizes. SDCH also violated the Same-Origin Policy and couldn't work with CORS. The modern standard, RFC 9842, enforces same-origin dictionary use and closes those design gaps. Chrome and Edge have shipped support. Firefox is working on it.
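The negotiation in RFC 9842 is header-driven. A simplified sketch of the flow (the header names come from the standard; the URLs and hash value are illustrative):

```http
# 1. Server marks a response as a future dictionary for matching URLs
HTTP/1.1 200 OK
Use-As-Dictionary: match="/js/app-*.js"

# 2. Browser requests the next version, advertising the dictionary it
#    holds (a SHA-256 hash) and the dictionary-aware encodings it accepts
GET /js/app-v2.js HTTP/1.1
Accept-Encoding: gzip, br, zstd, dcb, dcz
Available-Dictionary: :pZGm1Av0IEBKARczz7exkNYsZb8LzaMrV7J32a2fFG4=:

# 3. Server responds with a dictionary-compressed delta
HTTP/1.1 200 OK
Content-Encoding: dcb
```

Here `dcb` is dictionary-compressed Brotli and `dcz` is dictionary-compressed Zstandard. Because the hash pins the exact dictionary and the standard enforces same-origin use, the server can never apply a dictionary the client doesn't verifiably hold.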
But the security concerns haven't vanished.
BREACH works regardless of TLS version and can extract secrets in under a minute with a few thousand requests. Extreme compression ratios of 97 to 99.5% make response sizes an even more precise oracle for attackers, turning every compressed response into a potential data leak. Cloudflare notes that existing mitigations remain necessary: Heal-the-BREACH, which randomizes response sizes to blur the signal, or keeping secrets out of responses that reflect user input. The technology is a genuine bandwidth win. But anyone adopting it needs to take the attack surface seriously.
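The size oracle at the heart of BREACH fits in a few lines. In this illustrative sketch (the page template and secret are invented), a response reflects attacker-controlled input next to a secret; when the input matches the secret, DEFLATE collapses the repetition into a back-reference and the compressed response gets measurably shorter:

```python
import zlib

# Hypothetical secret embedded in every response.
SECRET = "csrftoken=9f8a2b4271"

def compressed_size(reflected: str) -> int:
    # The page echoes attacker input alongside the secret, as in a
    # search-results or error page -- the precondition for BREACH.
    body = f"<p>you searched for: {reflected}</p><form>{SECRET}</form>"
    return len(zlib.compress(body.encode(), 9))

hit = compressed_size("csrftoken=9f8a2b4271")   # guess equals the secret
miss = compressed_size("xz=qw3rtyu1opasdfghj")  # unrelated guess

print(hit, miss)  # the matching guess compresses to fewer bytes
```

Real attacks refine this: they extend a known prefix one character at a time and repeat each measurement, which is why size-randomizing defenses like Heal-the-BREACH, or keeping secrets out of compressed responses that echo user input, matter.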