A GitHub tutorial making the rounds shows how to route Claude Code through Ollama, claiming to cut your Anthropic bill by roughly 90%. The idea: keep Claude Pro for the hard problems and send the grunt work, the lints, refactors, and batch file operations, to free models like Gemma, Qwen, and DeepSeek running locally. The repo includes a copy-paste prompt that handles most of the setup across macOS, Windows, and Linux.
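
To make the setup concrete: Ollama exposes an OpenAI-compatible API on localhost, so any client that speaks that protocol can hand a lint-style task to a local model. Here's a minimal sketch of that hand-off; the model name, file path, and prompt are illustrative, not taken from the tutorial.

```python
# Sketch: send a low-stakes task to a local Ollama model instead of Claude.
# Assumes Ollama is running and a model has been pulled, e.g. `ollama pull qwen2.5-coder`.
from openai import OpenAI

# Ollama serves an OpenAI-compatible API on localhost; the api_key is ignored.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

with open("utils.py") as f:  # hypothetical file to review
    source = f.read()

response = client.chat.completions.create(
    model="qwen2.5-coder",  # illustrative model name; use whatever you've pulled
    messages=[
        {"role": "system", "content": "You are a code reviewer. Point out lint issues only."},
        {"role": "user", "content": f"List lint problems in this file:\n\n{source}"},
    ],
)

print(response.choices[0].message.content)
```

The router the tutorial relies on automates this hand-off behind Claude Code's own interface, which is where the trouble starts.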

But Hacker News commenters flagged problems. The actual routing software is @musistudio/claude-code-router, built nine months ago by a developer going by musistudio, and the Coherence Daddy tutorial doesn't credit that original work. The setup also likely violates Anthropic's TOS, which prohibits modifying or circumventing Claude Code's intended functionality to avoid payment. Intercepting requests meant for Anthropic's infrastructure crosses that line.

Then there's the practical objection. If you're just doing grep-and-replace or basic refactoring, you don't need AI. sed exists. Using a language model for tasks that standard command-line utilities already solve is a weird cost optimization. The real value would be routing complex code generation to cheaper models, but the tutorial doesn't make that distinction.
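
For contrast, here's the kind of batch edit the commenters mean, no model involved. A sed one-liner does the same job; this is the Python equivalent, with a made-up rename and file glob for illustration.

```python
# Sketch: the "grunt work" a model isn't needed for, e.g. renaming a function
# across a project. A sed one-liner does the same; this is the Python version.
import re
from pathlib import Path

OLD, NEW = "fetch_user", "get_user"  # hypothetical rename

for path in Path("src").rglob("*.py"):
    text = path.read_text()
    updated = re.sub(rf"\b{OLD}\b", NEW, text)
    if updated != text:
        path.write_text(updated)
        print(f"updated {path}")
```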

For teams considering this: routing proprietary code through multiple providers means your data policies need to cover each one. Your enterprise agreement with Anthropic probably doesn't cover whatever a local DeepSeek instance does with your codebase.