Juan Pablo AJ figured out how to make Claude, Codex, and Gemini collaborate without paying a cent in API fees. The trick is simple: instead of integrating through APIs, one agent invokes another through CLI commands that preserve conversation context. Commands like "codex exec resume --last" and "gemini -r latest -p" let each agent pick up where it left off, creating a loop in which one agent produces work, another reviews it, and they iterate until the output stabilizes. AJ stores the invocation conventions in Claude memory files so every agent can read and apply the same rules consistently.
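The produce/review loop is easy to script once the resume commands exist. A minimal bash sketch follows; the resume flags come from the article, but the loop structure, the prompts, and the "LGTM" stop convention are illustrative assumptions, not AJ's exact setup (it also assumes "codex exec resume --last" accepts a trailing prompt and that "claude --continue -p" resumes Claude's last session):

```shell
# Produce/review loop sketch. Resume flags are from the article; the
# loop structure, prompts, and "LGTM" convention are assumptions.

produce() {
  claude -p "$1"              # Claude drafts the initial change
}

revise() {
  claude --continue -p "$1"   # resume Claude's last session with feedback
}

review() {
  # Codex resumes its own last session, so review context accumulates.
  codex exec resume --last "Review the latest changes. Reply LGTM if no issues remain."
}

review_loop() {
  local task="$1" max_rounds="${2:-3}" verdict i
  produce "$task"
  for ((i = 1; i <= max_rounds; i++)); do
    verdict="$(review)"
    # Stop once the reviewer has no further objections.
    if [[ "$verdict" == *LGTM* ]]; then
      echo "converged after $i round(s)"
      return 0
    fi
    # Feed the review back into the producer's resumed session.
    revise "Address this review feedback: $verdict"
  done
  echo "no convergence after $max_rounds rounds"
  return 1
}
```

The cap on rounds matters: without it, two agents that never fully agree would loop indefinitely.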

Two patterns emerge from his approach. The first is a bare-bones non-interactive method built on the resume commands, which trades visibility for simplicity. The second runs each agent in its own tmux pane so you can watch what each one is doing in real time; it is better for debugging and parallel execution but needs more setup. Both let you pull perspectives from different model families rather than trusting a single vendor's subagents.
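The tmux variant amounts to a small setup script: one detached session, one pane per agent, commands typed in via send-keys. The session name, layout, and bare "claude"/"codex"/"gemini" launch commands below are illustrative assumptions, not AJ's exact configuration:

```shell
# One tmux session, one pane per agent, so every agent's output stays
# visible. Names and layout are illustrative, not AJ's actual config.
start_agent_panes() {
  local session="${1:-agents}"
  tmux new-session -d -s "$session" -n work   # detached session, window "work"
  tmux send-keys -t "$session":work.0 'claude' C-m
  tmux split-window -h -t "$session":work     # right pane for the reviewer
  tmux send-keys -t "$session":work.1 'codex' C-m
  tmux split-window -v -t "$session":work.1   # third pane below it
  tmux send-keys -t "$session":work.2 'gemini' C-m
}

# Inspect the running agents with: tmux attach -t agents
```

Because the session is detached, a driver script can keep issuing send-keys to individual panes while you attach from another terminal to watch.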

Hacker News commenters pointed out that Claude Code already has built-in Agent Teams functionality with similar tmux integration. Someone reverse-engineered that protocol into claude-code-teams-mcp, a standalone MCP server that makes agent team capabilities available to any MCP client. Wes McKinney's RoboRev tool takes a complementary angle, automatically reviewing agent commits via git hooks and maintaining a persistent review queue that agents can address directly.
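The hook-plus-queue idea behind RoboRev can be illustrated generically. This is not RoboRev's actual code, just a hedged sketch of the pattern: a post-commit hook appends each commit SHA to a persistent queue file (the ".review-queue" name is made up), and a reviewing agent drains it first-in, first-out:

```shell
# Generic sketch of the git-hook + persistent-review-queue pattern.
# Not RoboRev's implementation; ".review-queue" is an invented name.

enqueue_for_review() {
  # Called from .git/hooks/post-commit: append the new commit's SHA.
  local queue="${1:-.review-queue}"
  git rev-parse HEAD >> "$queue"
}

next_review_item() {
  # FIFO pop: print the oldest queued SHA and remove it from the file.
  local queue="${1:-.review-queue}"
  [ -s "$queue" ] || return 1           # empty queue: nothing to review
  head -n 1 "$queue"
  tail -n +2 "$queue" > "$queue.tmp" && mv "$queue.tmp" "$queue"
}
```

A reviewing agent would loop on next_review_item, run its review command against each SHA, and stop when the function returns nonzero.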

But there's the Terms of Service problem. OpenAI, Anthropic, and Google all require access through approved channels. Automating through CLI workarounds could violate those terms, raising potential CFAA and DMCA concerns. Account suspension is the most likely consequence.

AJ raises a more fundamental question too. When LLMs talk to each other, they produce polished text easily. But does the final result actually improve, or do you just get a longer chain of confident-sounding output? He's framing this as a workflow worth testing, not a universal solution.