InstaVM has released CodeRunner, an open-source sandbox that runs AI coding agents inside VM-isolated containers on Apple Silicon Macs. The tool supports Claude Code, Claude Desktop, OpenCode, Gemini CLI, Kiro, and OpenAI Python Agents, and is built on Apple's own open-source container runtime (apple/container), which provides full virtual machine-level isolation per container rather than the namespace- and cgroup-based isolation used by conventional container runtimes. The project addresses a concrete operational risk in agentic workflows: autonomous coding agents with broad filesystem and network access can cause data loss or exfiltration, and VM-level isolation provides a hardware-enforced boundary that process-level sandboxing does not.

The integration model is built around the Model Context Protocol. CodeRunner exposes an MCP server at a local network address, allowing any <a href="/news/2026-03-15-localagent-v0-5-0-local-first-rust-mcp-runtime">MCP-compatible agent</a> or IDE to delegate code execution to the sandboxed environment through a standard interface. For Claude Code specifically, InstaVM publishes a dedicated plugin installable through Claude Code's plugin marketplace. The plugin exposes tools including an execute_python_code function backed by a persistent Jupyter kernel, so state carries over between executions within a session, as well as Playwright-based web scraping and a skills system for PDF manipulation, image processing, and Office document handling. OpenCode and Kiro integrate via direct MCP server configuration, while Claude Desktop connects through a local proxy script.
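Under MCP, a tool invocation such as execute_python_code travels as a JSON-RPC 2.0 `tools/call` request. A minimal sketch of what an agent sends to the sandbox, assuming a `code` argument name (the argument schema here is hypothetical; only the tool name comes from the article):

```python
import json

def make_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as defined by the MCP spec."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Ask the sandbox to execute a snippet. Because the tool is backed by a
# long-lived Jupyter kernel, a later call in the same session can reference
# variables defined here.
request = make_tool_call("execute_python_code", {"code": "x = 41\nprint(x + 1)"})
print(request)
```

Any client that can produce requests of this shape and speak an MCP transport can delegate execution to CodeRunner, which is what makes the same server usable from Claude Code, OpenCode, Kiro, and other agents without per-tool integrations.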

Apple's apple/container runtime, released as open-source software for Apple Silicon, provides hardware-enforced isolation by building directly on the Apple Virtualization framework and booting each container in its own lightweight VM, rather than running all containers inside a single shared Linux VM as Docker Desktop does. InstaVM's roadmap lists AWS Firecracker as the intended Linux equivalent. The Apple Silicon requirement limits the current audience to Mac users on M-series hardware.

InstaVM also ships a companion project, coderunner-ui, a local-first AI chat workspace that connects to local models via Ollama or to remote APIs from OpenAI, Google, and Anthropic, while executing generated code inside the same Apple Container VM. The project appeared as a Show HN submission on Hacker News, and the GitHub repository is available at github.com/instavm/coderunner under the Apache 2.0 license.