When Rootly AI Labs shipped On-Call Health, the most telling design decision wasn't the burnout score or the Slack integration. It was that the team built an MCP server into it from the start.

The oncallhealth-mcp package lets AI assistants — Claude, Cursor, Windsurf — query on-call risk data directly through the oncallhealth.ai REST API using only an API key. Version 1.1 dropped the requirement for direct database access or VPN connectivity. In practice that means an AI assistant can surface burnout risk during an incident review or sprint planning session without anyone first navigating to a separate dashboard. The workflow inversion is the point.
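To make the access pattern concrete, here is a minimal sketch of what API-key-only access to a REST endpoint looks like from the client side. The base URL path, endpoint shape, and header scheme below are illustrative assumptions, not the documented oncallhealth.ai API:

```python
import urllib.request

# Hypothetical base path; the real endpoint layout is not public.
API_BASE = "https://oncallhealth.ai/api/v1"

def build_score_request(api_key: str, engineer_id: str) -> urllib.request.Request:
    """Build an authenticated request for one engineer's risk data.

    The point of the v1.1 change: this needs only an API key over HTTPS,
    no database credentials and no VPN. Bearer auth is an assumption here.
    """
    url = f"{API_BASE}/scores/{engineer_id}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {api_key}"})

req = build_score_request("och_demo_key", "eng-42")
```

An MCP server wraps calls like this as tools the assistant can invoke mid-conversation, which is what collapses the dashboard-visit step.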

The underlying product pulls signals from wherever on-call work actually happens: incident platforms (Rootly, PagerDuty), developer tooling (GitHub, Jira, Linear), and Slack. From those inputs it produces an OCH Score on a 0–100 scale and a Score Trend. Crucially, the benchmarking is individual-baseline rather than team-average — a deliberate choice, since engineers have different natural workload tolerances and comparing against a team mean would mask deterioration in anyone who habitually runs hot.
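The individual-baseline argument can be sketched in a few lines. This is not the product's scoring formula (the OCH Score weighting is not public); it only demonstrates why comparing an engineer to their own history catches deterioration that a team mean hides:

```python
from statistics import mean, stdev

def baseline_deviation(history: list[float], current: float) -> float:
    """How far this period's load sits from the engineer's own baseline,
    measured in standard deviations of their history."""
    mu = mean(history)
    sigma = stdev(history)
    return 0.0 if sigma == 0 else (current - mu) / sigma

# An engineer who habitually runs hot: against a team average of, say,
# 15 pages, 45 looks like "their usual heavy load". Against their own
# tight baseline of ~30, it is a sharp spike worth flagging.
runs_hot_history = [30.0, 31.0, 29.0, 30.0]
spike = baseline_deviation(runs_hot_history, 45.0)
```

A steadier engineer at a similar absolute load would score near zero, which is exactly the masking effect a team-mean comparison produces.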

OpenAI and Anthropic APIs feed a pattern-detection layer that generates natural-language explanations for each score: which combination of after-hours pages, incident volume, and PR activity is driving a given trend. That's where On-Call Health diverges from tools like OpsLevel or Cortex, which track service ownership and developer experience metrics but don't attempt to model individual wellbeing trajectories over time.
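One plausible shape for that layer: rank the baseline-relative signals by magnitude, then hand them to an LLM as a prompt. Everything below is an assumption for illustration; the signal names, the z-score framing, and the prompt wording are hypothetical, not the product's internals:

```python
def rank_drivers(signals: dict[str, float]) -> list[str]:
    """Order signals by how far they deviate from the engineer's norm.
    Values are assumed to be baseline-relative z-scores."""
    return sorted(signals, key=lambda k: abs(signals[k]), reverse=True)

def explanation_prompt(signals: dict[str, float], trend: str) -> str:
    """Assemble a prompt asking an LLM to narrate the score trend."""
    drivers = ", ".join(f"{k} ({signals[k]:+.1f})" for k in rank_drivers(signals))
    return (f"The engineer's Score Trend is {trend}. "
            f"Baseline-relative signals, strongest first: {drivers}. "
            "Explain in two sentences which signals drive the trend.")

prompt = explanation_prompt(
    {"after_hours_pages": 2.4, "incident_volume": 0.3, "pr_activity": -1.8},
    "declining",
)
```

The ranking step is the "pattern detection"; the LLM's job is only the natural-language framing, which is why either OpenAI or Anthropic can serve as the backend.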

The release sits alongside Rootly's broader agent push. The company announced a separate Rootly MCP Server on March 12, 2026, wiring AI assistants into its full incident management platform. On-Call Health is the people layer — not what's failing in production, but who's absorbing the human cost of keeping it running.

Whether engineering teams will be comfortable routing workforce data through it, even when self-hosting the Apache 2.0-licensed stack via Docker Compose, is an open question. But the MCP-native architecture means the path from installation to AI-integrated workflows is unusually short for a tool in this category.