In March 2026, Microsoft launched Copilot Health, an AI-powered feature embedded in its Copilot platform that aggregates personal health data from wearables, lab results, and hospital records into a unified chat interface. The product connects to medical records from over 50,000 US hospitals and healthcare organizations via a platform called HealthEx, pulls lab results through health tech company Function, and integrates wearable data from more than 50 device manufacturers, including Apple, Oura, and Fitbit. Users can query their consolidated health history, search for providers by specialty and insurance coverage, and receive appointment reminders — all within a single dashboard. Dr. Dominic King, VP of Health at Microsoft AI, framed the product as a personal health companion meant to complement physicians rather than replace them.

The central controversy is Copilot Health's deliberate position outside HIPAA compliance. Because the product operates as a direct-to-consumer experience in which users voluntarily share their own data, Microsoft does not qualify as a covered entity or business associate under HIPAA, and therefore faces none of the regulatory fines or criminal liability that hospitals and physicians would incur for mishandling the same information. Dr. King acknowledged the gap directly ahead of launch, stating that "HIPAA is not required for a direct-consumer experience like this when you're using your own data," while vaguely promising future announcements about HIPAA-aligned controls. Microsoft currently cites ISO 42001 certification — an international responsible AI standard it also holds for Microsoft 365 Copilot products — as evidence of responsible governance. ISO 42001 carries no legal penalties and places no restrictions on how Microsoft may use or share health data; it is a governance framework, not a regulatory backstop.

Microsoft has made a series of voluntary privacy commitments: health chat data is reportedly isolated from general Copilot sessions and is not used to train AI models, and users can delete their data or disconnect sources at any time. The operative word, however, is voluntary. These commitments carry no regulatory enforcement mechanism and can be revised unilaterally through a privacy policy update. The result is an asymmetry: Copilot Health centralizes highly sensitive personal data — cholesterol levels, hospital visit histories, biometric readings — while bearing far lighter legal accountability than any licensed healthcare provider handling the same records. Microsoft has been here before. HealthVault, its previous personal health record platform, ran for 12 years before Microsoft shut it down in 2019, citing low consumer adoption. Dr. King has said further announcements on HIPAA-aligned controls are coming — an implicit acknowledgment that the current compliance posture is a starting point, not a final answer.