Microsoft Copilot is now ingesting personal health data — Apple Health readings, electronic health records, wearable device output — and using it to offer medication reminders, symptom analysis, and personalized medical guidance. A March 2026 New York Times analysis urges consumers to slow down before connecting their records.

The central problem is legal, not just technical. HIPAA governs how hospitals, insurers, and doctors handle patient data. It does not clearly cover consumer-facing AI assistants. Copilot, as a general-purpose tool operating outside the traditional covered-entity framework, may not be bound by the same rules a hospital portal would be. Consumers who assume their health data retains HIPAA protection once it is fed into <a href="/news/2026-03-14-microsoft-copilot-health-centralizes-personal-medical-records-outside-hipaa">an AI chatbot</a> may be wrong — and Microsoft's data-sharing terms don't resolve the ambiguity.

That gap matters because the data flowing in is serious: diagnoses, prescriptions, lab results, biometrics. Once it moves into a platform governed by a consumer software agreement rather than a healthcare privacy law, the protections are whatever the terms of service say they are.

Accuracy is a separate problem. Copilot, like every large language model deployed today, hallucinates. It misreads context, conflates conditions, and can generate confident-sounding guidance that is simply wrong. In a productivity app, a hallucinated meeting summary is an inconvenience. In a health context, a misread medication interaction is something else.

The NYT piece draws a line from email and calendar integrations to financial account access and now medical records. Each expansion followed the same arc: convenience pitched first, implications examined later. <a href="/news/2026-03-14-microsoft-copilot-health-launches-ehr-wearable-data-health-insights">Copilot's health features</a> are the latest step, and the one with the highest stakes so far.

Congress has not updated HIPAA to address consumer AI tools. The FTC has signaled interest in health data privacy but has not issued rules specific to AI assistants. Until one of those things happens, users connecting their health records to Copilot are operating without a clear safety net — and largely without knowing it.