Question of the Day
One question per day to look beyond the headlines.
Why would Microsoft promise Copilot Health won’t train on your medical data yet still require a waitlist?
Take-away: A “no training on your data” pledge doesn’t reduce launch risk; the waitlist gates access so Microsoft can validate security and privacy controls at production scale.
Microsoft’s Copilot Health promises not to train AI models on individual health data, a pledge that aligns with the company’s broader privacy commitments emphasizing user control over data and transparency in how personal information is handled [1], [3]. The waitlist serves a different purpose: it is likely a strategic measure to manage initial demand and control the pace of the rollout, giving Microsoft time to validate the system’s robustness and fine-tune privacy and security controls while monitoring user engagement under a limited release [1], [2].
- [1] Microsoft launches Copilot Health, a dedicated space for personal health data and AI-driven insights | Fortune (fortune.com)
- [2] Microsoft launches Copilot Health to help consumers understand their medical data | SiliconANGLE (siliconangle.com)
- [3] Your Data, Your Choices: Understanding Microsoft’s Privacy Commitments | Microsoft Tech Community (techcommunity.microsoft.com)