Microsoft OneDrive's New AI Feature Raises Privacy Concerns

10/12/2025

Microsoft has introduced a new artificial intelligence capability in its OneDrive cloud storage service that automatically identifies faces in uploaded photographs to help users organize images of friends and family. However, this feature, which is enabled by default, comes with a significant caveat: users may turn it off only three times per year. The restriction has sparked considerable debate over user autonomy and data privacy, especially given the growing integration of AI across Microsoft's product ecosystem.

New AI Facial Recognition Feature in OneDrive Raises Questions About User Privacy and Control

In a recent development that has garnered attention from technology observers, Microsoft has begun integrating an AI-driven facial recognition tool into its OneDrive cloud storage. This functionality, designed to streamline photo organization by recognizing individuals, was first noticed by a user who uploaded an image from their mobile device. Upon reviewing the privacy settings, the user discovered the "people section" feature, which leverages AI to identify faces. What proved particularly contentious was the accompanying disclaimer stating, "You can only turn off this setting three times a year."

This policy has ignited discussions across various platforms, including a notable exchange on Slashdot. While some speculate that the limitation might serve a practical purpose—such as managing the computational resources required to delete biometric data in compliance with regulations like GDPR, as suggested by a commenter named AmiMoJo—privacy advocates remain uneasy. The core of their concern lies in the feature being opt-out rather than opt-in, placing the onus on users to disable a service that collects biometric data.

When questioned, Microsoft declined to elaborate on the rationale behind the three-times-a-year restriction, simply stating that "OneDrive inherits privacy features and settings from Microsoft 365 and SharePoint, where applicable." This response has done little to assuage fears, especially for organizations like the Electronic Frontier Foundation. Thorin Klosowski, a security activist with the EFF, emphasized the importance of transparency and user choice, arguing that any privacy-related feature should be opt-in, accompanied by clear documentation detailing the associated risks and benefits.

The introduction of this AI-powered facial recognition feature, coupled with its restrictive opt-out policy, is viewed by many as part of a broader trend where AI increasingly encroaches upon personal privacy. In the context of Microsoft's aggressive push for AI integration—including making Copilot a mandatory component of Microsoft 365 and reportedly pressuring employees to utilize AI tools—this OneDrive update appears to be another data point in a pattern that warrants close scrutiny regarding the evolving balance between technological advancement and individual privacy rights.

Microsoft's new facial recognition feature in OneDrive offers a critical moment for reflection on the intersection of artificial intelligence, personal privacy, and corporate responsibility. From a user's perspective, the default activation of biometric data collection, paired with a stringent limit on opting out, feels like a subtle erosion of control over personal information. This raises fundamental questions about what constitutes informed consent in the digital age. Companies rolling out powerful AI capabilities, particularly those involving sensitive data like biometrics, have a responsibility to prioritize user agency. An opt-in model, where users actively choose to enable such features after being fully informed of their implications, would foster greater trust and respect for individual privacy. This situation highlights the ongoing challenge of balancing technological innovation with ethical considerations, urging both consumers and regulators to remain vigilant about how our digital identities are managed and protected.