Microsoft is further tightening security around its AI assistant Copilot. Within Microsoft 365, it will now be possible to keep sensitive Office documents completely out of Copilot’s reach, whether they are stored locally or in the cloud.
This closes a significant protection gap that previously existed between local storage and SharePoint or OneDrive, BleepingComputer reports.
Until now, data loss prevention (DLP) within Microsoft Purview only worked for documents in Microsoft’s cloud services. Files stored on laptops or desktops fell outside that scope. In practice, this meant Copilot could analyze locally stored documents even when organizations had strict security rules in place. Microsoft is now putting an end to that limitation.
The change will be implemented between the end of March and the end of April 2026 via an update to the Office architecture. Once this rollout is complete, DLP rules will apply to all Word, Excel, and PowerPoint files, regardless of their storage location. According to Microsoft, this step aligns better with the expectations of organizations that want to apply their security policies consistently, without distinguishing between local and cloud files.
Automatic blocking of sensitive documents
When a document is classified as sensitive or restricted by the security policy, Copilot cannot view or process it. Organizations that have already set up policies to block AI processing of labeled content will automatically benefit from the extension. No additional configuration or management actions are required.
Technically, nothing changes in the functionality of Copilot itself. Microsoft has implemented the change in Office clients and in the underlying components that read security labels. Where Copilot previously relied on cloud connections to determine whether a document was protected, the label can now be retrieved directly from the application. This allows security policies to be enforced even on files that have never been stored in SharePoint or OneDrive.
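The decision logic described above amounts to a simple local check: read the document's sensitivity label from the file itself and refuse AI processing if the policy blocks that label. The sketch below illustrates the idea only; the label names, function, and policy set are hypothetical, not Microsoft's actual implementation.

```python
# Hypothetical sketch of client-side label enforcement. The real Office
# label format and enforcement code are not public; all names here are
# invented for illustration.

# Labels that an (assumed) DLP policy marks as off-limits to AI processing
BLOCKED_LABELS = {"Confidential", "Highly Confidential"}

def copilot_may_process(document_labels: set[str]) -> bool:
    """Return True only if none of the document's sensitivity labels
    are blocked by the DLP policy."""
    return not (document_labels & BLOCKED_LABELS)

# Because the label travels with the file itself, the decision needs
# no cloud lookup -- it works the same for a document that has never
# touched SharePoint or OneDrive.
print(copilot_may_process({"General"}))              # True
print(copilot_may_process({"Highly Confidential"}))  # False
```

The key design point, as described above, is that the label is read from the document in the client rather than fetched over a cloud connection, which is what lets the policy apply to purely local files.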
The timing of the announcement is noteworthy. Microsoft recently encountered a bug in Copilot Chat that allowed the AI assistant to analyze and summarize confidential emails for several weeks, even though they were protected by active DLP rules. The messages were located in the Sent Items and Drafts folders, which are normally inaccessible to automated processing.
According to Microsoft, the impact was limited to users who already had rights to view the emails in question. Nevertheless, the company acknowledged that the situation did not align with Copilot’s intended security model, which requires sensitive and protected information to be explicitly excluded from AI access.
By extending DLP to all storage locations, Microsoft is taking a step toward further reducing such risks. For organizations that use Copilot within Microsoft 365, this means greater control over what the AI assistant can and cannot access, without changing their existing security policies.