What mature data governance looks like
Because it’s part of Microsoft 365, Copilot can access all of your organisation’s emails,
calendars, files and other data to provide insights to users across your business. As a result,
you’ll need that data to be standardised, accurate and reliable to get the most out of Copilot.
With mature data governance, you can be confident you’re working from a single, consistent
source of truth while minimising risk.
One key requirement is having the proper permissions in place. This ensures that different
Copilot users across your organisation have what Microsoft calls “just enough access”. It’s a
zero trust approach that means employees can access only the data they need to do their jobs,
which helps to reduce the risks of oversharing. Fortunately, Microsoft 365 already provides a rich
assortment of tools that support this approach, including Conditional Access, Data Loss
Prevention (DLP), Microsoft Entra ID Governance, Microsoft Information Protection, Privileged
Identity Management, Role-Based Access Control (RBAC) and SharePoint Advanced
Management.
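As a rough sketch of the “just enough access” idea (the role and resource names here are
hypothetical illustrations, not Microsoft’s actual APIs or role definitions), a zero trust,
deny-by-default permission check can be modelled like this:

```python
# Illustrative model of least-privilege access: each role is granted only the
# resources needed for that job, and anything not explicitly granted is denied.
# Role and resource names are made up for this example.

ROLE_GRANTS = {
    "hr-analyst": {"hr-files", "org-calendar"},
    "sales-rep": {"crm-notes", "org-calendar"},
}

def can_access(role: str, resource: str) -> bool:
    """Deny by default; allow only resources explicitly granted to the role."""
    return resource in ROLE_GRANTS.get(role, set())
```

The key design choice is the default: an unknown role, or an ungranted resource, always
resolves to “no access”, which is what limits oversharing when Copilot surfaces data on a
user’s behalf.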
Microsoft Purview Information Protection, for example, enables you to use sensitivity labels to
designate who has access to what types of data when they use Copilot. By labelling data
according to different categories – for example, general, public or confidential – you can ensure
that somebody who should have access to only general data can’t view confidential data when
they use Copilot or Copilot agents.
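The gate described above can be sketched as an ordered comparison of labels (a toy model
using the example categories from this section; real label behaviour is configured in Purview,
not implemented like this):

```python
# Toy sensitivity-label check: labels are ordered from least to most sensitive,
# and a user cleared only for "general" content cannot view "confidential"
# content. Purely illustrative; not Purview's API.

LABEL_ORDER = ["public", "general", "confidential"]

def can_view(user_clearance: str, content_label: str) -> bool:
    """Allow viewing only if the user's clearance is at least the content's label."""
    return LABEL_ORDER.index(user_clearance) >= LABEL_ORDER.index(content_label)
```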
If users outside of your organisation regularly collaborate with employees, you also need to
consider their access to data when using – for example – shared channels on Teams. By
applying sensitivity labels through Purview, you can control what kind of information can be
shared with outside users in Microsoft 365 collaborative workspaces.
You’ll also need to think about data retention policies. When people use Copilot, their prompts,
responses and citations are stored as “the content of interactions”. This information is encrypted
and not used for Copilot training. Using Microsoft Purview, you can set retention policies
that meet your compliance requirements, including whether this data is deleted and, if so,
after what length of time.
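The retention decision itself reduces to a simple date calculation, sketched below (the
function and field names are illustrative; actual policies are configured and enforced in
Purview, not written as code):

```python
# Sketch of a retention rule: given when an interaction was stored and a
# retention period in days, decide whether it is due for deletion.

from datetime import datetime, timedelta, timezone

def is_expired(stored_at: datetime, retention_days: int, now: datetime) -> bool:
    """True once the retention period has fully elapsed."""
    return now >= stored_at + timedelta(days=retention_days)
```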
It’s important to note that your Copilot deployment will follow the same compliance standards as
your company’s existing Microsoft 365 implementation. So, depending on where your Copilot
users are located, the data they access and generate might fall under various data security,
privacy and sovereignty regulations.
Finally, remember that Microsoft 365 Copilot, like other generative AI tools, can at times
produce results that aren’t 100% factual. People in your organisation should be aware of this
risk and understand how to prevent unwanted outcomes when using Copilot. The best strategy
here? Make it a policy to always manually review results to verify that they are correct and
meet your organisation’s requirements around accuracy, copyright, data safety and other areas.