The Inform Team: Data, trust and Microsoft 365 Copilot with Nikki Chapple

Our MVP, Nikki Chapple, joined The Inform Team’s webinar on data and trust in Microsoft 365 Copilot as the guest speaker.

Before you roll out prompts and pilots, your Data Protection Officer is already asking the big questions. Where’s the data going? Who can see what? What does Copilot actually do with sensitive info? This session is your starting point for answers. Straight-talking. No scare tactics.


Full Transcript:

Mark Thompson (00:00:00.00)

Hello, everybody. I’m Mark Thompson from The Inform Team, and I’d like to welcome you to our webinar, Data, trust and Microsoft 365 Copilot. As AI becomes more central to our digital workplace, Microsoft Copilot is transforming how we work. It unlocks productivity, creativity, and efficiency across Microsoft 365.

But with this innovation come some important questions: Is our data safe? Are we compliant? What happens if we don’t adopt AI responsibly?

In today’s session, we’re going to talk about trust, governance, and readiness. We’ll look at how Copilot respects your existing Microsoft 365 permissions, how it is governed by enterprise-grade compliance tools, and how you can monitor, manage, and scale it safely across your organization.

But first, let’s meet Nikki.
Nikki Chapple is a Principal Cloud Architect at CloudWay and a dual Microsoft Most Valuable Professional in Microsoft 365 and Security. Nikki has over 35 years of experience bridging IT and business, specializing in helping organizations embed Microsoft 365 data security, governance, and compliance into their digital transformation strategies and projects.

We’re so pleased to have Nikki here. Nikki, is there anything you’d add to that? And how can people get in touch with you?


Nikki Chapple (00:01:25.12)

Yes, thanks, Mark. I suppose I’d add that I’ve always been what I call a translator between IT and business. It’s really important to get the balance right between productivity and security. Too often, security teams simply say “no” and block Copilot or similar tools, and then we risk shadow AI.

My role is to break down complex concepts into business language and articulate business requirements back to IT to get a balanced, risk-based approach. That’s always been important to me.

If people want to get hold of me, I’m on LinkedIn—just search for Nikki Chapple. I also have a blog at nikkichapple.com, where I post a lot about Microsoft 365, Purview, Copilot governance, risk, and compliance. I also co-host a podcast, All Things M365 Compliance, with Ryan John Murphy from Microsoft. You can subscribe on Spotify or watch the video version on YouTube.

Thank you for having me today. It’s a real pleasure to be here.


Mark Thompson (00:02:35.14)

It’s a privilege for us, Nikki. I’ve seen you speak at conferences many times, and you really humanize what is often a complicated topic. We’re grateful you’re here.

Let’s get started. One of the most common concerns we hear is: What if Copilot shows me something I shouldn’t see? That’s a fair question—especially when data sensitivity and access control are critical.

Nikki, can Copilot access or expose data I shouldn’t see?


Nikki Chapple (00:03:10.13)

That’s a good question. Copilot doesn’t introduce any new risk in itself; it can only see data that you already have access to. If you have access to a SharePoint site, a Microsoft Team, or a file someone has shared with you, Copilot can use that same data.

However, we need to unpick that a bit. Many organizations aren’t governing their Microsoft 365 tenants properly—membership reviews, file permissions, and access are often not well-maintained. If you’re in a Team with a million files, for example, you may be able to see files you shouldn’t, and Copilot will surface them because you already have access.

My analogy is that search used to be like finding a needle in a haystack. You had to know which SharePoint site to look in, what the file was called, and be clever with search. With Copilot, it’s like standing outside the haystack with a magnet—all the needles you already have access to will come to you.

For example:
“Show me information about pay-rise decisions for company board members.”
Copilot will surface anything you already have access to across your OneDrive, email, and SharePoint sites.

Another big and often overlooked risk: public Teams and Microsoft 365 Groups. Public Teams allow anyone in the organization to self-join. But even if you never join, the underlying SharePoint site usually grants access to “Everyone except external users,” meaning you already have access without realizing it, and therefore so does Copilot.

My top tip: review your public Teams and Groups. Some may be appropriate (like company-wide social spaces), but if something like HR is set to Public, change it to Private immediately.
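
One practical way to act on that tip is to pull a list of Public Microsoft 365 Groups and review it against the spaces that are meant to be open. The Python sketch below uses Microsoft Graph to do that; it assumes you can supply an access token with Group.Read.All (read here from a GRAPH_TOKEN environment variable, which is just this example’s convention) and is a starting point for a review rather than a finished tool.

```python
# Minimal sketch (assumed setup): list Public Microsoft 365 Groups via
# Microsoft Graph so you can review which Teams anyone in the organization
# can join or already see. Requires an access token with Group.Read.All;
# here it is read from a GRAPH_TOKEN environment variable.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}

def public_m365_groups():
    """Yield (displayName, id) for every Public Microsoft 365 Group."""
    url = f"{GRAPH}/groups"
    # groupTypes/any(...) limits results to Microsoft 365 Groups ("Unified");
    # visibility is checked client-side to keep the filter simple.
    params = {"$filter": "groupTypes/any(c:c eq 'Unified')",
              "$select": "id,displayName,visibility"}
    while url:
        resp = requests.get(url, headers=HEADERS, params=params, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        for group in data.get("value", []):
            if group.get("visibility") == "Public":
                yield group["displayName"], group["id"]
        url = data.get("@odata.nextLink")  # next page already carries its query
        params = None

if __name__ == "__main__":
    for name, group_id in public_m365_groups():
        print(f"PUBLIC: {name} ({group_id})")
```

Anything on that list holding HR, finance, or other sensitive content is a candidate for switching to Private, as Nikki suggests.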


Mark Thompson (00:06:47.06)

Nikki, I think a few people might be shuddering right now realizing they may have open Teams with sensitive data. That is such an important point.

This reminds me of when Delve first came out—people blamed Delve for surfacing information, but Delve was only exposing oversharing that already existed. Copilot is similar. For now, Nikki, shall we move on?


Nikki Chapple (00:07:32.04)

Yes, let’s go to the next one.


Mark Thompson (00:07:34.06)

Visibility and protection go hand in hand. How does Copilot ensure that sensitive or regulated data stays protected while we all try to work faster and smarter?


Nikki Chapple (00:07:54.07)

With Microsoft 365 Copilot and the enterprise version of Copilot Chat, you can be confident that your data stays within your tenant, and the underlying models do not learn from your data. That’s the key difference between enterprise tools and consumer tools like ChatGPT.

When you use ChatGPT, your prompts and files may be used to train the public model. There have been real incidents where confidential information was unintentionally exposed this way.

With Copilot, Microsoft puts several guardrails in place:

  • Sensitivity Labels
    • If you’re using sensitivity labels (Internal, Confidential, Highly Confidential), Copilot respects those labels. If Copilot references multiple files with different labels, it applies the highest label (see the sketch after this list).
    • If the label has encryption or access restrictions, those are enforced too. This means if you generate a document based on a Highly Confidential source, the new document will also be labeled Highly Confidential.
  • Blocking Copilot
    • If your organization has content that should never be used with Copilot—like certain research data—you can create a sensitivity label such as “Highly Confidential – Block Copilot.” You then create a DLP rule so that:
      • Copilot Chat refuses to access or summarize that file.
      • The Copilot button inside Word/PowerPoint is disabled.
      • The user can still open the file—but Copilot cannot interact with it.
  • Data Security Posture Management for AI (DSPM for AI)
    • This tool in Microsoft Purview lets you see:
      • who is using Copilot,
      • what data is being accessed,
      • whether any sensitive information types (e.g., credit cards, addresses, credentials) are involved, and
      • usage across both Microsoft Copilot and third-party AI apps.
  • Practical Classification Approach
    • You don’t need to label millions of existing files at once. Start with:
      • new content, and
      • content that is actively edited.
    • Then work gradually through historical data. It’s manageable and far less overwhelming.
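
To make the “highest label wins” behaviour in the Sensitivity Labels point concrete, here is a small conceptual sketch. It is not Microsoft’s implementation, and the label names are just the example taxonomy used above; it simply spells out the ordering logic so you can reason about what a generated document should inherit.

```python
# Conceptual sketch only, not Microsoft's implementation: the label names are
# the example taxonomy used above, and the function just makes the
# "highest label wins" ordering explicit.
LABEL_RANK = {"Internal": 1, "Confidential": 2, "Highly Confidential": 3}

def label_for_generated_content(source_labels):
    """Return the label a document generated from these sources should inherit."""
    if not source_labels:
        return None  # unlabeled sources: fall back to your default labeling policy
    return max(source_labels, key=lambda label: LABEL_RANK.get(label, 0))

# A summary built from an Internal note and a Highly Confidential board paper
# should itself be treated as Highly Confidential.
print(label_for_generated_content(["Internal", "Highly Confidential"]))
```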

Mark Thompson (00:15:39.06)

Great insights. I love the idea of labeling forward and tackling the backlog over time. Let’s move on. What are the risks of not using Copilot? And how does shadow AI come into play?


Nikki Chapple (00:16:57.11)

Last year, Microsoft and LinkedIn reported that 78% of employees using AI are using their own preferred tools, such as ChatGPT, DeepSeek, or Google Gemini. People are turning to these tools because they help them get work done.

But using consumer AI tools can be risky:

  • Anything you enter may be stored or used to train public models.
  • Confidential information can be exposed.
  • Some tools are hosted in jurisdictions organizations may not trust.

We’ve seen real-world cases where companies unintentionally leaked source code or confidential customer information into consumer AI systems.

If your organization delays providing Copilot, users will find alternatives—and the longer they use them, the harder it is to change their habits.

This is not just about technology. It’s about education and change management. Users need to understand what happens to the data they put into consumer AI tools. And organizations need to provide safe, supported alternatives like Copilot.

Companies can monitor or block external AI tools through firewall rules, Defender for Cloud Apps, and DLP—but the key is giving users the right tools and guidance.
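
If you want a rough sense of how much consumer AI traffic is already leaving your network before you stand up Defender for Cloud Apps policies, a quick pass over an exported proxy or firewall log can be revealing. The sketch below is illustrative only: the CSV layout, the destination_host column, the file name, and the domain list are assumptions you would adapt to your own environment.

```python
# Rough illustration only: count outbound requests to well-known consumer AI
# domains in an exported proxy or firewall log. The CSV layout (a
# "destination_host" column) and the file name are assumptions; Defender for
# Cloud Apps and DLP remain the supported ways to do this at scale.
import csv
from collections import Counter

CONSUMER_AI_DOMAINS = {
    "chat.openai.com", "chatgpt.com", "gemini.google.com", "chat.deepseek.com",
}  # extend with whatever matters in your environment

def count_consumer_ai_hits(log_path="proxy_export.csv"):
    hits = Counter()
    with open(log_path, newline="") as log_file:
        for row in csv.DictReader(log_file):
            host = row.get("destination_host", "").lower()
            if any(host == d or host.endswith("." + d) for d in CONSUMER_AI_DOMAINS):
                hits[host] += 1
    return hits

if __name__ == "__main__":
    for host, count in count_consumer_ai_hits().most_common():
        print(f"{host}: {count} requests")
```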


Mark Thompson (00:22:00.08)

Exactly. If you don’t adopt Copilot, people will turn to other tools—and the longer you wait, the more they build habits elsewhere.

Let’s talk about transparency. How do we monitor and audit Copilot activity?


Nikki Chapple (00:23:15.06)

All Copilot activity is logged within Microsoft 365. Prompts and responses are captured for security, compliance, and eDiscovery.

Audit log retention:

  • 180 days by default
  • 1 year or more with E5/A5 or additional retention options

Tools you can use:

  • DSPM for AI for detailed insights across Copilot and third-party AI
  • Microsoft 365 Admin Center for usage and adoption reporting
  • Viva Insights, which includes Copilot analytics by department and can combine quantitative data with surveys

As adoption grows, you want to see:

  • Copilot usage increase, and
  • consumer AI usage decrease.
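
As a simple starting point for that kind of reporting, the sketch below summarises Copilot events per user from a CSV export of a Purview audit log search. The column names and the way Copilot operations are identified reflect a typical export; verify both against your own tenant’s export before relying on the numbers.

```python
# Minimal sketch: summarize Copilot activity per user from a CSV export of a
# Purview audit log search. The column names (UserIds, Operations) match a
# typical export, and Copilot events are identified by operation name; verify
# both against your own tenant's export before relying on the numbers.
import csv
from collections import Counter

def copilot_activity_by_user(export_path="AuditLogSearchExport.csv"):
    per_user = Counter()
    # utf-8-sig handles the byte-order mark that Microsoft 365 CSV exports often include
    with open(export_path, newline="", encoding="utf-8-sig") as export_file:
        for row in csv.DictReader(export_file):
            if "copilot" in row.get("Operations", "").lower():
                per_user[row.get("UserIds", "unknown")] += 1
    return per_user

if __name__ == "__main__":
    for user, count in copilot_activity_by_user().most_common(20):
        print(f"{user}: {count} Copilot events")
```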

Mark Thompson (00:26:48.05)

Great. Now, many organizations worry their data governance isn’t perfect. Can they still move forward with Copilot?


Nikki Chapple (00:27:25.05)

Yes, you absolutely can. And you should.

Copilot doesn’t create new risks—it reveals existing ones. If you wait for perfect governance, users will continue using shadow AI tools and your risk increases.

You can roll out Copilot in phases while improving governance in parallel.

Key steps you can take immediately:

  • Enable sensitivity labels for new and edited files.
  • Review SharePoint and Teams permissions, especially public Teams (a quick site inventory like the sketch after this list is a practical starting point).
  • Use SharePoint Advanced Management features like Restricted SharePoint Search or Restricted Content Discovery, which limit what Copilot can surface.
  • Follow the Microsoft Purview Secure-by-Default blueprint.
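
For the permissions review in particular, it helps to start from an inventory of what exists. The sketch below pulls a basic list of SharePoint sites from Microsoft Graph; it assumes an access token with Sites.Read.All (again read from a GRAPH_TOKEN environment variable for the example) and uses the tenant-wide site search, so treat it as a first pass rather than a complete audit.

```python
# Minimal sketch, not a complete audit: build a basic inventory of SharePoint
# sites via Microsoft Graph as a starting point for a permissions review.
# Assumes an access token with Sites.Read.All, read here from a GRAPH_TOKEN
# environment variable (this example's convention).
import os
import requests

HEADERS = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}

def list_sites():
    """Yield (displayName, webUrl) for sites returned by the tenant-wide search."""
    url = "https://graph.microsoft.com/v1.0/sites?search=*"
    while url:
        resp = requests.get(url, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        for site in data.get("value", []):
            yield site.get("displayName", ""), site.get("webUrl", "")
        url = data.get("@odata.nextLink")  # follow paging until exhausted

if __name__ == "__main__":
    for name, web_url in list_sites():
        print(f"{name}\t{web_url}")
```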

It’s too late to wait. If you don’t provide the right tools, employees will adopt their own.


Mark Thompson (00:30:04.11)

I love that phrase—It’s too late to wait. With consumer AI so widespread, the demand is already there. We’re nearly out of time. Nikki, any final thoughts before we move to questions?


Nikki Chapple (00:31:41.09)

Yes. If you haven’t started your Copilot journey yet and you’re not blocking third-party AI, then you already have an issue, not just a risk: your users are already using other AI tools.

Go into the Data Security Posture Management for AI portal in Purview. If you have E5 licenses, you can also connect endpoints and track third-party AI usage. Seeing that data can be a wake-up call. It helps you demonstrate the real-world risks and justify bringing activity into Copilot where it’s governed and secure.


Mark Thompson (00:31:41.09)

Thank you, Nikki. And thank you to everyone who joined us today.
