M365

444 bookmarks
Protect Your Company Data BEFORE Copilot Takes Over!
In this video, we dive into the essential steps for securing your company data in SharePoint before implementing Copilot. We’ll discuss the risks of overshar...
·youtube.com·
Microsoft Purview protections for Copilot
Use Microsoft Purview and Microsoft 365 Copilot together to build a secure, enterprise-ready foundation for generative AI. Apply existing data protection and compliance controls, gain visibility into AI usage, and reduce risk from oversharing or insider threats. Classify, restrict, and monitor sensitive data used in Copilot interactions. Investigate risky behavior, enforce dynamic policies, and block inappropriate use, all from within your Microsoft 365 environment. Erica Toelle, Microsoft Purview Senior Product Manager, shares how to implement these controls and proactively manage data risks in Copilot deployments.

- Control what content can be referenced in generated responses: check out Microsoft 365 Copilot security and privacy basics.
- Uncover risky or sensitive interactions: use DSPM for AI to get a unified view of Copilot usage and security posture across your org.
- Block access to sensitive resources: see how to configure Conditional Access using Microsoft Entra.

QUICK LINKS:
00:00 — Microsoft Purview controls for Microsoft 365 Copilot
00:32 — Copilot security and privacy basics
01:47 — Built-in activity logging
02:24 — Discover and prevent data loss with DSPM for AI
04:18 — Protect sensitive data in AI interactions
05:08 — Insider Risk Management
05:12 — Monitor and act on inappropriate AI use
07:14 — Wrap up

Link References:
Check out https://aka.ms/M365CopilotwithPurview
Watch our show on oversharing at https://aka.ms/OversharingMechanics

Video Transcript:

Not all generative AI is created equal. In fact, if data security or privacy concerns are holding your organization back, today I'll show you how the combination of Microsoft 365 Copilot and the data security controls in Microsoft Purview provides an enterprise-ready platform for GenAI in your organization. This way, GenAI is seamlessly integrated into your workflow across familiar apps and experiences, all backed by unmatched data security and visibility to minimize data risk and prevent data loss.

First, let's level set on a few Copilot security and privacy basics. Whether you're using the free Copilot Chat included with Microsoft 365 or have a Microsoft 365 Copilot license, both honor your existing access permissions to work information in SharePoint and OneDrive, your Teams meetings, and your email, meaning generated AI responses can only be based on information that you have access to.

Importantly, after you submit a prompt, Copilot retrieves relevant indexed data to generate a response. That data stays within your Microsoft 365 service trust boundary and doesn't move out of it.
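That permission-trimming behavior is worth internalizing, because everything later in the video builds on it. Purely as a mental model (the index, ACL, and function names below are invented for illustration, not Microsoft's implementation), security-trimmed retrieval works roughly like this:

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """Stand-in for an indexed SharePoint or OneDrive item."""
    doc_id: str
    content: str
    allowed_users: set[str] = field(default_factory=set)  # the item's existing ACL

def retrieve_for_user(index: list[Document], user: str, query: str) -> list[Document]:
    """Security-trimmed retrieval: only items the user can already open
    are ever candidates for grounding a generated response."""
    readable = [d for d in index if user in d.allowed_users]
    # Toy relevance check; real retrieval uses semantic ranking over the index.
    return [d for d in readable if query.lower() in d.content.lower()]

index = [
    Document("hr-001", "merger planning draft report", {"alice"}),
    Document("kb-007", "expense report how-to", {"alice", "bob"}),
]

# Bob's prompt can only ever be grounded in kb-007, never hr-001,
# even though both documents match the query.
print([d.doc_id for d in retrieve_for_user(index, "bob", "report")])  # ['kb-007']
```

The practical consequence, which the oversharing discussion later comes back to, is that Copilot never creates new access; it only amplifies whatever access users already have.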
Even when the data is presented to the large language models to generate a response, the information is kept separate from the model and is not used to train it. This is in contrast to consumer apps, especially the free ones, which are often designed to collect training data. As users upload files into them or paste content into their prompts, including sensitive data, that data is duplicated and stored in a location outside of your Microsoft 365 service trust boundary, stripping away any file access controls or classifications you've applied and placing your data at greater risk. And beyond being stored there for indexing or reasoning, it can be used to retrain the underlying model.

Next, adding to the foundational protections of Microsoft 365 Copilot, Microsoft Purview has activity logging built in and helps you to:
- discover and protect sensitive data, giving you visibility into current and potential risks, such as the use of unprotected sensitive data in Copilot interactions;
- classify and secure data, where Information Protection automatically classifies data and applies sensitivity labels, ensuring it remains protected even when it's used with Copilot;
- detect and mitigate insider risks, alerting you to employee activities with Copilot that pose a risk to your data; and much more.

Over the next few minutes, I'll focus on Purview capabilities to get ahead of and prevent data loss and insider risks. We'll start in Data Security Posture Management for AI, or DSPM for AI for short. DSPM for AI is the one place to get a rich, prioritized bird's-eye view of how Copilot is being used inside your organization and to discover the corresponding risks, along with recommendations to improve your data security posture that you can implement right from the solution. Importantly, this is where you'll find detailed dashboards for Microsoft 365 Copilot usage, including agents.

Then in Activity Explorer, we make it easy to see recent AI interactions that include sensitive information types, like credit cards, ID numbers, or bank accounts. You can drill into each activity to see details, as well as the prompt and response text generated. One tip here: if you are seeing a lot of sensitive information exposed, it points to an oversharing issue, where people have access to more information than necessary to do their job. If you find yourself in this situation, I recommend you also check out our recent show on the topic at aka.ms/OversharingMechanics, where I dive into the specific things you should do to assess your Microsoft 365 environment for potential oversharing risks and ensure the right people can access the right information when using Copilot.

Ultimately, DSPM for AI gives you the visibility you need to establish a data security baseline for Copilot usage in your organization and helps you put preventative measures in place right away. In fact, without leaving DSPM for AI, the recommendations page surfaces the policies we advise everyone to use to improve data security, such as this one for detecting potentially risky interactions using Insider Risk Management, and other recommendations, like this one to detect potentially unethical behavior using communication compliance policies. From there, you can dive into Microsoft Purview's best-in-class solutions for more granular insights and to configure specific policies and protections.
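The sensitive information types that Activity Explorer surfaces (credit cards, ID numbers, bank accounts) are detected by classifiers. As a toy approximation only — Purview's real detectors combine patterns, keywords, checksums, and confidence levels — a credit-card detector over prompt and response text might look like this:

```python
import re

def luhn_valid(number: str) -> bool:
    """Luhn checksum, the validation step behind most credit-card detectors."""
    digits = [int(d) for d in number][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

# Candidate runs of 13-16 digits, optionally separated by spaces or dashes.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def find_card_numbers(text: str) -> list[str]:
    """Return candidate card numbers found in prompt or response text."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(digits)
    return hits

prompt = "Summarize the refund for card 4111 1111 1111 1111 issued yesterday."
print(find_card_numbers(prompt))  # ['4111111111111111'] -> flag this interaction
```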
I'll start with Information Protection. You can manage data security controls with Microsoft 365 Copilot in scope, using the information protection policies and sensitivity labels you have in use today. In fact, by default, any Copilot response using content with sensitivity labels will automatically inherit the highest-priority label of the referenced content. And using data loss prevention policies, you can prevent Copilot from processing any content that has a specific sensitivity label applied. This way, even if users have access to those files, Copilot will effectively ignore that content as it retrieves relevant information from Microsoft Graph to generate responses.

Insider Risk Management helps you to catch data risk based on trending activities of people on your network, using established user risk indicators and thresholds, and then uses policies to prevent accidental or intentional data misuse as they interact with Copilot. You can easily create policies from quick policy templates, like this one looking for high-risk data leak patterns from insiders. By default, this quick policy will scope all users and groups, with a defined triggering event of data exfiltration, along with activity indicators including external sharing, bulk downloads, label downgrades, and label removal, in addition to other activities that indicate a high risk of data theft.

And it doesn't stop there. As individuals perform more risky activities, those can add up to elevate that user's risk level. Here, instead of manually adjusting data security policies, you can use Adaptive Protection controls to limit Copilot use depending on a user's dynamic risk level, for example, when a user exceeds your defined risk condition thresholds to reach an elevated risk level, as you can see here. Using Conditional Access policies in Microsoft Entra, in this case based on authentication context, together with the insider risk condition you set in Microsoft Purview, you can choose to block access when a user attempts to open sites with a specific sensitivity label. That way, even if a user is granted access to a SharePoint site by an owner, their access will be blocked by the Conditional Access policy you set. Again, this matters because Copilot honors the user's existing permissions: Copilot will not return information that they do not have access to.

Next, Communication Compliance is a related insider risk solution that can act on potentially inappropriate Copilot interactions. In fact, there are specific policy options for Microsoft 365 Copilot interactions in Communication Compliance, where you can flag jailbreak or prompt injection attempts using Prompt Shields classifiers. Communication Compliance can be set to alert reviewers of that activity so they can easily discover policy matches and take corresponding actions. For example, if…
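The transcript is cut off above, but the two retrieval-time behaviors it describes earlier, label inheritance and DLP-based exclusion, compose neatly and are compact enough to sketch. The label names, priorities, and DLP rule below are hypothetical stand-ins, not Purview's actual policy engine:

```python
from dataclasses import dataclass

@dataclass
class LabeledDoc:
    doc_id: str
    label: str

# Higher number = higher priority, mirroring label ordering in Purview.
LABEL_PRIORITY = {"General": 0, "Confidential": 1, "Highly Confidential": 2}

# Hypothetical DLP rule: never let Copilot process "Highly Confidential" items.
DLP_EXCLUDED_LABELS = {"Highly Confidential"}

def grounding_set(retrieved: list[LabeledDoc]) -> list[LabeledDoc]:
    """Drop DLP-excluded content before it can ground a response,
    even though the user may have access to those files."""
    return [d for d in retrieved if d.label not in DLP_EXCLUDED_LABELS]

def response_label(grounding: list[LabeledDoc]) -> str:
    """The response inherits the highest-priority label among referenced content."""
    return max((d.label for d in grounding),
               key=LABEL_PRIORITY.__getitem__, default="General")

docs = [
    LabeledDoc("board-deck", "Highly Confidential"),
    LabeledDoc("q3-summary", "Confidential"),
    LabeledDoc("faq", "General"),
]
usable = grounding_set(docs)       # board-deck is filtered out by the DLP rule
print([d.doc_id for d in usable])  # ['q3-summary', 'faq']
print(response_label(usable))      # 'Confidential'
```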
·techcommunity.microsoft.com·
Getting started with the new Purview Content Search
“I’m looking to get started with the new Content Search experience in Purview. Where do I get started?” Welcome to the exciting new world of Content Search!...
·techcommunity.microsoft.com·
Purview Priority Cleanup Sounds Scary
The idea is scary for eDiscovery folks, but the scary part might be how hard it is to accomplish!
Once again, we see the challenge of large environments: different stakeholders have different requirements. This is Microsoft attempting to balance the competing needs of retention, eDiscovery, and security.
·mcbridem365.substack.com·
Researcher agent in Microsoft 365 Copilot
Gaurav Anand, CVP, Microsoft 365 Engineering. Recent advancements in reasoning models are transforming chain-of-thought-based iterative reasoning,...
·techcommunity.microsoft.com·
Another day, another Purview solution: Data Security Investigations
You know that saying "another day, another dollar"? Pretty sure when the songwriters came up with that, they weren't talking about handing over your hard-earned cash to big tech. Yet here we are, watching Microsoft introduce another premium pay-as-you-go product that will probably have you paying a little more. Data Security Investigations - AI-powered deep content analysis: useful or just another expense? Look, I'm a firm believer that organisations should be putting their money into preventative...
·welkasworld.com·
Microsoft 365 subscribers can now experience Copilot in OneDrive
Recently, we added new Copilot value to our Microsoft 365 Personal and Family subscriptions. We are excited to announce our rollout of new Copilot features...
·techcommunity.microsoft.com·
The Risk behind Planner - Lack of Data Retention
Microsoft Planner data is not covered by Microsoft 365 retention policies, leaving task content outside your organisation's retention and eDiscovery controls.
·office365atwork.com·
Facilitator Agent Brings AI Notes to Teams Chat
The Facilitator agent can make sense of the messages posted to a Teams chat and summarize the discussion and extract to-do items and unanswered questions.
·office365itpros.com·