Free Microsoft SC-200 Practice Test Questions (MCQs)
Stop wondering whether you're ready. Our Microsoft SC-200 practice test is designed to identify your exact knowledge gaps. Validate your skills with Microsoft Security Operations Analyst questions that mirror the real exam's format and difficulty, then build a personalized study plan based on your performance on these free SC-200 exam MCQs, focusing your effort where it matters most.
Targeted practice like this helps candidates feel significantly more prepared for Microsoft Security Operations Analyst exam day.
21560+ already prepared
Updated On: 3-Mar-2026 · 156 Questions
Microsoft Security Operations Analyst
4.9/5.0
Topic 1: Contoso Ltd
You have a Microsoft 365 E5 subscription that contains 200 Windows 10 devices enrolled
in Microsoft Defender for Endpoint.
You need to ensure that users can access the devices by using a remote shell connection
directly from the Microsoft 365 Defender portal. The solution must use the principle of least
privilege.
What should you do in the Microsoft 365 Defender portal? To answer, select the
appropriate options in the answer area.
NOTE: Each correct selection is worth one point.


Explanation:
The requirement is to allow users to access devices via a remote shell connection directly from the Microsoft 365 Defender portal. This feature is called Live Response.
Let's break down why the other options are incorrect and why the selected ones are correct.
Analysis of the Options:
1. To configure Microsoft Defender for Endpoint:
Turn on endpoint detection and response (EDR) in block mode: This setting allows Defender Antivirus to block malicious artifacts even when the primary Endpoint Detection and Response (EDR) component is not active. It is a valuable security feature but is not a prerequisite for Live Response.
✅ Turn on Live Response:
This is the direct and necessary configuration to enable the remote shell functionality. Live Response must be enabled at the tenant level to allow investigators to collect forensic data and run scripts on endpoints remotely.
Turn off Tamper Protection: Tamper Protection is a critical security feature that prevents malicious apps from disabling your antivirus protection. Turning it off would weaken the security posture and is completely unrelated to enabling Live Response. The principle of least privilege is achieved through device groups and roles, not by disabling core security features.
2. To configure the devices:
Add a network assessment job:
This is a feature for identifying and managing network vulnerabilities. It has no relation to establishing a remote shell session with a device.
✅ Create a device group that contains the devices and set Automation level to Full: This is the correct configuration to apply the principle of least privilege. Device Groups are used to scope permissions. You can create a group containing only the 200 specific devices that require remote shell access.
The Automation level determines what actions the security system can take automatically. For Live Response to function correctly, the device must be set to at least "Semi" to allow approved scripts to run. Setting it to "Full" ensures that all remediation actions (including those initiated via Live Response) are carried out without waiting for user consent. This is necessary for seamless remote shell operation.
Create a device group... and set Automation level to No automated response: This setting prevents any automated remediation, which would block the execution of commands and scripts through the Live Response session, making the remote shell feature ineffective.
Summary of the Solution:
Enable the Feature: You first enable the Live Response capability at the organization level in the Microsoft 365 Defender portal.
Scope and Permit the Action:
You then create a device group containing the specific 200 devices. By setting the Automation level for this group to "Full," you authorize automated remediation actions, which is a prerequisite for the Live Response remote shell to function on those devices. This follows the principle of least privilege by only granting these elevated permissions to the specific devices that need it, rather than the entire organization.
Reference
Microsoft Learn:
Configure Live response capabilities in Microsoft Defender for Endpoint
This documentation covers the steps to enable Live Response and configure device groups for it.
You have an Azure subscription.
You plan to implement a Microsoft Sentinel workspace. You anticipate that you will ingest
20 GB of security log data per day.
You need to configure storage for the workspace. The solution must meet the following
requirements:
• Minimize costs for daily ingested data.
• Maximize the data retention period without incurring extra costs.
What should you do for each requirement? To answer, select the appropriate options in the
answer area. NOTE: Each correct selection is worth one point.


Explanation:
The solution must balance cost-effectiveness for daily data ingestion with the longest possible free data retention period.
Analysis for "Minimize costs for daily ingested data":
This requirement is about choosing the most cost-effective pricing model for ingesting a predictable amount of data (20 GB/day).
Apply a daily cap:
A daily cap is a safety feature that stops data ingestion once a specified limit is reached to prevent unexpected costs. It does not reduce the cost per gigabyte ingested; it only prevents exceeding a budget due to a spike in data volume. Therefore, it does not "minimize costs" for the planned 20 GB/day.
✅ Use a commitment tier:
A commitment tier (formerly called Capacity Reservation) is a pricing model where you commit to a certain amount of data ingestion per day for a discounted price per GB. Since you have a predictable, consistent volume of 20 GB/day, a commitment tier is the most cost-effective option. You would purchase a tier that covers your expected usage.
Use the Pay-As-You-Go (PAYG) model:
The PAYG model charges a standard, higher rate per GB with no commitment. This is more flexible for unpredictable workloads but is more expensive for a known, consistent data volume like 20 GB/day.
Conclusion:
To minimize cost for a predictable daily ingestion, a commitment tier is the correct choice.
Analysis for "Maximize the data retention period without incurring extra costs":
In Microsoft Sentinel, which is built on a Log Analytics workspace, data retention has a free component and a paid component.
Set retention to 31 days: This is the default retention period, so it does not maximize the retention available at no extra cost.
✅ Set retention to 90 days:
Microsoft provides 90 days of retention at no extra cost for all data in a Log Analytics workspace. This is the maximum retention period you can get without incurring additional charges.
Set retention to 365 days:
Retaining data for 365 days is possible, but it incurs an additional cost for the extra 275 days (365 - 90). This violates the "without incurring extra costs" requirement.
Conclusion:
The maximum retention period included in the base cost of your Log Analytics workspace is 90 days.
Reference:
Microsoft Learn: Azure Sentinel Pricing Details
This page explains the commitment tier pricing model and the included 90-day retention. https://azure.microsoft.com/en-us/pricing/details/azure-sentinel/ Microsoft Learn: Manage usage and costs with Azure Monitor Logs - Retention This documentation explicitly states: "Your workspace has 90 days of data retention at no charge."
A company wants to analyze documents by using Microsoft 365 Apps.
You need to describe the connected experiences the company can use.
Which connected experiences should you describe? To answer, drag the appropriate
connected experiences to the correct description. Each connected experience may be used
once, more than once, or not at all. You may need to drag the split between panes or scroll
to view content.
NOTE: Each correct selection is worth one point.


Explanation:
The connected experiences described are features within Microsoft 365 Apps, specifically Microsoft Word. Here is a breakdown of why each description matches the connected experience.
1. Provides advanced grammar and style refinements...
✅ Connected Experience: Editor
Explanation: Microsoft Editor is an AI-powered service that goes beyond basic spell check. It provides sophisticated writing assistance with advanced grammar corrections, clarity suggestions, conciseness recommendations, and refinements for formality and vocabulary. This description is the core function of the "Editor" connected experience.
2. Allows you to use and repurpose existing content from relevant files most often used by coworkers.
✅ Connected Experience: Similarity checker
Explanation: This describes the "Rewrite with Source" or content repurposing aspect of the Similarity Report. The Similarity checker, powered by Microsoft Search and organizational content, can not only find similar text but also suggest ways to reuse and properly attribute content from documents within your organization that your coworkers frequently use. This promotes knowledge sharing while maintaining originality.
3. Identifies how much content in a document is original and inserts citations when necessary.
✅ Connected Experience: Similarity checker
Explanation: This is the primary function of the Similarity checker (often integrated with tools like Researcher in Word). It scans the document against web sources and an organization's internal content to identify text that may not be original. It then generates a similarity score (the percentage of original content) and provides tools to help the user insert proper citations for the sourced material.
Reference:
Microsoft Learn:
Overview of connected experiences in Microsoft 365 Apps
This page provides a high-level overview of the different categories of connected experiences.
Microsoft Support:
Check document similarity with your organization's content
This support article directly explains the functionality described in points 2 and 3.
You have a Microsoft Sentinel workspace that contains an Azure AD data connector.
You need to associate a bookmark with an Azure AD-related incident.
What should you do? To answer, drag the appropriate blades to the correct tasks. Each
blade may be used once, more than once, or not at all. You may need to drag the split bar
between panes or scroll to view content.
NOTE: Each correct selection is worth one point.


Explanation:
The process of creating and associating a bookmark in Microsoft Sentinel involves two distinct areas of the portal, each with a specific purpose.
Analysis of the Blades:
Hunting blade: This is the dedicated area for proactive threat hunting. It provides access to live stream data and allows you to run queries. When you find interesting results from a query during a hunting session, you create a bookmark directly from this interface to save your query and its results for further investigation.
Incident blade:
This is the central console for managing security incidents. It provides a consolidated view of all related alerts, entities, and evidence for an investigation. From this blade, you can manage the components of an incident, including associating existing bookmarks that are relevant to the case.
Logs blade:
This is a general-purpose query interface for Log Analytics. While you can run hunting queries here, the primary functions for creating and managing bookmarks and incidents are housed within their dedicated blades. It is not the primary blade for these specific administrative tasks.
Step-by-Step Workflow:
Create a Bookmark (using the Hunting blade):
A Security Operations Analyst would go to the Hunting blade in Microsoft Sentinel.
They would run a KQL query to hunt for potential threats in the Azure AD data.
If the query results reveal a suspicious event or entity (e.g., a user account performing anomalous actions), the analyst would select the results and choose to create a new bookmark. This action is performed within the Hunting blade.
Associate the Bookmark with an Incident (using the Incident blade):
The analyst then navigates to the Incidents blade to view the specific Azure AD-related incident they are investigating.
They open the incident to see its details.
Within the incident's page, there is an option to associate existing bookmarks. The analyst would select the bookmark they just created from the list and associate it with the incident. This action of linking evidence to a case is performed within the Incident blade.
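A hunting query for step 1 might look like the following. The table comes from the Azure AD data connector, but the failure threshold and time window are illustrative assumptions, not part of the exam scenario:

```kql
// Illustrative hunting query: surface accounts with an unusual number of
// failed Azure AD sign-ins in the last 24 hours. The threshold of 20 is
// an arbitrary example value.
SigninLogs
| where TimeGenerated > ago(24h)
| where ResultType != "0"          // non-zero ResultType indicates a failed sign-in
| summarize FailedCount = count() by UserPrincipalName
| where FailedCount > 20
| sort by FailedCount desc
```

Suspicious rows in the results can be selected and saved as a bookmark directly from the Hunting blade, ready to be associated with the incident afterwards.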
Reference
Microsoft Learn: Hunt for threats with Microsoft Sentinel
This documentation covers the process of creating bookmarks during a hunting activity.
Microsoft Learn: Investigate incidents with Microsoft Sentinel
This page explains the incident overview and how to work with related entities and evidence, including bookmarks.
You have a Microsoft Sentinel workspace.
You have a query named Query1 as shown in the following exhibit.

You plan to create a custom parser named Parser1. You need to use Query1 in Parser1.
What should you do first?
A. Remove line 2.
B. In line 4, remove the TimeGenerated predicate.
C. Remove line 5.
D. In line 3, replace the !contains operator with the !has operator.

Answer: C. Remove line 5.
Explanation:
When you create a custom parser in Microsoft Sentinel, the query you use must end with a tabular expression statement — not with commands like sort, limit, or take.
These commands are considered presentation operators, and parsers are designed to produce reusable, structured data tables — not formatted or ordered result sets.
In this query:
OfficeActivity
| where TimeGenerated > ago(7h)
| where Operation !contains "delete"
| project TimeGenerated, UserId, Operation, OfficeWorkload, RecordType, _ResourceId
| sort by TimeGenerated desc nulls last
The last line (sort by TimeGenerated desc nulls last) is a presentation operator, not allowed in parsers.
So, to make this query valid for a custom parser, you must remove line 5.
🚫 Why Other Options Are Incorrect:
A. Remove line 2
Removing the TimeGenerated filter would make the query less efficient, but the line is valid syntax and has no bearing on parser compatibility.
B. In line 4, remove the TimeGenerated predicate
The project operator itself is fine; it defines which columns the parser outputs, which is required for structured results. Removing TimeGenerated from the column list would not fix the parser issue.
D. In line 3, replace 'contains' with '!has'
The operator !contains is valid KQL syntax; swapping it for !has would only change the filtering semantics and would not address the parser issue.
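Applying the answer, the parser-ready version of Query1 simply drops the final sort line, ending with the project statement:

```kql
// Query1 without the presentation operator, suitable for a custom parser
OfficeActivity
| where TimeGenerated > ago(7h)
| where Operation !contains "delete"
| project TimeGenerated, UserId, Operation, OfficeWorkload, RecordType, _ResourceId
```

Because it now ends with a tabular expression, the query can be saved as a function and reused as Parser1.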
📘 Reference:
Microsoft Learn: Create and use custom parsers in Microsoft Sentinel
KQL documentation: Sort operator (Kusto Query Language)
You need to use an Azure Resource Manager template to create a workflow automation
that will trigger an automatic remediation when specific security alerts are received by
Azure Security Center.
How should you complete the portion of the template that will provision the required Azure
resources? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.


This ARM template is provisioning a Workflow Automation in Microsoft Defender for Cloud (formerly Azure Security Center). This resource type is designed to automatically trigger a Logic App when specific security alerts or recommendations are generated.
Let's break down each placeholder:
1. "type": "_ /automations"
Correct Value: Microsoft.Security
Explanation: Workflow automations in Defender for Cloud are provisioned by the Microsoft.Security resource provider, so the full resource type is Microsoft.Security/automations.
2. resourceId('ITEM2/workflows' ...)
Correct Value: Microsoft.Logic
Explanation: The resourceId() function is used to get the unique identifier of an existing Azure resource. The ITEM2 placeholder represents the resource provider of the resource we are referencing. We are referencing a Logic App workflow. The resource provider for Azure Logic Apps is Microsoft.Logic. The full path for a Logic App workflow is Microsoft.Logic/workflows.
3. resourceId(... ' /workflows/triggers' ...)
Correct Value: Microsoft.Logic
Explanation: This is a continuation of the same resourceId function call. We are now getting the resource ID for the trigger of the Logic App. Triggers are a sub-resource of a Logic App workflow. Since the parent resource is a Logic App (provider: Microsoft.Logic), and we are accessing its triggers sub-resource, the provider remains Microsoft.Logic. The full path being constructed here is for the manual trigger of the Logic App, which is needed to get its callback URL (uri).
Summary
The template is doing the following:
Creating a new Microsoft.Security/automations resource (the Workflow Automation itself).
Configuring this automation to trigger an action of type LogicApp.
Referencing an existing Logic App (provider: Microsoft.Logic/workflows) and its trigger to get the URL needed to execute it.
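Putting the placeholders together, the resource section would look roughly like the following sketch. The name, scope, and API versions are illustrative assumptions, and the snippet is abridged rather than a complete deployable template:

```json
{
  "type": "Microsoft.Security/automations",
  "apiVersion": "2019-01-01-preview",
  "name": "wa-remediate-alerts",
  "location": "[resourceGroup().location]",
  "properties": {
    "isEnabled": true,
    "scopes": [ { "scopePath": "[subscription().id]" } ],
    "sources": [ { "eventSource": "Alerts" } ],
    "actions": [
      {
        "actionType": "LogicApp",
        "logicAppResourceId": "[resourceId('Microsoft.Logic/workflows', 'LA1')]",
        "uri": "[listCallbackUrl(resourceId('Microsoft.Logic/workflows/triggers', 'LA1', 'manual'), '2019-05-01').value]"
      }
    ]
  }
}
```

Note how both resourceId() calls use the Microsoft.Logic provider: once for the workflow itself and once for its manual trigger, whose callback URL the automation invokes.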
Reference:
Microsoft Learn: Microsoft.Security automation template reference
This is the official ARM template reference for the Microsoft.Security/automations resource, which shows the correct type.
Microsoft Learn: Deploy a Workflow Automation template
This tutorial provides examples of ARM templates for Workflow Automations, clearly showing the use of Microsoft.Security and Microsoft.Logic.
You have resources in Azure and Google Cloud.
You need to ingest Google Cloud Platform (GCP) data into Azure Defender.
In which order should you perform the actions? To answer, move all actions from the list of
actions to the answer area and arrange them in the correct order.


This sequence follows the logical flow of setting up a cross-cloud connection: first, you prepare the source system (GCP), then you establish the connection in the target system (Azure), and finally, you enable the specific service.
1. Enable the GCP Security Command Center API.
Reasoning: Before you can use any GCP service programmatically or configure its integration, you must first enable its API. This is a fundamental first step in Google Cloud. Without the Security Command Center API being enabled, none of the subsequent steps can function.
2. Configure the GCP Security Command Center.
Reasoning: Once the API is active, you need to configure the Security Command Center itself. This involves enabling it for your GCP organization or specific projects and turning on the security services (like Security Health Analytics and Web Security Scanner) whose findings you want to export to Azure.
3. Create a dedicated service account and a private key.
Reasoning: Azure Defender needs secure, programmatic access to your GCP environment to pull the security data. In GCP, this is done by creating a service account with the necessary permissions (e.g., Security Center Admin Viewer) and generating a private key (JSON file). This key is the credential that Azure will use to authenticate.
4. From Azure Security Center, add cloud connectors.
Reasoning: Now that GCP is prepared, you move to the Azure side. In Microsoft Defender for Cloud, you use the "Cloud connectors" page to add a new GCP connector. This is where you provide the GCP organization details and upload the private key JSON file you created in the previous step to establish the trust.
5. Enable Security Health Analytics.
Reasoning: This is the final configuration step. "Security Health Analytics" is the plan in Microsoft Defender for Cloud that corresponds to GCP's CSPM (Cloud Security Posture Management) offerings. Once the connector is established, you can enable this plan to start ingesting GCP security recommendations and compliance data. This step activates the data ingestion and analysis for your GCP resources.
Reference:
Microsoft Learn: Connect your GCP accounts to Microsoft Defender for Cloud
This official documentation outlines the process and confirms the order, starting with GCP setup (SCC, service account) and moving to Azure configuration (connector, plan).
You have a Microsoft 365 subscription that uses Microsoft 365 Defender and contains a
user named User1.
You are notified that the account of User1 is compromised.
You need to review the alerts triggered on the devices to which User1 signed in.
How should you complete the query? To answer, select the appropriate options in the
answer area.
NOTE: Each correct selection is worth one point.


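The answer-area exhibit is not reproduced above, but a query of the following shape is commonly used for this pivot. The exact tables and operators to select depend on the exhibit, so treat this as an illustrative sketch rather than the graded answer; the account name is a placeholder:

```kql
// Find the devices User1 signed in to, then pull the alerts raised on them.
AlertInfo
| join AlertEvidence on AlertId
| where DeviceId in (
    DeviceLogonEvents
    | where AccountName =~ "user1"
    | distinct DeviceId
)
```

The inner query collects the DeviceId values for User1's sign-ins, and the outer query restricts alert evidence to those devices.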
You use Azure Sentinel to monitor irregular Azure activity.
You create custom analytics rules to detect threats as shown in the following exhibit.


1. If one user deploys three Azure virtual machines, how many alerts will you receive?
✅ Answer: 3 alerts
Explanation:
The rule is designed to trigger per resource creation/update event.
If one user deploys three VMs, each deployment generates a separate log entry.
The analytics rule will match each entry individually, resulting in three alerts.
2. If three separate users deploy one Azure virtual machine each within five minutes of each other, how many alerts will you receive?
✅ Answer: 3 alerts
Explanation:
The rule does not aggregate by user or time window.
Each VM deployment by each user is a distinct event.
Therefore, the rule will trigger once per event, regardless of timing or user identity.
📚 Reference:
Create scheduled analytics rules in Microsoft Sentinel
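The rule exhibit is not shown above, but a scheduled rule that fires once per VM deployment is typically built on a query like the following. The operation and status values are assumptions based on standard AzureActivity logging, not taken from the exhibit:

```kql
// Each successful VM create/update operation is a separate row; with event
// grouping set to trigger an alert for each event, three deployments
// produce three alerts regardless of user or timing.
AzureActivity
| where OperationNameValue =~ "Microsoft.Compute/virtualMachines/write"
| where ActivityStatusValue == "Success"
```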
You have an Azure subscription that has Azure Defender enabled for all supported
resource types.
You create an Azure logic app named LA1.
You plan to use LA1 to automatically remediate security risks detected in Defender for
Cloud.
You need to test LA1 in Defender for Cloud.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.


The goal is to test a Logic App (LA1) designed for automatic remediation from within Defender for Cloud. The key is understanding the workflow automation feature and how it integrates with Logic Apps.
Analysis of the Options:
1. Set the LA1 trigger to:
This refers to the trigger you set up inside the Logic App itself in the Logic App Designer.
When a Defender for Cloud Recommendation is created or triggered: This is for automating responses to security posture recommendations (CSPM). While valid for some automations, it's not the primary method for testing a remediation tied to an active threat.
✅ When a response to a Defender for Cloud alert is triggered:
This is the correct and specific trigger for this scenario. Defender for Cloud's Workflow Automation feature uses a webhook to call the Logic App. The "When a response to a Defender for Cloud alert is triggered" trigger in Logic Apps is a pre-built connector that listens for this exact webhook call. It is designed to receive the alert context and execute the remediation steps.
2. Trigger the execution of LA1 from:
This refers to where you go within the Defender for Cloud portal to manually initiate the test.
Regulatory compliance standards / Recommendations: These sections are for managing your cloud security posture (CSPM), not for actively triggering a response to a security alert.
✅ Security alerts:
This is the correct location. Defender for Cloud provides a "Trigger logic app" button directly on the security alert details page. This allows a security analyst to manually run the associated Logic App (like LA1) to test the remediation workflow against a real or simulated alert without waiting for it to run automatically.
Summary of the Testing Process:
Build the Logic App (LA1): In the Logic App designer, you set the trigger to "When a response to a Defender for Cloud alert is triggered".
Create a Workflow Automation:
In Defender for Cloud, you create a Workflow Automation resource that is linked to LA1 and set to trigger on specific alert types.
Test the Integration:
You navigate to the Security alerts page in Defender for Cloud, select a relevant alert, and use the "Trigger logic app" button to manually execute LA1 and verify it works as expected.
Reference:
Microsoft Learn: Automate responses to Microsoft Defender for Cloud triggers
This documentation explains the process, including the specific Logic App trigger and how to test it from the security alerts page.
Microsoft Security Operations Analyst Practice Exam Questions
My SC-200 Success Story: Conquering the Microsoft Security Operations Analyst Exam on the First Try
The Preparation Challenge
As a security professional aiming to validate my skills, the SC-200 Microsoft Security Operations Analyst exam seemed daunting. The broad syllabus, covering everything from threat mitigation to Microsoft 365 Defender and Microsoft Sentinel, required a strategic study plan. I knew theoretical knowledge alone would not suffice.
Discovering the Key Resource
My research led me to MSmcqs.com, which became the cornerstone of my preparation. Their comprehensive Microsoft Security Operations Analyst practice test perfectly mirrored the exam's style and difficulty. Each SC-200 question was a learning opportunity, complete with detailed explanations that clarified complex concepts.
Crucial Exam Insights
The exam rigorously tests your ability to:
Investigate Threats: Using Azure Sentinel, Microsoft Defender, and Microsoft 365 Defender.
Mitigate Attacks: Implementing incident response and remediation actions.
Configure Security Tools: Managing data connectors, analytics rules, and automation in Sentinel.
Practicing with MSmcqs.com transformed my understanding. I did not just memorize answers; I learned to analyze scenarios, identify the correct security tools, and understand the "why" behind each step in the security operations process.
The Triumphant Result
On exam day, I felt confident and prepared. The practice had ingrained the required workflows and product-specific knowledge. I passed on my first attempt! The realistic practice was undeniably the main reason for my success. It bridged the gap between theory and practical application, turning a challenging goal into an achievable milestone. I highly recommend it to any aspiring Security Operations Analyst.