Free Microsoft PL-500 Practice Test Questions MCQs

Stop wondering if you're ready. Our Microsoft PL-500 practice test is designed to identify your exact knowledge gaps. Validate your skills with Microsoft Power Automate RPA Developer questions that mirror the real exam's format and difficulty. Build a personalized study plan based on your performance on these free PL-500 exam questions (MCQs), focusing your effort where it matters most.

Targeted practice like this helps candidates feel significantly more prepared for Microsoft Power Automate RPA Developer exam day.

2540+ already prepared
Updated On : 3-Mar-2026
54 Questions
Microsoft Power Automate RPA Developer
4.9/5.0

Page 1 out of 6 Pages

Topic 1, Contoso Pharmaceuticals

Background
Contoso Pharmaceuticals distributes specialty pharmaceuticals, ingredients, and raw materials throughout North America. The company has 33 offices and 12 warehouses across the US, Mexico, and Canada. As their customers' needs grow in sophistication, Contoso wants to delight customers with breakthrough products, exceptional service, and on-time delivery of materials. They want to automate time-consuming and manual processes that are prone to error. Contoso wants to consolidate and automate ordering and fulfillment processes.

• The company has a fleet of 500 delivery trucks. The company has 150 drivers and uses third-party contractors to deliver goods.

• The company has 400 warehouse workers and 30 finance clerks.

• Contoso has 85 sales representatives and 50 customer service representatives. Sales representatives spend most of their time on the road visiting customers or prospects.

• The IT department consists of four system administrators and six system analysts.

Current environment

Overview
Contoso Pharmaceuticals has a custom enterprise resource planning (ERP) system. It is difficult to integrate other applications and services with the system. Office staff manually key in purchase orders, customer orders, and invoices after they receive a scan or hard copy of an agreement.

Applications

• The company uses a custom supplier management system named SMSApp that runs on each user's workstation. The system is costly to run and maintain. SMSApp does not have an API.

• Sales representatives manage customer requests by using Dynamics 365 Sales.

• Contoso has Microsoft Power Platform development, user acceptance testing (UAT), and production environments.

• Administrators create one Accounts Payable (AP) mailbox for each environment to support testing.

• The use of a DLP policy and Desktop Flow development is specified as part of the automation requirements.

Business process
1. Sales representatives create quotes by using a Microsoft Word document template. The template allows representatives to include the product quantity and cost estimation details needed to fulfill an order.

The representative converts quotes to a PDF file and emails the file to the customer for approval.

2. The sales representative alerts the finance team about the new order and emails the finance team a copy of the quote for processing.

3. The finance team prints the quote and manually enters a purchase order (PO) into SMSApp to request materials from a known and trusted vendor.

4. The SMSApp distributes the PO to stakeholders. The system sends a copy to a shared finance team mailbox.

5. Once a PO is fulfilled by a vendor, the system sends an email to the finance mailbox. The finance team releases an order to the warehouse.

6. Materials are shipped from the vendor to one of Contoso's warehouses. Warehouse workers enter key information from the waybill into SMSApp. The materials are unloaded and racked in the warehouse until they are shipped to customers.

7. When warehouse workers check SMSApp for new daily orders, they see an open order that is pending, awaiting the newly received materials.

8. A warehouse worker loads an order onto a truck for delivery and marks the order as complete in SMSApp.

9. Sales representatives provide fulfillment status and tracking information for orders.

10. A finance clerk prepares an invoice and sends the invoice to the customer by email. The clerk sends a copy of the email to the shared AP mailbox.

11. The AP team monitors the shared mailbox to confirm that the customer has paid the invoice.

Requirements
Functional requirements

• Large volume orders must be processed before other orders.

• Invoices must be cross-checked against the packing slip to verify received items for each shipment.

• The finance team must be able to analyze patterns in transactional data to conduct fraud prevention activities.

• You must automate the process of entering data about incoming orders into SMSApp.

• The solution must follow the principle of least privilege.

Purchase Order Quantity flow

• You must create an unmanaged solution to update purchase order details in SMSApp. The flow must use a manual trigger.

• Members of Accounts Payable team will be testers for the solution. They must be able to access the Purchase Order Quantity flow.

Flow for processing invoice data

• You must create a flow to monitor the AP mailbox. When an invoice arrives as an attachment in the inbox, the flow must automatically process the invoice data by using a form processing model. The flow must cross-check the received items against the packing slip.

• You must use different Accounts Payable email addresses for the development, user acceptance testing (UAT), and production environments.

• You must use an environment variable to represent the Accounts Payable mailbox for the environment in use.

• You must be able to use the environment variable across multiple cloud flows, a custom connector, and a canvas app.
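
The per-environment lookup that an environment variable provides can be sketched generically in Python. This is only an illustration of the concept (one definition, a different value per environment); the environment names and mailbox addresses below are hypothetical, not from the scenario:

```python
# Hypothetical per-environment values, mimicking a Power Platform
# environment variable: a single definition whose value differs by
# environment, so flows never hard-code the AP mailbox address.
AP_MAILBOX = {
    "development": "ap-dev@contoso.example",
    "uat": "ap-uat@contoso.example",
    "production": "ap@contoso.example",
}

def resolve_ap_mailbox(environment: str) -> str:
    """Return the AP mailbox address for the active environment."""
    return AP_MAILBOX[environment]
```

Because every cloud flow, custom connector, and canvas app reads the same variable, changing the target mailbox requires updating one value, not every consumer.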

Technical requirements
• Users must only be allowed to connect to and access systems that are required for the employee to perform required job tasks.

• All automation flows must be either co-owned or shared between staff.

• All employees must be able to access the new environment to build personal productivity automations.

• You must distribute the workload for desktop flows to optimize productivity.

Monitor flows
• All data extracted from invoices must be stored in a custom Dataverse entity. Only employees who are members of the Finance role should be able to edit all invoice data, but they must be prevented from creating or deleting records.

Issues
Invoice data

All users report that they can see and modify invoice data.

New environment

• The IT department creates a new environment. A user creates a cloud flow named FlowA in the environment that triggers a desktop flow. A user reports that the cloud flow does not trigger the desktop flow to run.

• Microsoft Dataverse is not provisioned in the new environment. You attempt to create a Desktop flow in the default environment but receive a Dataverse error message and cannot proceed.

Data entry automation flow
An administrator runs a new desktop flow in the development environment to automate data entry into SMSApp. The flow automatically reverts to a suspended state.

Order fulfillment flow

You must automate the customer communication process by using an unattended desktop flow. The flow must check the fulfillment status of each active order in SMSApp. If an order is fulfilled, the flow must send the customer an email that includes tracking information for their order.

You develop a desktop flow.

You need to debug the flow.

Which three tools can you use? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

A. Static results

B. Run from here

C. Breakpoints

D. Configure run after

E. Run next action

B.   Run from here
C.   Breakpoints
E.   Run next action

Explanation:
When developing and troubleshooting a desktop flow in Power Automate, specific debugging tools allow you to control the execution flow to inspect variables and logic. The question asks for tools used specifically to debug, meaning actively controlling or pausing the run. Options that control the flow step-by-step or from a specific point are valid debugging tools.

Correct Option:

B. Run from here:
This tool allows you to start the execution of the flow from a specific action, skipping all previous actions. It is highly useful for testing a specific segment of the flow without having to run the entire process from the beginning, saving significant development time.

C. Breakpoints:
By setting a breakpoint on an action, the flow execution will pause immediately before that action is run. This allows developers to inspect the current values of variables and the state of the desktop at that exact moment, which is fundamental to identifying logic errors.

E. Run next action:
This tool executes only the next single action in the flow and then pauses again. It is used in conjunction with breakpoints to step through the flow line-by-line, verifying the outcome of each individual action before proceeding to the next.

Incorrect Option:

A. Static results:
This feature is used to mock the output of an action by manually defining the result values. While this is useful for testing how a flow behaves with specific data inputs, it does not actually debug or step through the execution of the flow logic itself; it bypasses the action entirely.

D. Configure run after:
This setting is used to define the dependencies and conditions for an action to execute based on the success or failure of previous actions. It is a crucial part of error handling and flow logic design, but it is a configuration setting rather than an interactive tool used to debug a live run.

Reference:
Microsoft Learn: Debug desktop flows

You are creating a cloud flow that will use two Update Row actions to interact with Microsoft Dataverse. Neither of these actions are dependent on each other.

You must minimize the amount of processing time required to complete the flow.

You need to implement the actions in the cloud flow.

Solution: Create a switch condition.

Does the solution meet the goal?

A. Yes

B. No

B.   No

Explanation:
The goal is to minimize processing time for two independent Update Row actions in Dataverse. A switch condition is used for conditional branching based on the value of a variable or expression, allowing different paths of execution. However, since the actions are independent with no dependencies between them, the performance bottleneck would be waiting for one action to complete before the next begins.

Correct Option:

B. No:
The switch condition does not help minimize processing time for independent actions. To reduce processing time for non-dependent actions, you should use parallel branches or an "Apply to each" loop with concurrency control. These approaches allow multiple Dataverse operations to execute simultaneously, whereas a switch condition forces sequential execution through a single selected path.

Incorrect Option:

A. Yes:
This is incorrect because a switch condition executes only one branch of logic based on a condition. It would still run the Update Row actions sequentially if multiple are within the same branch. The switch condition adds conditional logic but does nothing to improve performance through parallel execution, which is what the scenario requires for minimizing processing time.

Reference:
Microsoft Learn: Configure parallel branches in cloud flows

You plan to use a cloud flow.

The flow must be contained within a solution.

You need to add the cloud flow to a solution.

Solution: Add an existing cloud flow from a managed solution to a new unmanaged solution.

Does the solution meet the goal?

A. Yes

B. No

A.   Yes

Explanation:
The goal is to have a cloud flow contained within a solution. In Power Platform, solutions are used for application lifecycle management (ALM) and can be either managed or unmanaged. Managed solutions are typically used for deployment to other environments, while unmanaged solutions are used for development. Adding a flow from a managed solution to an unmanaged solution effectively brings that flow into a development context.

Correct Option:

A. Yes:
When you add an existing cloud flow from a managed solution to a new unmanaged solution, you are successfully containing the flow within a solution. The flow becomes part of the unmanaged solution, allowing you to develop, customize, and later export it. This is a standard practice for migrating components from managed to unmanaged solutions for further development work.

Incorrect Option:

B. No:
This is incorrect because adding a flow from a managed solution to an unmanaged solution does contain the flow within a solution. While managed solutions are typically deployed and unmanaged solutions are for development, the flow is still contained within a solution structure. The action meets the basic requirement of having the flow in a solution container.

Reference:
Microsoft Learn: Add solution components

You have a flow that interacts with different SharePoint sites. You add the flow to a solution.

You redeploy the solution to production each time you make a change to the flow. You do not want to change the SharePoint site URL every time you redeploy the solution.

You need to configure the solution. Which solution component should you use?

A. Web resource

B. Managed identity

C. Connection reference

C.   Connection reference

Explanation:
When redeploying solutions across environments, connection references provide abstraction for connector connections. They separate connection details like SharePoint site URLs from the flow logic, allowing flows to use different connections in different environments without modification. This is essential for ALM practices where development, test, and production environments have different SharePoint site URLs.

Correct Option:
C. Connection reference:
A connection reference stores information about a connector connection that can be reused across multiple flows and solutions. When you export and import a solution containing a flow with connection references, you can map to different connections in the target environment. This allows your SharePoint flows to automatically use the correct site URL in production without manual updates, solving the redeployment problem.

Incorrect Option:

A. Web resource:
Web resources are files used in model-driven apps, such as HTML, JavaScript, or CSS files. They are not designed to store connection information for SharePoint sites. Web resources are client-side components for UI customization and cannot abstract connector connection details across environment redeployments.

B. Managed identity:
Managed identity is an Azure Active Directory feature that provides an automatically managed identity for applications to connect to resources. While useful for authentication scenarios, it does not abstract SharePoint site URLs or connection details. Connection references specifically handle the connection abstraction needed for cross-environment redeployment.

Reference:
Microsoft Learn: Connection references

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are creating a cloud flow that will use two Update Row actions to interact with Microsoft Dataverse. Neither of these actions are dependent on each other.

You must minimize the amount of processing time required to complete the flow.

You need to implement the actions in the cloud flow.

Solution: Create a parallel branch for the two Update Row actions.

Does the solution meet the goal?

A. Yes

B. No

A.   Yes

Explanation:
When two actions in a flow are not dependent on each other (i.e., they can run independently), running them sequentially means the total execution time is the sum of each action's duration. By placing them in parallel branches, they can execute at the same time, reducing the overall flow processing time significantly. This is a standard performance optimization in Power Automate for independent operations.

Correct Option:

A. Yes.
Creating a parallel branch (using a Parallel or "Run in parallel" block) allows the two Update Row actions to be initiated simultaneously. The flow's processing time will then be roughly equal to the duration of the longer of the two actions, rather than the sum of both, thereby minimizing the total time required.
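
The time saving can be sketched generically in Python. The snippet simulates the two independent Dataverse calls with sleeps (the table names and delay are illustrative, not part of the scenario) and compares sequential execution with parallel execution:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def update_row(table: str) -> str:
    """Simulate an independent 'Update a row' call taking ~0.2 s."""
    time.sleep(0.2)
    return f"{table} updated"

# Sequential: total time is roughly the sum of both calls (~0.4 s).
start = time.perf_counter()
update_row("accounts")
update_row("contacts")
sequential = time.perf_counter() - start

# Parallel: total time is roughly the longer of the two calls (~0.2 s).
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(update_row, ["accounts", "contacts"]))
parallel = time.perf_counter() - start

assert parallel < sequential  # parallel branches reduce total runtime
```

The same arithmetic applies in a cloud flow: two parallel branches finish in max(t1, t2) rather than t1 + t2.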

Reference:
Microsoft Learn documentation on flow performance and parallel execution. Using parallel branches for independent actions is a recommended practice to reduce total runtime and improve efficiency, as it leverages concurrent processing.

You are creating a custom connector to support invoice automation. You connect a Power Automate flow to the custom connector and successfully authenticate.

When you test the flow, you observe that several actions are missing from the custom connector.

You need to update the custom connector settings.

What should you do?

A. Set the action visibility option to None.

B. Change the connection name.

C. Change the parameter drop-down type to Static.

D. Add an action description value.

E. Set the action visibility option to Internal.

A.   Set the action visibility option to None.

Explanation:
When actions from a custom connector are not appearing in the Power Automate designer for a connected flow, it's often due to the visibility setting of those actions. In the custom connector definition, each action has a "Visibility" property (e.g., Important, Advanced, None, Internal). If set to "Internal", the action is hidden from the flow designer UI. To make the missing actions available, you need to change this visibility to a value like "None" (default) or "Important".

Correct Option:

A. Set the action visibility option to None.
Changing the visibility from Internal (or another restrictive setting) to None makes the action visible in the connector's action list within the Power Automate flow designer, allowing users to select and use it.

Incorrect Option:

B. Change the connection name.
The connection name is an alias for the authenticated instance and does not affect the availability of actions from the underlying connector definition.

C. Change the parameter drop-down type to Static.
This setting affects how a parameter's value list is presented (static vs. dynamic) but does not control whether the entire action is visible or hidden.

D. Add an action description value.
While a description is helpful for documentation, its absence does not hide the action from the designer.

E. Set the action visibility option to Internal.
This would do the opposite—it would hide the action, making the problem worse. "Internal" visibility is used for actions that should only be callable from within other actions of the same connector, not exposed directly in the UI.

Reference:
Microsoft Learn, "Custom connector visibility" explains that the Visibility property of an operation determines if it appears in the flow designer. Internal hides it, while None, Important, or Advanced make it visible. To fix missing actions, ensure visibility is not set to Internal.

You have an automation solution that uses a desktop flow. The flow reads data from a file that is stored on UserA’s machine and writes data to an application. You import the solution to an environment that is connected to UserA’s machine.

UserB reports that the flow fails. An alert indicates that the path to the file does not exist. You confirm that the file is present on UserB's desktop.

You need to resolve the issue.

What should UserB do?

A. Delete and recreate the file.

B. Change access rights for the file to allow read operations for the PAD process

C. Change the location of the file to a specific path that is not dependent on the signed-in user.

D. Modify the action to retry if the process cannot find the file.

C.   Change the location of the file to a specific path that is not dependent on the signed-in user.

Explanation:
The desktop flow uses a hard-coded file path that likely includes a user-specific folder (e.g., C:\Users\UserA\Desktop\file.txt). When the same flow runs on UserB's machine, the path C:\Users\UserA\... does not exist because the username folder is different. The solution is to make the file path independent of the logged-in user by using a universal location accessible to all users, such as a shared network drive or a system folder like C:\ProcessData\.

Correct Option:

C. Change the location of the file to a specific path that is not dependent on the signed-in user.
Move the file to a common location (e.g., \\server\share\ or C:\AutomationFiles\) and update the desktop flow to reference this static, user-agnostic path. This ensures the flow can find the file regardless of which user is signed in.
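
The failure mode can be sketched generically in Python; the file name and folders below are hypothetical examples, not from the scenario:

```python
def profile_path(username: str) -> str:
    # User-dependent path: valid only for the user profile it was
    # recorded under. A flow built against UserA's path fails for UserB.
    return rf"C:\Users\{username}\Desktop\orders.csv"

# User-agnostic path: resolves identically for every signed-in user.
SHARED_PATH = r"C:\AutomationFiles\orders.csv"

# The path recorded under UserA does not exist in UserB's profile:
assert profile_path("UserA") != profile_path("UserB")
```

The same reasoning favors network shares (e.g. a UNC path) or environment-based folders over anything rooted in a specific user's profile.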

Incorrect Option:

A. Delete and recreate the file.
This does not address the path issue. The new file will still be created in a user-specific location if the flow's logic doesn't change.

B. Change access rights for the file to allow read operations for the PAD process.
The error is "path does not exist," not "access denied." Permissions are not the primary issue; the fundamental problem is the incorrect path.

D. Modify the action to retry if the process cannot find the file.
Retrying will not resolve a missing file path. The flow will repeatedly fail because the file is not at the location the flow is looking for.

Reference:
Best practices for desktop flows that access files emphasize using environment variables (like %Public%) or absolute network paths to avoid user profile dependencies. The issue is a classic "hard-coded path" problem in automation.

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are developing a solution for a medical practice. The solution must use an artificial intelligence (AI) model to evaluate medical X-ray images and detect broken bones.

You need to create the AI model for the solution.

Solution: Use AI Builder to create the model.

Does the solution meet the goal?

A. Yes

B. No

B.   No

Explanation:
AI Builder is a low-code AI service within the Power Platform. It offers prebuilt models (for invoices, receipts, business cards, and so on) and custom models for form processing, object detection, prediction, and text classification. None of these model types is designed to analyze medical X-ray images for anomalies such as fractures. Object detection can locate and count known, discrete objects in images, but it does not perform the diagnostic image evaluation this scenario requires.

Correct Option:

B. No.
AI Builder cannot be used to build a model that evaluates X-ray images and detects broken bones. This scenario calls for a specialized, custom computer-vision model trained on medical imagery, which would be built with a service such as Azure Machine Learning and then invoked from the Power Platform if needed.

Incorrect Option:

A. Yes.
Although AI Builder includes image-based capabilities such as object detection, those models identify and count predefined objects; they are not suited to detecting anomalies like fractures in medical scans. Choosing AI Builder here does not meet the goal.

Reference:
Microsoft Learn: AI Builder model types overview

You are developing an RPA solution that requires browser automation. You are testing the flow. You observe that the flow does not interact with web page elements in Microsoft Edge. You need to troubleshoot the issue. What should you do?

A. Enable error handling on the action to retry on failure.

B. Ensure the UI flows/Selenium extension is downloaded and enabled in Microsoft Edge.

C. Open Power Automate machine runtime and select Troubleshoot

D. Ensure the Power Automate for desktop browser extension is downloaded and enabled in Microsoft Edge.

D.   Ensure the Power Automate for desktop browser extension is downloaded and enabled in Microsoft Edge.

Explanation:
For desktop flows to interact with web pages in modern browsers like Microsoft Edge (Chromium), the Power Automate for desktop browser extension must be installed and enabled. This extension facilitates communication between the desktop flow engine and the browser, allowing actions like "Populate text field on web page" to work correctly. If the extension is missing or disabled, the flow will fail to identify and interact with web elements.

Correct Option:

D. Ensure the Power Automate for desktop browser extension is downloaded and enabled in Microsoft Edge.
This is the primary troubleshooting step for web automation failures in Edge. You must verify the extension is installed from the Microsoft Edge Add-ons store and is turned on for the specific site or globally.

Incorrect Option:

A. Enable error handling on the action to retry on failure.
While this might help with transient issues, it does not address the root cause if the browser extension is missing. The flow will still fail on each retry.

B. Ensure the UI flows/Selenium extension is downloaded and enabled in Microsoft Edge.
The correct extension is specifically named "Power Automate for desktop" (formerly "UI Flows"). While it may use Selenium under the hood, referring to a "Selenium extension" is incorrect in this context.

C. Open Power Automate machine runtime and select Troubleshoot.
The machine runtime client has a "Troubleshoot" option, but it generally checks the machine's connectivity to the Power Automate service and runtime health, not the presence or configuration of specific browser extensions required for web automation.

Reference:
Microsoft Learn, "Set up browser extension for desktop flows" explicitly states that the Power Automate for desktop browser extension is required for web automation in Microsoft Edge and Chrome. If web interactions fail, verifying this extension is the first troubleshooting step.

You must create new flows within a solution and import existing flows into the solution. You need to configure the solution.

Which three actions can you perform? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.

A. Create the flows within the solution to automatically create connection references

B. Select connections for connection references when you import solutions into an environment.

C. Add an existing connection reference into the solution in the same environment.

D. Add credential information to each connection reference.

E. Modify each trigger and action when you add a flow into the solution to use connection references instead of connections.

A.   Create the flows within the solution to automatically create connection references
B.   Select connections for connection references when you import solutions into an environment.
E.   Modify each trigger and action when you add a flow into the solution to use connection references instead of connections.

Explanation:
When working with solutions, connection references are used to abstract and manage the connections that flows use. This allows connections to be easily swapped between environments (e.g., from dev to prod). To configure a solution properly with flows, you need to handle how connection references are created, assigned during import, and utilized within the flow definitions.

Correct Option:

A. Create the flows within the solution to automatically create connection references. When you create a new flow directly inside a solution and add a connection-based action, the system can automatically generate a connection reference for that connector and associate it with the flow. This is the recommended practice for new development.

B. Select connections for connection references when you import solutions into an environment.
During the solution import process into a target environment, you are prompted to map each connection reference in the solution to an actual, existing connection in that target environment. This is a key step for deployment.

E. Modify each trigger and action when you add a flow into the solution to use connection references instead of connections.
When you add an existing standalone flow to a solution, the flow originally uses direct connections. You must edit the flow inside the solution and reconfigure each trigger and action to use the connection references that are part of the solution, replacing the old direct connections.

Incorrect Option:

C. Add an existing connection reference into the solution in the same environment.
You cannot directly "add" an existing connection reference object to a solution. Connection references are created either automatically when building flows in the solution or manually within the solution's objects.

D. Add credential information to each connection reference.
Connection references do not store credentials. They are pointers to connection objects, which hold the actual credentials. You manage credentials when creating the connections themselves in the target environment, not within the connection reference in the solution.

Reference:
Microsoft Learn, "Use connection references in solutions" explains that connection references are created when building flows in solutions, must be mapped to connections on import, and that existing flows added to solutions need their connections updated to use the references.


Microsoft Power Automate RPA Developer Practice Exam Questions