Free Microsoft PL-400 Practice Test Questions (MCQs)

Stop wondering if you're ready. Our Microsoft PL-400 practice test is designed to identify your exact knowledge gaps. Validate your skills with Microsoft Power Platform Developer questions that mirror the real exam's format and difficulty. Build a personalized study plan based on your performance on these free PL-400 exam MCQs, focusing your effort where it matters most.

Targeted practice like this helps candidates feel significantly more prepared for Microsoft Power Platform Developer exam day.

22020+ already prepared
Updated on: 3-Mar-2026
202 Questions
Microsoft Power Platform Developer
4.9/5.0

Page 1 out of 21 Pages

Topic 1: Bellows Sports


Case study
This is a case study. Case studies are not timed separately. You can use as much
exam time as you would like to complete each case. However, there may be additional
case studies and sections on this exam. You must manage your time to ensure that you
are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information
that is provided in the case study. Case studies might contain exhibits and other resources
that provide more information about the scenario that is described in the case study. Each
question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review
your answers and to make changes before you move to the next section of the exam. After
you begin a new section, you cannot return to this section.
To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the
left pane to explore the content of the case study before you answer the questions. Clicking
these buttons displays information such as business requirements, existing environment,
and problem statements. If the case study has an All Information tab, note that the
information displayed is identical to the information displayed on the subsequent tabs.
When you are ready to answer a question, click the Question button to return to the
question.
Background
Bellows Sports is the region’s newest, largest, and most complete sports complex. The
company features baseball and soccer fields and two full-size hockey rinks. The complex
provides coaching, recreational leagues, a pro shop, and state-of-the-art customer and
player amenities.
The company is organized into the following divisions:
Baseball
Hockey
Soccer
Bellows Sports runs tournaments several times per year. Each tournament runs for six weeks.
Current environment
Requirements
Bellows Sports tracks players and events in Microsoft Excel workbooks and uses email to
communicate with players, partners, and prospective customers. The company uses a
proprietary cloud-based accounting system.
The company relies on referrals from athletes for new business. Bellows uses a third-party
marketing company to gather feedback and referrals from athletes. The third-party
marketing company uploads a Microsoft Excel file containing lists of potential customers
and players to the FTP site that Bellows Sports maintains.
Tournaments
Customer information is stored in the Accounts entity. Each tournament record must list the
associated sales representative as the tournament owner. When team members create
tournament records they must enter the start date for a tournament. The end date of the
tournament must be automatically calculated.
Registration form
You must create a form to allow players to register for tournaments. The registration form
must meet the following requirements:


A customer wants to design a complex business process flow that includes six custom entities and four stages for each entity. One of the stages will have 15 steps. You need to explain the flaw in this design to the customer.

What is the flaw in this design?

A. The maximum number of custom entities has been exceeded.

B. The maximum number of steps for a stage has been exceeded.

C. The maximum number of stages for an entity has been exceeded.

D. The minimum number of stages for an entity has not been met.

E. The minimum number of steps for a stage has not been met.

A.   The maximum number of custom entities has been exceeded.

Explanation:
This question assesses your understanding of the technical boundaries within Power Automate Business Process Flows (BPFs). The scenario describes a design incorporating six custom entities, four stages per entity, and a stage with 15 steps. Your task is to identify which element breaches the predefined platform limits established by Microsoft. Recognizing these constraints is essential for designing viable solutions. The flaw is not related to the number of steps or stages per entity, but rather the total number of distinct entities a single BPF can utilize.

Correct Option:

A. The maximum number of custom entities has been exceeded.
A single business process flow can only span up to five different tables in the out-of-the-box designer. This represents a hard limit enforced by the platform architecture. The customer's design includes six custom entities, which directly violates this constraint. If the process requires more than five tables, it would need to be broken into multiple BPFs, or a different orchestration method such as a Power Automate cloud flow would need to be considered to manage the process across the sixth entity.

Incorrect Options:

B. The maximum number of steps for a stage has been exceeded.
This option is incorrect because a single stage within a Business Process Flow can contain up to 30 data steps. The customer's design includes a stage with 15 steps, which falls comfortably within this 30-step limit. Therefore, this does not represent a flaw in the design.

C. The maximum number of stages for an entity has been exceeded.
This option is incorrect because the design does not exceed the stage limit. A business process flow supports up to 30 stages in total; four stages per entity (24 stages across the six tables in this design) falls well within that limit and is a common practice in BPF implementations.

D. The minimum number of stages for an entity has not been met.
This option is incorrect because there is no minimum number of stages required for an entity within a Business Process Flow. An entity could theoretically have just one stage or even be part of a flow without any stages specific to it, though that would be illogical from a process perspective.

E. The minimum number of steps for a stage has not been met.
This option is incorrect as there is no minimum number of steps required for a stage within a Business Process Flow. A stage can be designed with zero data steps and simply represent a logical milestone in the process. The design's problem is not about having too few steps.

Reference:
Business process flows overview, Business process flows important information - Microsoft Learn

A training company implements a Common Data Service (CDS) environment. The company has created and stores information about courses in a custom entity.

A Power Automate flow must determine whether a newly created course starts within the next seven days. The check must be accurate to the minute.

You need to define an expression that meets the requirements.

Which functions should you use for the expression? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.




Explanation:
This question tests your ability to construct complex expressions in Power Automate for date comparisons. The requirement is to check if a course start date (from a custom entity) falls within the next seven days, with minute-level accuracy. This requires converting dates to a comparable numeric format (ticks), calculating the future threshold (getFutureTime), formatting the comparison value (formatDateTime), accessing the course date field (triggerBody), and performing the comparison (less).

Function Breakdown:

triggerBody()?['new_coursedate'] - Accesses the course date field from the trigger output

formatDateTime - Standardizes the course date into a consistent ISO format

ticks - Converts dates into a numeric tick value (100-nanosecond intervals) for comparison

getFutureTime(7, 'Day') - Calculates the date and time exactly 7 days from now

less - Compares if the course date ticks are less than (earlier than) the future threshold ticks

Why this works:
Ticks provide precise numeric values enabling minute-level accuracy

The expression evaluates to true if the course date is within the next 7 days

All functions work together to handle the date comparison requirement
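
To see what the expression computes, the same comparison can be modeled in plain JavaScript. This is a sketch of the logic, not flow code; the epoch constant converts Unix milliseconds to .NET-style ticks, which is the representation the `ticks()` function returns.

```javascript
// .NET-style ticks are 100-nanosecond intervals since 0001-01-01T00:00:00Z;
// the offset below is the tick count at the Unix epoch (1970-01-01T00:00:00Z).
var TICKS_PER_MS = 10000;
var EPOCH_OFFSET_TICKS = 621355968000000000;

function toTicks(date) {
  return date.getTime() * TICKS_PER_MS + EPOCH_OFFSET_TICKS;
}

// Equivalent of: less(ticks(<course date>), ticks(getFutureTime(7, 'Day')))
function startsWithinSevenDays(courseDate, now) {
  var future = new Date(now.getTime() + 7 * 24 * 60 * 60 * 1000);
  return toTicks(courseDate) < toTicks(future);
}

var now = new Date("2026-03-01T00:00:00Z");
console.log(startsWithinSevenDays(new Date("2026-03-05T10:30:00Z"), now)); // → true
console.log(startsWithinSevenDays(new Date("2026-03-20T00:00:00Z"), now)); // → false
```

Because ticks are plain numbers, the `less` comparison preserves minute-level (in fact sub-second) accuracy, which a date-only comparison would not.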

Reference:
Use expressions in conditions in Power Automate, Date and time functions in Power Automate - Microsoft Learn

An organization has a Dynamics 365 Sales environment. In the development environment, you create a business rule named BusinessRule1 on the Account entity. You deploy BusinessRule1 to production as part of a managed solution.

You need to remove BusinessRule1 from the production environment.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.




Explanation:
This question tests your understanding of managed solution lifecycle management in Power Platform. When a component is deployed as part of a managed solution, it becomes managed in the target environment and cannot be edited or deleted directly. To remove such a component, you must delete it from the original unmanaged solution in the development environment, then export and import an updated managed solution. This process performs an in-place upgrade that removes the deleted component from production.

Action Breakdown:
In the development environment, navigate to Solutions: You must first access the original unmanaged solution where BusinessRule1 was created.

Select the solution that has BusinessRule1, navigate to the appropriate entity, and delete the rule: Delete the business rule from the unmanaged solution in development.

Export the solution as managed and import it in the production environment: Create a new managed version of the solution and import it over the existing managed solution. This removes the deleted component from production.

Why other actions are incorrect:

Create a new managed solution in the production environment: You cannot create managed solutions directly in production; managed solutions are created by exporting from development.

In the production environment, add a new business rule: This does not remove the existing rule.

Select the solution that has BusinessRule1 and deactivate the rule: Deactivating the rule does not remove it permanently; it only disables its functionality.

Reference:
Delete components from a managed solution, Managed solution lifecycle management - Microsoft Learn

You create solutions in a development environment and export the solution for testing by various departments in your organization. Power users in each department control the testing environments.

You must display department-specific wording at the beginning of any custom notifications that are displayed in testing environments.

You need to package solutions to ensure that the power users can customize the notification content.

Which three actions should you perform in sequence inside a solution? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.




Explanation:
This question assesses your knowledge of solution packaging techniques that enable configuration without code changes. Environment variables are the standard mechanism for storing parameter-like values that can differ across environments. By creating an environment variable, setting a default value, and having your notification logic read from that variable at runtime, power users can simply update the variable value in their testing environment to customize the notification text without modifying the solution itself.

Action Breakdown:
Create an empty environment variable named Custom Text Placeholder: This establishes the configurable parameter within the solution.

Set the default value of the text field Custom Text Placeholder to Enter custom text: Providing a default ensures the notification has meaningful content even if the power user hasn't customized it yet.

Create a function to retrieve the value from the custom text placeholder and display the notification: This implements the runtime logic that reads the current environment variable value and uses it in the notification.
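
The retrieval step can be sketched as the value-resolution logic below. The table and column names follow the standard environmentvariabledefinition/environmentvariablevalue schema; the actual Web API query that fetches the two records is omitted, and the variable name is illustrative.

```javascript
// Sketch: resolve an environment variable's effective value.
// Prefer the per-environment override (environmentvariablevalue.value);
// fall back to the definition's default (environmentvariabledefinition.defaultvalue).
function resolveEnvironmentVariable(definition, valueRecord) {
  if (valueRecord && valueRecord.value) {
    return valueRecord.value; // power user's customization in this environment
  }
  return definition.defaultvalue; // default set when the solution was packaged
}

// No override yet: the default text is shown in the notification.
console.log(resolveEnvironmentVariable(
  { schemaname: "new_CustomTextPlaceholder", defaultvalue: "Enter custom text" },
  null
)); // → "Enter custom text"

// A power user has set a department-specific value in the testing environment.
console.log(resolveEnvironmentVariable(
  { schemaname: "new_CustomTextPlaceholder", defaultvalue: "Enter custom text" },
  { value: "Finance department notice:" }
)); // → "Finance department notice:"
```

This fallback behavior is exactly why setting a default value is part of the packaging sequence: the notification is never empty, even before a power user customizes it.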

Why other actions are incorrect:

Create a configuration page in the classic solution by using a text field that uses the HTML file format: This is outdated and unnecessary; environment variables provide a cleaner configuration experience.

Export the solution: Exporting happens after the solution is properly configured, not as a step within the solution creation process.

Create a solution component configuration named Custom Text Placeholder that uses the JSON file format: This is not a standard solution component type for this scenario; environment variables handle this need directly.

Reference:
Environment variables in solutions, Update environment variables - Microsoft Learn

An organization uses Common Data Service.

The organization’s IT helpdesk requires a single-page web application to monitor and manage Data Export Service. The app must access Data Export Service securely. The app must also permit helpdesk users to perform a limited set of functions.

You need to create a single-page app.

Which options should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.




Explanation:
This question tests your knowledge of the Data Export Service (DES) in Common Data Service (now Dataverse). DES replicates data to an Azure SQL Database. To build a secure monitoring app, you need proper authentication (Azure AD app registration), and you must understand the DES API endpoints and requirements. Change tracking is required for entities to participate in data replication, and Profile operations are used to monitor both status and sync failures.

Detailed Breakdown:

Connect to the app securely: Register the app in Azure Active Directory
To securely access Data Export Service from a single-page application, you must register the application in Azure AD

This enables OAuth 2.0 authentication and allows you to assign appropriate permissions

Security roles like Environment Maker or CDS user roles are for user-based access within the platform, not for application authentication

Monitor the status of data replication: Use Profile operations
Data Export Service exposes REST API endpoints organized around profile operations

Profile operations allow you to check the status, health, and details of your export profiles

These operations provide information about whether replication is running, paused, or encountering issues

Enable an entity for replication: Enable Change Tracking
Change Tracking must be enabled on an entity for Data Export Service to replicate it

This feature allows Dataverse to track which records have been created, updated, or deleted since the last export

Without Change Tracking, DES cannot determine which changes to replicate to Azure SQL

Start or stop data replication: /crm/exporter/profiles/{id}/activatedata
This specific REST API endpoint activates or deactivates a data export profile

When called with the appropriate profile ID, it starts or stops the replication process

The other endpoints are for metadata retrieval, validation, or testing, not for controlling replication state

View information on records that fail to sync: Use Profile operations
Profile operations include endpoints that return error details and failed record information

DES maintains logs of records that could not be replicated due to data issues or conflicts

These operations allow you to programmatically retrieve and display sync failure information
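
As a sketch, the activate call could be assembled as follows. The host name and token acquisition are placeholders for illustration; only the /crm/exporter/profiles/{id}/activatedata path comes from the scenario above.

```javascript
// Sketch: build the Data Export Service request that starts or stops
// replication for a profile. The bearer token would come from the Azure AD
// app registration described above.
function buildActivateRequest(baseUrl, profileId, accessToken) {
  return {
    method: "POST",
    url: baseUrl + "/crm/exporter/profiles/" + profileId + "/activatedata",
    headers: { Authorization: "Bearer " + accessToken }
  };
}

var req = buildActivateRequest(
  "https://contoso-des.example.net", // placeholder DES service URL
  "00000000-0000-0000-0000-000000000001",
  "<Azure AD access token>"
);
console.log(req.url);
// → "https://contoso-des.example.net/crm/exporter/profiles/00000000-0000-0000-0000-000000000001/activatedata"
```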

Reference:
Data Export Service overview, Configure Data Export Service, Data Export Service REST API - Microsoft Learn

You are creating technical designs for several complex business processes.

You need to implement custom business logic based on the requirements.

Which implementation methods should you use? To answer, drag the appropriate implementation methods to the correct requirements. Each implementation method may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.




Explanation:
This question tests your knowledge of the appropriate tools for implementing different types of business logic in Dataverse. The requirement to access both current and new values during an update operation is a key scenario that distinguishes plug-ins from other implementation methods. Plug-ins are event handlers that execute in the Dataverse server context and have full access to the pre-operation and post-operation images containing the old and new values.

Why Plug-in is correct:
Plug-ins execute synchronously or asynchronously during Dataverse data operations (Create, Update, Delete)

They provide access to pre-images (current values before update) and post-images (new values after update) through the plugin execution context

This capability is essential for scenarios requiring validation, auditing, or complex calculations based on value changes

Plug-ins run on the server side, ensuring reliability and consistency regardless of how the update was initiated

Why other methods are incorrect:
Business rule: Business rules run on the client side and cannot access both current and new values during server-side update operations. They are limited to simple form-level logic.

JavaScript code: Client-side JavaScript only has access to values currently on the form. It cannot reliably access both the previous saved values and the new values during a server update operation, especially if the update comes from other sources.

Power Automate flow: While flows can access some context about updates, they have limitations in accessing pre-images (old values) compared to plug-ins. Flows are better suited for processes that don't require tight coupling with the transaction.

Reference:
Write a plug-in, Event execution pipeline, Pre-operation and post-operation images - Microsoft Learn

You are implementing custom business logic in a Power Apps portal.

You need to use Liquid templates to display dynamic content.

In which three entities can you include Liquid code? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

A. Content snippet

B. Web page

C. Web template

D. Page template

E. Portal settings

B.   Web page
C.   Web template
D.   Page template

Explanation:
This question evaluates your understanding of where Liquid code can be embedded within a Power Apps portal. Liquid is an open-source template language used in portals to render dynamic content from Dataverse. While Liquid can be used in various places, certain entities are specifically designed to contain and execute Liquid code. Understanding these locations is essential for implementing dynamic content strategies in portal development.

Correct Options:

B. Web page:
Web pages in Power Apps portals directly support Liquid code within their copy content. When you edit a web page, you can embed Liquid tags and objects in the content area, and the portal will render them dynamically when the page is viewed.

C. Web template:
Web templates are specifically designed to contain the source code for portal templates, including Liquid, HTML, CSS, and JavaScript. They are the primary location for creating reusable Liquid-based components and layouts.

D. Page template:
Page templates define how web pages are rendered and can contain Liquid code. When a web page uses a page template, the Liquid code in that template executes to generate the page output.

Incorrect Options:

A. Content snippet:
Content snippets are designed for storing small pieces of reusable content like text, HTML, or links. While they can contain HTML that might include Liquid, they are not entities where you primarily write and execute Liquid code. Content snippets are typically referenced by Liquid code rather than containing it.

E. Portal settings:
Portal settings store configuration key-value pairs for portal behavior and appearance. They are meant for simple settings like site name, theme colors, or feature flags, not for containing executable Liquid code. Liquid code can read portal settings but should not be written within them.
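
For instance, a web template might combine Liquid objects and tags as in this minimal illustrative sketch, which uses the built-in `user` object (the sign-in path is an assumption):

```liquid
{% comment %} Minimal web template sketch: greet the signed-in contact {% endcomment %}
{% if user %}
  <p>Welcome back, {{ user.fullname }}!</p>
{% else %}
  <p>Please <a href="/SignIn">sign in</a> to register for a tournament.</p>
{% endif %}
```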

Reference:
Work with Liquid templates in Power Apps portals, Understand Liquid operators and types, Web templates in Power Apps portals - Microsoft Learn

A JavaScript function on a Contact form alerts users to what they need to type, as shown in the JavaScript Code exhibit. (Click the JavaScript Code tab.)




Handler Properties Analysis:
Library: cyb_/MsgOnField.js

Function: JsNameSpace.tools.showMsgOnField

Parameters: "telephone1", "mobilephone" (passed to the function)

Pass execution context as first parameter: Yes (checked)

What we know:
The OnChange event of the Business Phone field triggers this function

Two string parameters are passed: "telephone1" and "mobilephone"

The execution context is also passed as the first parameter

The function likely displays messages in notification areas

The exhibit's function body is not reproduced on this page, so the correct statements depend on the implementation. Two client API behaviors determine where a message appears: formContext.getControl(fieldName).setNotification(message, uniqueId) places the message in that field's notification area, while formContext.ui.setFormNotification(message, level, uniqueId) places it in the form-level notification area.

Under the most common implementation for this scenario, the function displays the first parameter ("telephone1") in the notification area of the field that triggered the OnChange event:

The message "telephone1" shows in the Business Phone notification area → Yes

The message "mobilephone" shows in the Business Phone notification area → No

The message "telephone1" shows in the form notification area → No

The message "mobilephone" shows in the form notification area → No

If the function instead displays the parameters in different locations, the answers change accordingly; on the live exam, read the exhibit code to confirm which method is called and with which arguments.
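
A hypothetical implementation consistent with the handler properties listed above is sketched below. The decision to show the first parameter on the event's source field is an assumption, not the exam exhibit's actual code, and a stub of the client API is included so the sketch can run outside a form.

```javascript
// Hypothetical sketch of the handler; names and behavior are assumptions.
var JsNameSpace = JsNameSpace || {};
JsNameSpace.tools = {
  // "Pass execution context as first parameter" is checked on the handler,
  // so the context arrives before the two string parameters.
  showMsgOnField: function (executionContext, msg1, msg2) {
    var formContext = executionContext.getFormContext();
    // The attribute that raised OnChange (Business Phone = "telephone1").
    var sourceField = executionContext.getEventSource().getName();
    // setNotification targets a single field's notification area;
    // setFormNotification would target the form-level area instead.
    formContext.getControl(sourceField).setNotification(msg1, "msg_" + sourceField);
  }
};

// Minimal stub of the client API so the sketch can run outside a form.
var notifications = {};
var stubContext = {
  getEventSource: function () {
    return { getName: function () { return "telephone1"; } };
  },
  getFormContext: function () {
    return {
      getControl: function (name) {
        return {
          setNotification: function (msg, id) { notifications[name] = msg; }
        };
      }
    };
  }
};

JsNameSpace.tools.showMsgOnField(stubContext, "telephone1", "mobilephone");
console.log(notifications.telephone1); // → "telephone1"
```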

Note: This question is part of a series of questions that present the same scenario.
Each question in the series contains a unique solution that might meet the stated
goals. Some question sets might have more than one correct solution, while others
might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a
result, these questions will not appear in the review screen.
You are developing a model-driven app for a company.
When you create a new Account record, you must automatically display a form to collect
data that is needed to create a Contact record. The form must switch to the appropriate
form layout based on the contact type.
You open the Contact form by using JavaScript. You pass the contact type information to
the form by using the Xrm.Navigation.openForm function. An OnLoad event handler in the
Contact form processes the data and shows only the appropriate sections of the form for
the given contact type.
You need to configure the receiving form to accept the data parameter.
Solution: In the form editor, add a query string parameter for the data parameter.
Does the solution meet the goal?

A. Yes

B. No

A.   Yes

Explanation:
This question tests your understanding of passing parameters between forms in model-driven apps using JavaScript. The scenario requires passing contact type information from the Account form to the Contact form, and the Contact form needs to be configured to receive and process this parameter. The solution involves adding a query string parameter in the form editor to accept the data parameter.

Correct Option:

A. Yes
The solution meets the goal because when using Xrm.Navigation.openForm to open a form with custom parameters, the receiving form must be configured to accept those parameters through query string parameters in the form properties. This configuration tells the system that the form is designed to receive external data. Once configured, the OnLoad event handler can access these parameters using getQueryStringParameters() and use the contact type information to show the appropriate sections.

Why the solution works:
Query string parameters in form properties define which external parameters the form can accept

This configuration enables the form to receive the contact type data passed from the Account form

The OnLoad event handler can then access these parameters and conditionally show sections

This is the standard approach for passing custom data between forms in model-driven apps

Reference:
Open form client API reference, Form properties - Parameters tab - Microsoft Learn

You are researching integrations with several external systems.
Each integration has different requirements.
You need to determine which data sources to use to meet each requirement.
What should you use? To answer, drag the appropriate data sources to the correct
requirements. Each data source may be used once, more than once, or not at all. You may
need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.




Explanation:
This question tests your understanding of integration options in Dataverse (formerly Common Data Service). You need to select the appropriate data source based on specific integration requirements. The two options are Virtual Entity and Custom Connector, which serve different purposes. Virtual entities allow you to display external data within Dataverse without replication, while custom connectors enable Power Automate and Power Apps to interact with external APIs.

Correct Selection:
Virtual entity

Explanation of Correct Option:
A virtual entity meets all three requirements listed. Virtual entities can map external data sources where the primary key is an integer, as they support various data types including integers for the unique identifier. They allow both read and update operations when properly configured with a data provider that supports these operations. Most importantly, virtual entities make external data available to all Dataverse clients including model-driven apps, canvas apps, and Power Automate flows, exactly like standard entities, without requiring data replication.

Why Virtual Entity satisfies each requirement:
Support records that use an integer as a primary key: Virtual entities can map to external tables where the primary key is an integer type, unlike standard entities which typically use GUIDs

Ensure that data can be read and updated: Virtual entities support full CRUD operations through a custom data provider that translates Dataverse operations to external API calls

Ensure that data is available to all Common Data Service clients: Once configured, virtual entities appear and behave like native entities, accessible through all Dataverse interfaces and APIs

Why Custom Connector is incorrect:
Custom connectors are used in Power Automate and Power Apps to connect to external APIs, but they do not create data sources that appear within Dataverse itself. They cannot make external data available to all Dataverse clients as native entities, nor do they support seamless integration with model-driven app forms and views without additional development.

Reference:
Create and edit virtual entities, Virtual entity data providers, Custom connectors overview - Microsoft Learn


Microsoft Power Platform Developer Practice Exam Questions