Topic 6: Misc. Questions

Note: This question is part of a series of questions that present the same scenario.
Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Synapse Analytics dedicated SQL pool that contains a table named Table1.
You have files that are ingested and loaded into an Azure Data Lake Storage Gen2 container named container1.
You plan to insert data from the files into Table1 and transform the data. Each row of data in the files will produce one row in the serving layer of Table1.
You need to ensure that when the source data files are loaded to container1, the DateTime is stored as an additional column in Table1.
Solution: In an Azure Synapse Analytics pipeline, you use a Get Metadata activity that retrieves the DateTime of the files.
Does this meet the goal?

A. Yes

B. No

B.   No

You create five Azure SQL Database instances on the same logical server.
In each database, you create a user for an Azure Active Directory (Azure AD) user named User1.
User1 attempts to connect to the logical server by using Azure Data Studio and receives a login error.
You need to ensure that when User1 connects to the logical server by using Azure Data Studio, User1 can see all the databases.
What should you do?

A. Create User1 in the master database.

B. Assign User1 the db_datareader role for the master database.

C. Assign User1 the db_datareader role for the databases that User1 creates.

D. Grant select on sys.databases to public in the master database.

A.   Create User1 in the master database.
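A minimal sketch of the fix, run while connected to the master database on the logical server (the contoso.com sign-in name is illustrative, not from the question):

```sql
-- Run in the master database of the logical server.
-- Creates a contained user mapped to the Azure AD identity, which lets
-- User1 authenticate to the server's default (master) context and
-- enumerate the databases when connecting from Azure Data Studio.
CREATE USER [User1@contoso.com] FROM EXTERNAL PROVIDER;
```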

You have an Azure SQL database named DB1. You need to display the estimated execution plan of a query by using the query editor in the Azure portal. What should you do first?

A. Run the set showplan_all Transact-SQL statement.

B. For DB1, set QUERY_CAPTURE_MODE of Query Store to All.

C. Run the set forceplan Transact-SQL statement.

D. Enable Query Store for DB1.

A.   Run the set showplan_all Transact-SQL statement.
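As a sketch, the statement is enabled before the query whose estimated plan is wanted; while it is on, queries are not executed and their estimated plan rows are returned instead (GO batch separators shown here as they would be used in a tool that supports them, since SET SHOWPLAN_ALL must be the only statement in its batch):

```sql
SET SHOWPLAN_ALL ON;
GO
-- Not executed: estimated plan rows are returned for this query.
SELECT name, object_id
FROM sys.objects;
GO
SET SHOWPLAN_ALL OFF;
GO
```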

You plan to build a structured streaming solution in Azure Databricks. The solution will count new events in five-minute intervals and report only events that arrive during the interval. The output will be sent to a Delta Lake table. Which output mode should you use?

A. complete

B. append

C. update

B.   append

You have the following Azure Data Factory pipelines:
Ingest Data from System1
Ingest Data from System2
Populate Dimensions
Populate Facts
Ingest Data from System1 and Ingest Data from System2 have no dependencies. Populate Dimensions must execute after Ingest Data from System1 and Ingest Data from System2.
Populate Facts must execute after the Populate Dimensions pipeline. All the pipelines must execute every eight hours.
What should you do to schedule the pipelines for execution?

A. Add a schedule trigger to all four pipelines.

B. Add an event trigger to all four pipelines.

C. Create a parent pipeline that contains the four pipelines and use an event trigger.

D. Create a parent pipeline that contains the four pipelines and use a schedule trigger.

D.   Create a parent pipeline that contains the four pipelines and use a schedule trigger.

You are creating a new notebook in Azure Databricks that will support R as the primary language but will also support Scala and SQL.
Which switch should you use to switch between languages?

A. \\ [ < language > ]

B. %< language >

C. \\[< language >]

D. @ < language >

B.   %< language >
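The magic command goes on the first line of a cell. A sketch of what a SQL cell in an R-default notebook might look like (the query itself is illustrative):

```
%sql
-- This cell runs as SQL even though the notebook's default language is R.
SELECT current_date() AS today;
```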

You have an on-premises multi-tier application named App1 that includes a web tier, an application tier, and a Microsoft SQL Server tier. All the tiers run on Hyper-V virtual machines. Your new disaster recovery plan requires that all business-critical applications can be recovered to Azure. You need to recommend a solution to fail over the database tier of App1 to Azure. The solution must provide the ability to test failover to Azure without affecting the current environment.
What should you include in the recommendation?

A. Azure Backup

B. Azure Information Protection

C. Windows Server Failover Cluster

D. Azure Site Recovery

D.   Azure Site Recovery

You are developing an application that uses Azure Data Lake Storage Gen 2.
You need to recommend a solution to grant permissions to a specific application for a limited time period. What should you include in the recommendation?

A. role assignments

B. account keys

C. shared access signatures (SAS)

D. Azure Active Directory (Azure AD) identities

C.   shared access signatures (SAS)

Explanation:

A shared access signature (SAS) provides secure delegated access to resources in your storage account. With a SAS, you have granular control over how a client can access your data. For example:
What resources the client may access.
What permissions the client has to those resources.
How long the SAS is valid.

You have an Azure Synapse Analytics dedicated SQL pool named Pool1 and a database named DB1. DB1 contains a fact table named Table1.
You need to identify the extent of the data skew in Table1.
What should you do in Synapse Studio?

A. Connect to Pool1 and query sys.dm_pdw_nodes_db_partition_stats.

B. Connect to the built-in pool and run DBCC CHECKALLOC.

C. Connect to Pool1 and run DBCC CHECKALLOC.

D. Connect to the built-in pool and query sys.dm_pdw_nodes_db_partition_stats.

A.   Connect to Pool1 and query sys.dm_pdw_nodes_db_partition_stats.
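A simplified sketch of a skew check with this DMV: a large spread in row counts across the 60 distributions indicates skew. (Filtering to a single table requires joining node-level object metadata, which is omitted here; DBCC PDW_SHOWSPACEUSED('dbo.Table1') is a documented alternative that reports per-distribution rows for one table.)

```sql
-- Total rows per distribution; heavily uneven totals suggest data skew.
SELECT distribution_id,
       SUM(row_count) AS total_rows
FROM sys.dm_pdw_nodes_db_partition_stats
GROUP BY distribution_id
ORDER BY total_rows DESC;
```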

You have an Azure SQL Database managed instance.
The instance starts experiencing performance issues.
You need to identify which query is causing the issue and retrieve the execution plan for the query. The solution must minimize administrative effort.
What should you use?

A. the Azure portal

B. Extended Events

C. Query Store

D. dynamic management views

C.   Query Store
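Query Store already captures queries, their plans, and runtime statistics, so identifying the offender and its plan needs only a catalog query. A sketch pulling the top queries by average duration together with their captured plans:

```sql
-- Top 5 queries by average duration, with the stored execution plan.
SELECT TOP (5)
       qt.query_sql_text,
       CAST(p.query_plan AS xml) AS query_plan,
       rs.avg_duration
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
     ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
     ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs
     ON p.plan_id = rs.plan_id
ORDER BY rs.avg_duration DESC;
```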

DP-300 Practice Test
