Welcome to TestSimulate


Microsoft Administering Relational Databases on Microsoft Azure (DP-300) Free Practice Test

Question 1
You have an Azure subscription that contains an Azure SQL database. The database contains a table named table1.
You execute the following Transact-SQL statements.

You need to reduce the time it takes to perform analytic queries on the database.
Which configuration should you enable?

Correct Answer: A
Explanation: Only visible for TestSimulate members. You can sign-up / login (it's free).
Question 2
Drag and Drop Question
Your company analyzes images from security cameras and sends alerts to security teams that respond to unusual activity. The solution uses Azure Databricks.
You need to send Apache Spark level events, Spark Structured Streaming metrics, and application metrics to Azure Monitor.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions in the answer area and arrange them in the correct order.
Correct Answer:

Explanation:
Send application metrics using Dropwizard.
Spark uses a configurable metrics system based on the Dropwizard Metrics Library.
To send application metrics from Azure Databricks application code to Azure Monitor, follow these steps:
Step 1: Configure your Azure Databricks cluster to use the monitoring library. This is a prerequisite.
Step 2: Build the spark-listeners-loganalytics-1.0-SNAPSHOT.jar JAR file.
Step 3: Create Dropwizard gauges or counters in your application code.
Question 3
Hotspot Question
You have an Azure subscription.
You need to deploy an Azure SQL resource that will support cross database queries by using an Azure Resource Manager (ARM) template.
How should you complete the ARM template? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Correct Answer:

Explanation:
Box 1: Microsoft.Sql/managedInstances
There is a need to support cross database queries.
Box 2: parameters('virtualNetworkName')
The Managed Instance depends on the Virtual Network.
{
  "type": "Microsoft.Sql/managedInstances",
  "apiVersion": "2020-02-02-preview",
  "name": "[parameters('managedInstanceName')]",
  "location": "[parameters('location')]",
  "dependsOn": [
    "[resourceId('Microsoft.Network/virtualNetworks', parameters('virtualNetworkName'))]"
  ]
}
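As a rough illustration of how the template pieces fit together, the resource section above can be assembled and inspected in Python. The parameter values and the simplified resource_id() helper below are hypothetical stand-ins for ARM's parameters() and resourceId() functions:

```python
import json

# Hypothetical parameter values standing in for the template's parameters() function;
# the real values come from the ARM template's parameters section.
params = {
    "managedInstanceName": "mi1",
    "location": "eastus",
    "virtualNetworkName": "vnet1",
}

def resource_id(resource_type: str, name: str) -> str:
    # Simplified stand-in for ARM's resourceId() function (the real one
    # returns a full /subscriptions/.../providers/... identifier).
    return f"{resource_type}/{name}"

resource = {
    "type": "Microsoft.Sql/managedInstances",
    "apiVersion": "2020-02-02-preview",
    "name": params["managedInstanceName"],
    "location": params["location"],
    # The managed instance must be deployed after the virtual network exists.
    "dependsOn": [resource_id("Microsoft.Network/virtualNetworks",
                              params["virtualNetworkName"])],
}

print(json.dumps(resource, indent=2))
```

The dependsOn entry is the key point: deployment of the managed instance waits on the virtual network resource.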
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/create-template-quickstart?tabs=azure-powershell
Question 4
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have the on-premises networks shown in the following table.

You have an Azure subscription that contains an Azure SQL Database server named SQL1.
SQL1 contains two databases named DB1 and DB2.
You need to configure access to DB1 and DB2. The solution must meet the following requirements:
- Ensure that DB1 can be accessed only by users in Branch1.
- Ensure that DB2 can be accessed only by users in Branch2.
Solution: You connect to DB1 and run the following command.
EXECUTE sp_set_firewall_rule 'Allow db1 users', '131.107.10.0', '131.107.10.255'
You connect to DB2 and run the following command.
EXECUTE sp_set_database_firewall_rule 'Allow db2 users', '131.107.11.0', '131.107.11.255'
Does this meet the goal?

Correct Answer: A
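For background: sp_set_firewall_rule creates a server-level rule (it is executed in the master database and applies to every database on the server), while sp_set_database_firewall_rule is scoped to a single database. The IP ranges in the commands line up with /24 subnets, which can be checked with a short Python sketch. The branch subnets below are assumptions, since the question's network table is not reproduced here:

```python
import ipaddress

# Hypothetical branch subnets; the rule boundaries in the question
# correspond exactly to these /24 networks.
branch1 = ipaddress.ip_network("131.107.10.0/24")
branch2 = ipaddress.ip_network("131.107.11.0/24")

# Each firewall rule spans its subnet exactly, from first to last address.
assert str(branch1[0]) == "131.107.10.0" and str(branch1[-1]) == "131.107.10.255"
assert str(branch2[0]) == "131.107.11.0" and str(branch2[-1]) == "131.107.11.255"

# A client address is covered by a rule iff it falls inside the subnet.
assert ipaddress.ip_address("131.107.10.42") in branch1
assert ipaddress.ip_address("131.107.10.42") not in branch2
```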
Question 5
SIMULATION
You need to encrypt the LastName column of the SalesLT.Customer table in db1 by using Always Encrypted. You must use the Windows Certificate Store.
You may need to use SQL Server Management Studio and the Azure portal.
Correct Answer:
Solution 1 [Less complex]
Configure column encryption using Always Encrypted Wizard
The Always Encrypted Wizard is a powerful tool that allows you to set the desired Always Encrypted configuration for selected database columns. Depending on the current configuration and the desired target configuration, the wizard can encrypt a column, decrypt it (remove encryption), or re-encrypt it (for example, using a new column encryption key or an encryption type that is different from the current type, configured for the column). Multiple columns can be configured in a single run of the wizard.
Step 1: Open the Always Encrypted Wizard
You can launch the wizard at three different levels:
At a database level - if you want to encrypt multiple columns located in different tables.
At a table level - if you want to encrypt multiple columns located in the same table.
*-> At a column level - if you want to encrypt one specific column. [This option here]
Step 2: Connect to your SQL Server with the Object Explorer component of SQL Server Management Studio.
Step 3: To encrypt an individual column, navigate to the column, right-click on it, and then select Encrypt Columns. [Select the LastName column of the SalesLT.Customer table in db1]
Step 4: Column Selection page. On this page, you select the columns you want to encrypt, re-encrypt, or decrypt, and you define the target encryption configuration for the selected columns.
Step 5: If you want the wizard to encrypt or re-encrypt one or more columns using a new column encryption key, pick a key containing (New) in its name. The wizard will generate the key.
Step 6: Master Key Configuration page
If you have selected an autogenerated column encryption key for any column on the previous page, in this page you need to either select an existing column master key or configure a new column master key that will encrypt the column encryption key.
When configuring a new column master key, you can either pick an existing key in Windows Certificate Store or in Azure Key Vault and have the wizard to create just a metadata object for the key in the database, or you can choose to generate both the key and the metadata object describing the key in the database.
Step 7: Click "Next" and follow the wizard's instructions to complete the process.
Solution 2 [A bit more complex. Use for reference]
To encrypt a specific column using Always Encrypted with the Windows Certificate Store, you first need to set up a Column Master Key (CMK) [Task 1] and a Column Encryption Key (CEK) [Task 2]. The CMK will be protected by a certificate in the Windows Certificate Store, and the CEK will then be used to encrypt the data in your chosen column.
Task 1: Set up a Column Master Key (CMK)
Create a self-signed certificate using SQL Server Management Studio (SSMS).
Provision column master keys with the New Column Master Key dialog. The New Column Master Key dialog allows you to generate a column master key or pick an existing key in a key store, and create column master key metadata for the created or selected key in the database.
Step 1: Using Object Explorer, navigate to the Security -> Always Encrypted Keys node under your database.
Step 2: Right-click on the Column Master Keys node and select New Column Master Key....
Step 3: In the New Column Master Key dialog, enter the name of the column master key metadata object.
Step 4: Select a key store:
*-> Certificate Store - Current User - indicates the Current User certificate store location in the Windows Certificate Store, which is your personal store. [Select this one]
Certificate Store - Local computer
Azure Key Vault
Etc.
Step 5: If you're using SQL Server 2019 (15.x) and your SQL Server instance is configured with a secure enclave, you can select the Allow enclave computations checkbox to make the master key enclave-enabled. [Skip]
Step 6: Pick an existing key in your key store, or click the Generate Key or Generate Certificate button to create a key in the key store. [Pick existing]
Step 7: Click OK and the new key appears in the list.
Task 2: Create a Column Encryption Key (CEK)
1. In SSMS, navigate to "Always Encrypted Keys" -> "Column Encryption Keys".
2. Right-click and select "New Column Encryption Key".
3. Give the CEK a name.
4. Select the CMK you just created. [Created in Task 1]
5. Click "OK".
Task 3: Encrypt the data in your chosen column.
1. Right-click on the specific column you want to encrypt in Object Explorer. [Select the LastName column of the SalesLT.Customer table in db1]
2. Select "Tasks" -> "Encrypt Columns".
3. The Always Encrypted Wizard will guide you through the process:
4. Select the column you want to encrypt. [Select the LastName column of the SalesLT.Customer table in db1]
5. Choose the encryption type (Deterministic or Randomized).
6. Select the CEK you created. [Created in Task 2]
7. Click "Next" and follow the wizard's instructions to complete the process.
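A note on step 5's encryption-type choice: the toy Python sketch below illustrates only the behavioral difference between deterministic and randomized encryption. It is not the real Always Encrypted algorithm (AEAD_AES_256_CBC_HMAC_SHA_256), and the key and HMAC construction are purely illustrative:

```python
import hashlib
import hmac
import os

KEY = b"toy-column-encryption-key"  # stand-in only; not a real CEK

def deterministic(value: bytes) -> bytes:
    # Same plaintext always yields the same output, so the server can still
    # evaluate equality predicates, joins, and grouping on the column.
    return hmac.new(KEY, value, hashlib.sha256).digest()

def randomized(value: bytes) -> bytes:
    # A fresh nonce makes each output unique, which leaks less information
    # but rules out server-side equality comparisons.
    nonce = os.urandom(16)
    return nonce + hmac.new(KEY, nonce + value, hashlib.sha256).digest()

assert deterministic(b"Smith") == deterministic(b"Smith")
assert randomized(b"Smith") != randomized(b"Smith")
```

This is why deterministic encryption is typically chosen for columns that must be searched or joined, and randomized encryption for columns that only need to be stored and returned.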
Reference:
https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/always-encrypted-wizard
https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/create-and-store-column-master-keys-always-encrypted
Question 6
Hotspot Question
You are performing exploratory analysis of bus fare data in an Azure Data Lake Storage Gen2 account by using an Azure Synapse Analytics serverless SQL pool.
You execute the Transact-SQL query shown in the following exhibit.

Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
Correct Answer:

Explanation:
Box 1: CSV files that have file names beginning with "tripdata_2020"
Box 2: a header
FIRSTROW = first_row
Specifies the number of the first row to load. The default is 1, indicating the first row in the specified data file. The row numbers are determined by counting the row terminators. FIRSTROW is 1-based.
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/develop-openrowset
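The 1-based FIRSTROW semantics can be emulated in a few lines of Python (the sample bus-fare data below is hypothetical):

```python
import csv
import io

# Hypothetical bus-fare CSV with a single header line.
data = "fare,distance\n2.75,1.2\n3.25,2.0\n"

firstrow = 2  # 1-based: start loading at row 2, i.e. skip the header row

# Keep everything from row `firstrow` onward, mirroring FIRSTROW = 2.
rows = list(csv.reader(io.StringIO(data)))[firstrow - 1:]
assert rows == [["2.75", "1.2"], ["3.25", "2.0"]]
```

With FIRSTROW = 2, the header row is skipped and only the data rows are loaded, which is what Box 2 ("a header") is getting at.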
Question 7
Case Study 5 - ADatum Corporation
Overview
ADatum Corporation is a financial services company that has a main office in New York City.
Existing Environment
Licensing Agreement
ADatum has a Microsoft Volume Licensing agreement that includes Software Assurance.
Network Infrastructure
ADatum has an on-premises datacenter and an Azure subscription named Sub1.
Sub1 contains a virtual network named Network1 in the East US Azure region.
The datacenter is connected to Network1 by using a Site-to-Site (S2S) VPN.
Identity Environment
The on-premises network contains an Active Directory Domain Services (AD DS) forest.
The forest contains a single domain named corp.adatum.com.
The corp.adatum.com domain syncs with a Microsoft Entra tenant named adatum.com.
Database Environment
The datacenter contains the servers shown in the following table.

DB1 and DB2 are used for transactional and analytical workloads by an application named App1.
App1 runs on Microsoft Entra hybrid joined servers that run Windows Server 2022. App1 uses Kerberos authentication.
DB3 stores compliance data used by two applications named App2 and App3.
DB3 performance is monitored by using Extended Events sessions, with the event_file target set to a file share on a local disk of SVR3.
Resource allocation for DB3 is managed by using Resource Governor.
Requirements
Planned Changes
ADatum plans to implement the following changes:
- Deploy an Azure SQL managed instance named Instance1 to Network1.
- Migrate DB1 and DB2 to Instance1.
- Migrate DB3 to Azure SQL Database.
- Following the migration of DB1 and DB2, hand over database development to remote developers who use Microsoft Entra joined Windows 11 devices.
- Following the migration of DB3, configure the database to be part of an auto-failover group.
Availability Requirements
ADatum identifies the following post-migration availability requirements:
- For DB1 and DB2, offload analytical workloads to a read-only database replica in the same Azure region.
- Ensure that if a regional disaster occurs, DB1 and DB2 can be recovered from backups.
- After the migration, App1 must maintain access to DB1 and DB2.
- For DB3, manage potential performance issues caused by resource demand changes by App2 and App3.
- Ensure that DB3 will still be accessible following a planned failover.
- Ensure that DB3 can be restored if the logical server is deleted.
- Minimize downtime during the migration of DB1 and DB2.
Security Requirements
ADatum identifies the following security requirements for after the migration:
- Ensure that only designated developers who use Microsoft Entra joined Windows 11 devices can access DB1 and DB2 remotely.
- Ensure that all changes to DB3, including ones within individual transactions, are audited and recorded.
Management Requirements
ADatum identifies the following post-migration management requirements:
- Continue using Extended Events to monitor DB3.
- In Azure SQL Database, automate the management of DB3 by using elastic jobs that have database-scoped credentials.
Business Requirements
ADatum identifies the following business requirements:
- Minimize costs whenever possible, without affecting other requirements.
- Minimize administrative effort.
You need to recommend a process to automate the management of DB3. The solution must meet the management requirements.
What should be the first step of the process?

Correct Answer: A
Question 8
Drag and Drop Question
You have SQL Server on an Azure virtual machine.
You need to use Policy-Based Management in Microsoft SQL Server to identify stored procedures that do not comply with your naming conventions.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Correct Answer:

Explanation:
First create a condition, then a custom policy based on the condition, finally run a policy evaluation.
Reference:
https://www.mssqltips.com/sqlservertip/2298/enforce-sql-server-database-naming-conventions- using-policy-based-management/
Question 9
Case Study 2 - Contoso, Ltd
Overview
General Overview
Contoso, Ltd. is a financial data company that has 100 employees. The company delivers financial data to customers.
Physical Locations
Contoso has a datacenter in Los Angeles and an Azure subscription. All Azure resources are in the US West 2 Azure region. Contoso has a 10-Gb ExpressRoute connection to Azure.
The company has customers worldwide.
Existing Environment
Active Directory
Contoso has a hybrid Azure Active Directory (Azure AD) deployment that syncs to on-premises Active Directory.
Database Environment
Contoso has SQL Server 2017 on Azure virtual machines shown in the following table.

SQL1 and SQL2 are in an Always On availability group and are actively queried. SQL3 runs jobs, provides historical data, and handles the delivery of data to customers.
The on-premises datacenter contains a PostgreSQL server that has a 50-TB database.
Current Business Model
Contoso uses Microsoft SQL Server Integration Services (SSIS) to create flat files for customers.
The customers receive the files by using FTP.
Requirements
Planned Changes
Contoso plans to move to a model in which they deliver data to customer databases that run as platform as a service (PaaS) offerings. When a customer establishes a service agreement with Contoso, a separate resource group that contains an Azure SQL database will be provisioned for the customer. The database will have a complete copy of the financial data. The data to which each customer will have access will depend on the service agreement tier. The customers can change tiers by changing their service agreement.
The estimated size of each PaaS database is 1 TB.
Contoso plans to implement the following changes:
* Move the PostgreSQL database to Azure Database for PostgreSQL during the next six months.
* Upgrade SQL1, SQL2, and SQL3 to SQL Server 2019 during the next few months.
* Start onboarding customers to the new PaaS solution within six months.
Business Goals
Contoso identifies the following business requirements:
* Use built-in Azure features whenever possible.
* Minimize development effort whenever possible.
* Minimize the compute costs of the PaaS solutions.
* Provide all the customers with their own copy of the database by using the PaaS solution.
* Provide the customers with different table and row access based on the customer's service agreement.
* In the event of an Azure regional outage, ensure that the customers can access the PaaS solution with minimal downtime. The solution must provide automatic failover.
* Ensure that users of the PaaS solution can create their own database objects but be prevented from modifying any of the existing database objects supplied by Contoso.
Technical Requirements
Contoso identifies the following technical requirements:
* Users of the PaaS solution must be able to sign in by using their own corporate Azure AD credentials or have Azure AD credentials supplied to them by Contoso. The solution must avoid using the internal Azure AD of Contoso to minimize guest users.
* All customers must have their own resource group, Azure SQL server, and Azure SQL database. The deployment of resources for each customer must be done in a consistent fashion.
* Users must be able to review the queries issued against the PaaS databases and identify any new objects created.
* Downtime during the PostgreSQL database migration must be minimized.
Monitoring Requirements
Contoso identifies the following monitoring requirements:
* Notify administrators when a PaaS database has a higher than average CPU usage.
* Use a single dashboard to review security and audit data for all the PaaS databases.
* Use a single dashboard to monitor query performance and bottlenecks across all the PaaS databases.
* Monitor the PaaS databases to identify poorly performing queries and resolve query performance issues automatically whenever possible.
PaaS Prototype
During prototyping of the PaaS solution in Azure, you record the compute utilization of a customer's Azure SQL database as shown in the following exhibit.

Role Assignments
For each customer's Azure SQL Database server, you plan to assign the roles shown in the following exhibit.

Hotspot Question
You are evaluating the role assignments.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Correct Answer:

Explanation:
Box 1: No
DBAGroup1 is member of the Contributor role.
The Contributor role grants full access to manage all resources, but does not allow you to assign roles in Azure RBAC, manage assignments in Azure Blueprints, or share image galleries.
Box 2: No
Contributor - Grants full access to manage all resources, but does not allow you to assign roles in Azure RBAC.
SQL DB Contributor - Lets you manage SQL databases, but not access to them. Also, you can't manage their security-related policies or their parent SQL servers.
Box 3: Yes
DBAGroup2 is member of the SQL DB Contributor role.
The SQL DB Contributor role lets you manage SQL databases, but not access to them. Also, you can't manage their security-related policies or their parent SQL servers. As a member of this role you can create and manage SQL databases.
Reference:
https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles
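The distinction between the two roles can be sketched as a toy permission matcher in Python. This is a heavily simplified model, not the real role definitions (the actual Actions/NotActions lists in the built-in roles are longer):

```python
from fnmatch import fnmatch

# Heavily simplified model of two built-in Azure roles.
ROLES = {
    "Contributor": {
        "actions": ["*"],
        "not_actions": ["Microsoft.Authorization/*"],  # cannot assign roles
    },
    "SQL DB Contributor": {
        "actions": ["Microsoft.Sql/servers/databases/*"],
        "not_actions": [],
    },
}

def allowed(role: str, action: str) -> bool:
    r = ROLES[role]
    granted = any(fnmatch(action, p) for p in r["actions"])
    denied = any(fnmatch(action, p) for p in r["not_actions"])
    return granted and not denied

# Contributor manages resources but cannot create role assignments.
assert allowed("Contributor", "Microsoft.Sql/servers/databases/write")
assert not allowed("Contributor", "Microsoft.Authorization/roleAssignments/write")
# SQL DB Contributor manages databases but not server-level security settings.
assert allowed("SQL DB Contributor", "Microsoft.Sql/servers/databases/write")
assert not allowed("SQL DB Contributor", "Microsoft.Sql/servers/auditingSettings/write")
```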
Question 10
You have an Azure subscription that contains an Azure SQL database named SQL1.
SQL1 is in an Azure region that does not support availability zones.
You need to ensure that you have a secondary replica of SQL1 in the same region.
What should you use?

Correct Answer: B
Question 11
Your company wants to create an Azure SQL database. The database will contain a table named Employee that holds employee information, including columns that contain sensitive data. You have to enable Always Encrypted for the data stored in the Employee table.
Which of the following could be used to store the encryption keys required for enabling Always Encrypted for the table?

Correct Answer: A,B
Question 12
You have an on-premises Microsoft SQL Server 2019 server that hosts a database named DB1.
You have an Azure subscription that contains an Azure SQL managed instance named SQLMI1 and a virtual network named VNET1. SQLMI1 resides on VNET1. The on-premises network connects to VNET1 by using an ExpressRoute connection.
You plan to migrate DB1 to SQLMI1 by using Azure Database Migration Service.
You need to configure VNET1 to support the migration.
What should you do?

Correct Answer: D
Question 13
You have an Azure SQL database named DB1. You run a query while connected to DB1.
You review the actual execution plan for the query, and you add an index to a table referenced by the query.
You need to compare the previous actual execution plan for the query to the Live Query Statistics.
What should you do first in Microsoft SQL Server Management Studio (SSMS)?

Correct Answer: A
Question 14
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have SQL Server 2019 on an Azure virtual machine.
You are troubleshooting performance issues for a query in a SQL Server instance.
To gather more information, you query sys.dm_exec_requests and discover that the wait type is PAGELATCH_UP and the wait_resource is 2:3:905856.
You need to improve system performance.
Solution: You reduce the use of table variables and temporary tables.
Does this meet the goal?

Correct Answer: B
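Background on reading the wait_resource value: it has the form database_id:file_id:page_id, and database_id 2 is tempdb. A common community heuristic identifies allocation pages by their fixed intervals (PFS every 8,088 pages, GAM/SGAM every 511,232); the sketch below applies that heuristic, which is the assumption here, not an official API:

```python
def classify_wait_resource(wait_resource: str) -> str:
    # wait_resource format: database_id:file_id:page_id
    db_id, file_id, page_id = (int(part) for part in wait_resource.split(":"))
    db = "tempdb" if db_id == 2 else f"database {db_id}"
    # Community heuristic for allocation pages: PFS pages repeat every 8,088
    # pages, GAM every 511,232, SGAM every 511,232 offset by one.
    if page_id == 1 or page_id % 8088 == 0:
        kind = "PFS"
    elif page_id == 2 or page_id % 511232 == 0:
        kind = "GAM"
    elif page_id == 3 or (page_id - 1) % 511232 == 0:
        kind = "SGAM"
    else:
        kind = "data/index"
    return f"{kind} page in {db}, file {file_id}"

# 905856 = 112 * 8088, so this latch wait is on a PFS allocation page in tempdb.
print(classify_wait_resource("2:3:905856"))
```

A PAGELATCH wait on a tempdb PFS page is the classic signature of tempdb allocation contention, which is why temp table and table variable usage is relevant to this question.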
Question 15
Case Study 1 - Litware, Inc
Overview
Litware, Inc. is a renewable energy company that has a main office in Boston. The main office hosts a sales department and the primary datacenter for the company.
Physical Locations
Existing Environment
Litware has a manufacturing office and a research office in separate locations near Boston. Each office has its own datacenter and internet connection.
The manufacturing and research datacenters connect to the primary datacenter by using a VPN.
Network Environment
The primary datacenter has an ExpressRoute connection that uses both Microsoft peering and private peering. The private peering connects to an Azure virtual network named HubVNet.
Identity Environment
Litware has a hybrid Azure Active Directory (Azure AD) deployment that uses a domain named litwareinc.com. All Azure subscriptions are associated to the litwareinc.com Azure AD tenant.
Database Environment
The sales department has the following database workload:
* An on-premises server named SERVER1 hosts an instance of Microsoft SQL Server 2012 and two 1-TB databases.
* A logical server named SalesSrv01A contains a geo-replicated Azure SQL database named SalesSQLDb1. SalesSQLDb1 is in an elastic pool named SalesSQLDb1Pool. SalesSQLDb1 uses database firewall rules and contained database users.
* An application named SalesSQLDb1App1 uses SalesSQLDb1.
The manufacturing office contains two on-premises SQL Server 2016 servers named SERVER2 and SERVER3. The servers are nodes in the same Always On availability group. The availability group contains a database named ManufacturingSQLDb1.
Database administrators have two Azure virtual machines in HubVNet named VM1 and VM2 that run Windows Server 2019 and are used to manage all the Azure databases.
Licensing Agreement
Litware is a Microsoft Volume Licensing customer that has License Mobility through Software Assurance.
Current Problems
SalesSQLDb1 experiences performance issues that are likely due to out-of-date statistics and frequent blocking queries.
Requirements
Planned Changes
Litware plans to implement the following changes:
* Implement 30 new databases in Azure, which will be used by time-sensitive manufacturing apps that have varying usage patterns. Each database will be approximately 20 GB.
* Create a new Azure SQL database named ResearchDB1 on a logical server named ResearchSrv01. ResearchDB1 will contain Personally Identifiable Information (PII) data.
* Develop an app named ResearchApp1 that will be used by the research department to populate and access ResearchDB1.
* Migrate ManufacturingSQLDb1 to the Azure virtual machine platform.
* Migrate the SERVER1 databases to the Azure SQL Database platform.
Technical Requirements
Litware identifies the following technical requirements:
* Maintenance tasks must be automated.
* The 30 new databases must scale automatically.
* The use of an on-premises infrastructure must be minimized.
* Azure Hybrid Use Benefits must be leveraged for Azure SQL Database deployments.
* All SQL Server and Azure SQL Database metrics related to CPU and storage usage and limits must be analyzed by using Azure built-in functionality.
Security and Compliance Requirements
Litware identifies the following security and compliance requirements:
* Store encryption keys in Azure Key Vault.
* Retain backups of the PII data for two months.
* Encrypt the PII data at rest, in transit, and in use.
* Use the principle of least privilege whenever possible.
* Authenticate database users by using Active Directory credentials.
* Protect Azure SQL Database instances by using database-level firewall rules.
* Ensure that all databases hosted in Azure are accessible from VM1 and VM2 without relying on public endpoints.
Business Requirements
Litware identifies the following business requirements:
* Meet an SLA of 99.99% availability for all Azure deployments.
* Minimize downtime during the migration of the SERVER1 databases.
* Use the Azure Hybrid Use Benefits when migrating workloads to Azure.
* Once all requirements are met, minimize costs whenever possible.
Hotspot Question
You need to recommend the appropriate purchasing model and deployment option for the 30 new databases. The solution must meet the technical requirements and the business requirements.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Correct Answer:

Explanation:
Box 1: vCore
Scenario:
- The 30 new databases must scale automatically.
- Once all requirements are met, minimize costs whenever possible.
You can configure resources for the pool based either on the DTU-based purchasing model or the vCore-based purchasing model. Although the DTU model is simpler and offers lower entry-level price points, only the vCore model supports Azure Hybrid Benefit, which the technical requirements say must be leveraged for Azure SQL Database deployments. The vCore model also supports reserved capacity, which helps minimize costs once the other requirements are met.
Box 2: An Azure SQL database elastic pool
Azure SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool are on a single server and share a set number of resources at a set price. Elastic pools in Azure SQL Database enable SaaS developers to optimize the price performance for a group of databases within a prescribed budget while delivering performance elasticity for each database.
Reference:
https://azure.microsoft.com/es-es/blog/a-flexible-new-way-to-purchase-azure-sql-database/
https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-pool-overview
https://docs.microsoft.com/en-us/azure/azure-sql/database/reserved-capacity-overview
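The cost argument for an elastic pool can be made concrete with a small Python sketch. The workload numbers below are hypothetical; the point is that databases with staggered peaks need far less pooled capacity than the sum of their individual peaks:

```python
# Hypothetical demand profile: 30 databases, each needing 4 vCores for one
# peak hour per day and 0.5 vCores otherwise, with peaks staggered over 24 hours.
N_DBS, HOURS = 30, 24
PEAK, IDLE = 4.0, 0.5

def demand(db: int, hour: int) -> float:
    return PEAK if db % HOURS == hour else IDLE

# Sizing each database individually for its own peak:
per_database_total = N_DBS * PEAK

# Sizing one pool for the worst concurrent aggregate demand:
pool_size = max(sum(demand(db, h) for db in range(N_DBS)) for h in range(HOURS))

print(per_database_total, pool_size)
assert pool_size < per_database_total
```

Under these assumed numbers, per-database sizing needs 120 vCores in total while a shared pool needs only 22, which is the "varying and unpredictable usage demands" scenario elastic pools are designed for.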
Question 16
You are creating a new notebook in Azure Databricks that will support R as the primary language but will also support Scala and SQL.
Which switch should you use to switch between languages?

Correct Answer: A