Certification Practice Test Sample Questions For Microsoft Implementing an Azure Data Solution (DP-200)
QuickStart is now offering sample questions for Microsoft Implementing an Azure Data Solution (DP-200). Whether you are deciding which exam to sign up for or simply want to practice the materials necessary to complete certification for this course, we have provided a practice test to better aid in certification. 100% of the questions are real test questions, taken from a recent version of the Microsoft Implementing an Azure Data Solution (DP-200) exam.
Microsoft Azure Data Fundamentals (DP-900T00)
Enroll today and get 30% off using discount code PRACTICE30 at checkout.
Azure Data Engineer Certification: Data Engineering on Microsoft Azure (DP-203T00)
Enroll today and get 30% off using a discount code at checkout.
Microsoft Implementing an Azure Data Solution (DP-200) Sample Exam Questions
You work for an organization as a data engineer. The organization wants to launch an e-commerce website that acts as a marketplace on which different vendors sell their products. Each vendor sells an arbitrary number of products, and each product has up to five attributes. For example, VendorA offers a product named Watch Z that has the attributes price, description, brand, part-number, and size, while VendorB offers a similar product named Watch Z that has the attributes price, description, weight, and model. You are expected to implement a solution that does not restrict the product attributes each vendor uses for its respective products. You must use .NET to query product data in your solution. Solution: You create an Azure SQL Database account that uses a managed instance. Do you think this is the right solution in this situation?
You work for an organization as a data engineer. The organization wants to launch an e-commerce website that acts as a marketplace on which different vendors sell their products. Each vendor sells an arbitrary number of products, and each product has up to five attributes. For example, VendorA offers a product named Watch Z that has the attributes price, description, brand, part-number, and size, while VendorB offers a similar product named Watch Z that has the attributes price, description, weight, and model. You are expected to implement a solution that does not restrict the product attributes each vendor uses for its respective products. You must use .NET to query product data in your solution. Solution: You create a table storage account. Do you think this is the right solution in this situation?
You work for an organization as a data engineer. The organization wants to launch an e-commerce website that acts as a marketplace on which different vendors sell their products. Each vendor sells an arbitrary number of products, and each product has up to five attributes. For example, VendorA offers a product named Watch Z that has the attributes price, description, brand, part-number, and size, while VendorB offers a similar product named Watch Z that has the attributes price, description, weight, and model. You are expected to implement a solution that does not restrict the product attributes each vendor uses for its respective products. You must use .NET to query product data in your solution. Solution: You create a Cosmos DB account that uses the SQL API. Do you think this is the right solution in this situation?
An organization has hired you as a data engineer. You are required to develop a solution that takes data contained in large pipe-delimited text files in an Azure Data Lake Storage Gen1 account and ingests it into Azure SQL Data Warehouse. You now have to load the data. Solution: You take the following steps: create an external file format and an external data source, create an external table that uses the external data source, and load the data from the external table. Do you think this is the right solution in this situation?
You work for an organization as a data engineer. The organization wants to launch an e-commerce website that acts as a marketplace on which different vendors sell their products. Each vendor sells an arbitrary number of products, and each product has up to five attributes. For example, VendorA offers a product named Watch Z that has the attributes price, description, brand, part-number, and size, while VendorB offers a similar product named Watch Z that has the attributes price, description, weight, and model. You are expected to implement a solution that does not restrict the product attributes each vendor uses for its respective products. You must use .NET to query product data in your solution. Solution: You create a Cosmos DB account that uses the Table API. Do you think this is the right solution in this situation?
An organization has hired you as a data engineer. You are required to develop a solution that takes data contained in large pipe-delimited text files in an Azure Data Lake Storage Gen1 account and ingests it into Azure SQL Data Warehouse. You now have to load the data. Solution: You take the following steps: create an Azure Databricks account and a linked server, create an external table that points to the Azure Databricks account, and load the data by running the dbutils.fs.cp command. Do you think this is the right solution in this situation?
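The external-table steps in the first of these two solutions correspond to the PolyBase loading pattern for Azure SQL Data Warehouse. The T-SQL below is a minimal sketch of that pattern, not the exam's own code; the data source, credential, file format, table names, and folder path are all hypothetical.

    -- Hypothetical object names and paths throughout; adjust for your environment.
    -- 1. External data source pointing at the Data Lake Storage Gen1 account.
    CREATE EXTERNAL DATA SOURCE MyDataLake
    WITH (
        TYPE = HADOOP,
        LOCATION = 'adl://myadlsaccount.azuredatalakestore.net',
        CREDENTIAL = MyAdlsCredential   -- database-scoped credential created beforehand
    );

    -- 2. External file format for pipe-delimited text files.
    CREATE EXTERNAL FILE FORMAT PipeDelimitedText
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
    );

    -- 3. External table that uses the external data source and file format.
    CREATE EXTERNAL TABLE dbo.ExternalSales (
        SaleId INT,
        Amount DECIMAL(18, 2)
    )
    WITH (
        LOCATION = '/sales/',
        DATA_SOURCE = MyDataLake,
        FILE_FORMAT = PipeDelimitedText
    );

    -- 4. Load the data from the external table into the warehouse.
    CREATE TABLE dbo.Sales
    WITH (DISTRIBUTION = ROUND_ROBIN)
    AS SELECT * FROM dbo.ExternalSales;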
You are hired as a data engineer for a company that manufactures autonomous vehicles. Each vehicle is manufactured with a transmitter installed in it, and the transmitter submits sensor data over Advanced Message Queuing Protocol (AMQP). You are required to extract the relevant information from the transmitted data and transform it in real time so it can be sent to Power BI. You take the following steps to implement the solution: create an IoT Hub instance, and create a Stream Analytics job that uses a query to extract data. Do you think this is the right solution in this situation?
You are hired as a data engineer for a company that manufactures autonomous vehicles. Each vehicle is manufactured with a transmitter installed in it, and the transmitter submits sensor data over Advanced Message Queuing Protocol (AMQP). You are required to extract the relevant information from the transmitted data and transform it in real time so it can be sent to Power BI. You take the following steps to implement the solution: create an Azure Relay service, and create an Azure Function app that extracts and queries data from Azure Relay. Do you think this is the right solution in this situation?
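In the IoT Hub plus Stream Analytics variant of this scenario, the extraction and real-time transformation step is usually expressed as a Stream Analytics query that reads from the IoT Hub input and writes to a Power BI output. The sketch below is in Stream Analytics SQL; the VehicleTelemetry input alias, the PowerBIOutput output alias, and the column names are assumptions, not part of the exam question.

    -- Hypothetical input/output aliases and columns, configured on the Stream Analytics job.
    SELECT
        DeviceId,
        AVG(Speed) AS AvgSpeed,
        MAX(EngineTemperature) AS MaxEngineTemperature,
        System.Timestamp() AS WindowEnd
    INTO PowerBIOutput                                        -- output alias bound to a Power BI dataset
    FROM VehicleTelemetry TIMESTAMP BY EventEnqueuedUtcTime   -- input alias bound to the IoT Hub
    GROUP BY DeviceId, TumblingWindow(second, 30)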
You are required to send exam data to Azure via a test center web application. Which of the following do you think is the most appropriate technology to receive the data?
- A. Event Hub (Correct)
- B. Azure Relay (Incorrect)
- C. Azure Databricks (Incorrect)
- D. Event Grid (Incorrect)
You want to query data, filter the data, and then send it to Power BI for further use and data analysis. Which of the following technologies will you use?
- A. Stream Analytics (Correct)
- B. WebJob (Incorrect)
- C. HDInsight (Incorrect)
- D. Function app (Incorrect)
You are required to select the most appropriate windowing function. Can you identify which one of the following you would use?
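Windowing questions like this one refer to the window functions available in Azure Stream Analytics. The fragments below sketch how the common window types are written in a GROUP BY clause; the Readings input, the SensorId column, and the durations are hypothetical.

    -- Tumbling window: fixed-size, non-overlapping intervals.
    SELECT SensorId, COUNT(*) AS Events
    FROM Readings
    GROUP BY SensorId, TumblingWindow(second, 10)

    -- Hopping window: fixed-size windows that overlap (10-second window, hopping every 5 seconds).
    SELECT SensorId, COUNT(*) AS Events
    FROM Readings
    GROUP BY SensorId, HoppingWindow(second, 10, 5)

    -- Sliding window: output is produced only when an event arrives, considering the last 10 seconds.
    SELECT SensorId, COUNT(*) AS Events
    FROM Readings
    GROUP BY SensorId, SlidingWindow(second, 10)

    -- Session window: groups events arriving within 5 seconds of each other, capped at 20 seconds.
    SELECT SensorId, COUNT(*) AS Events
    FROM Readings
    GROUP BY SensorId, SessionWindow(second, 5, 20)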
Can you identify the two data formats you should use if you need to add test data for analysis in Azure? (Choose any TWO)
A company has hired you as a data engineer. You are required to store a product catalog, for which you create an Azure Cosmos DB account. The product catalog currently exists as a single Oracle database table, and around 20 percent of the columns in the table are empty. Each product may have a different attribute count and different attribute names. You must be able to search the catalog by product ID and category; for clothing you must be able to search by size, and for laptops you must be able to search by CPU speed. You also must be able to query the data from a web application by using the following syntax:
- A. MongoDB API (Incorrect)
- B. Core SQL API (Correct)
- C. Gremlin API (Incorrect)
- D. Table API (Incorrect)
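The Core SQL API (marked correct above) lets each product document keep its own set of attributes while the web application still queries with SQL-like syntax. The queries below are a minimal sketch of what the scenario implies; the Products container name and the property names are hypothetical.

    -- Search by product ID and category.
    SELECT * FROM Products p
    WHERE p.productId = 'watch-z-001' AND p.category = 'Watches'

    -- For clothing, search by size; products in other categories simply omit the size property.
    SELECT * FROM Products p
    WHERE p.category = 'Clothing' AND p.size = 'M'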
Your company has hired you as a data engineer to manage an HDInsight cluster. You want to use a web user interface to monitor the performance of the cluster. Which tool would you use for that?
- A. Apache Spark (Incorrect)
- B. Apache Ambari (Correct)
- C. Azure Log Analytics (Incorrect)
- D. Azure Stream Analytics (Incorrect)
Your company has hired you as a data engineer. You copy and transform data from Azure Blob storage to an on-premises server using Azure Data Factory. Which two actions would you perform to make sure the data is copied successfully? (Choose any TWO)
- B. Create an Azure integration runtime (Incorrect)
Your company has hired you as a data engineer. You create a table with the following query:

    CREATE TABLE License (
        [ID] int,
        [Number] char(9) MASKED WITH (FUNCTION = 'partial(3, "xxxxx", 1)') NULL,
        [GivenName] varchar(20),
        [SurName] varchar(20)
    );

Here is the query you use to insert data:

    INSERT INTO License (ID, Number, GivenName, SurName)
    SELECT 1, '111222333', 'Sam', 'Jack';

And here is the query you use to return data:

    SELECT Number FROM License WHERE ID = 1;

You need to determine which value is returned from the query. Identify the value that is returned.
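The MASKED WITH clause above uses SQL Server dynamic data masking: partial(3, "xxxxx", 1) keeps the first three and the last character of the value and replaces everything in between with the padding string, so a user without the UNMASK permission sees the masked form of '111222333', while a privileged user sees the original value. The following T-SQL is a hedged sketch of how you could verify this, assuming a hypothetical low-privilege user named MaskedReader.

    -- Hypothetical user name; requires permission to create users in the database.
    CREATE USER MaskedReader WITHOUT LOGIN;
    GRANT SELECT ON License TO MaskedReader;

    -- Run the query as the low-privilege user: the Number column comes back masked.
    EXECUTE AS USER = 'MaskedReader';
    SELECT Number FROM License WHERE ID = 1;   -- expected: '111xxxxx3' under the partial(3, "xxxxx", 1) mask
    REVERT;

    -- Granting UNMASK (or querying as a privileged user) returns the original value instead.
    GRANT UNMASK TO MaskedReader;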
Your company has hired you as a data engineer. There is an HDInsight cluster in Azure. What step would you take if you want to monitor the performance of the cluster?
- B. Create a Log Analytics workspace and use Azure Advisor. (Incorrect)
- D. Create a Log Analytics workspace and use Azure Advisor. (Incorrect)
Your company has hired you as a data engineer. Using the Azure portal, you create an Azure Databricks account. There is data that needs to be ingested from Blob storage into Databricks, and you import a notebook from GitHub. You now have to create the next resource in order to run code and ingest the data. Which of the following will you create next?
- A. Cosmos DB account (Incorrect)
- B. Master key (Incorrect)
- C. Spark cluster (Correct)
- D. SQL Data Warehouse instance (Incorrect)
You are hired by a company as a data engineer. The company has an Azure Databricks account, and the account has an imported notebook. You also create an Azure data factory. You are required to make sure Databricks can be accessed by Data Factory. Which of the following will you create?
- A. An access policy on a blob storage container (Incorrect)
- B. An access token (Correct)
- C. A master key (Incorrect)
- D. A blob storage container (Incorrect)
Your company hires you as a data engineer. Your company has an Azure Data Factory pipeline in the US East region. You learn that pipeline run data is deleted after 45 days. What would you do if the company asks you to keep the run data from the pipeline for more than 45 days?
- A. Re-create the Data Factory pipeline and enable Git (Incorrect)
- B. Re-create the Data Factory pipeline and enable Git (Incorrect)
- D. Add a lock with the CanNotDelete lock type (Incorrect)
Your company has a Microsoft SQL Server 2019 database hosted on an Azure virtual machine (VM). A web application uses the database as its data store. A few customers have reported that the shopping cart page of the application loads slowly. What step would you take if you are required to determine which stored procedure is being called when this page is accessed?
- A. Create a SQL Server Profiler trace (Correct)
- B. Call the SET SHOWPLAN_TEXT statement in Query Analyzer (Incorrect)
You have been hired as a data engineer for your company. You have an Azure Databricks account that is monitored with Azure Monitor. You are required to choose a solution so you can send application metrics to Azure Monitor. Identify the library or tool you would choose.
A new e-commerce solution is planned for deployment in a Microsoft Azure environment, with Azure SQL Database as the relational data store. Active geo-replication of the primary database is enabled, and the database is replicated to a secondary database located in another Azure region. The Marketing team requires that updated prices appear in Azure SQL Database as soon as the new product range launch is announced. You have to make sure that price changes in the primary database are reflected in the secondary database without a lag. What is the best course of action here?
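Active geo-replication is asynchronous, so the secondary can normally lag slightly behind the primary. One common pattern, sketched below under the assumption that the price update is issued against the primary database, is to commit the change and then call sys.sp_wait_for_database_copy_sync, which blocks until the transactions committed before the call have been replicated to the secondary. The table, column, and secondary server/database names are hypothetical.

    -- Hypothetical table, column, and secondary server/database names.
    BEGIN TRANSACTION;
        UPDATE dbo.ProductPrices
        SET Price = Price * 0.9               -- illustrative price change
        WHERE LaunchRange = 'NewRange';
    COMMIT TRANSACTION;

    -- Block until the committed transaction has been replicated to the geo-secondary.
    EXEC sys.sp_wait_for_database_copy_sync
        @target_server   = N'secondary-server',
        @target_database = N'SalesDb';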
More Information:
- Learning Style: On Demand
- Learning Style: Practice Exam
- Difficulty: Beginner
- Course Duration: 1 Hour
Course Information
The Sample Question - Microsoft Implementing an Azure Data Solution (DP-200) page is designed to support IT professionals and data specialists preparing for the DP-200 certification exam. This exam assesses your ability to implement data solutions on the Microsoft Azure platform, covering essential areas such as data storage, data processing, and data security. The sample questions provided on this page simulate the types of queries and scenarios you’ll encounter in the actual exam, helping you to strengthen your practical skills in data management, architecture, and security within Azure.
This resource focuses on critical exam topics, including configuring and managing Azure data solutions, monitoring and optimizing data storage, handling big data, and implementing disaster recovery. By working through these sample questions, you can gain hands-on experience with Azure’s data services, learn how to troubleshoot common challenges, and become familiar with best practices in Azure data management. Each question is crafted to improve your confidence and help you identify areas for further study, so you can approach the DP-200 exam fully prepared.
Outline
Practice Certification Exam
Exam Test Sample: DP-200: Microsoft Implementing an Azure Data Solution