The latest updated Microsoft DP-201 exam dumps and free DP-201 exam practice questions and answers! Lead4Pass has updated its DP-201 dumps PDF and DP-201 dumps VCE, with exam questions revised and answers corrected!
Get the full Microsoft DP-201 dumps from https://www.leads4pass.com/dp-201.html (VCE&PDF)

Latest DP-201 PDF for free

Share the Microsoft DP-201 dumps PDF for free: part of the Lead4Pass DP-201 dumps collection, shared on Google Drive by Lead4Pass
https://drive.google.com/file/d/1RbM5w1o-muSk75DYtTlx2J0JyoU00se1/

Latest Lead4Pass DP-201 YouTube

Share the latest Microsoft DP-201 exam practice questions and answers for free from the Lead4Pass dumps, available online as a YouTube video

https://youtube.com/watch?v=D115zJSWKlw

The latest updated Microsoft DP-201 Exam Practice Questions and Answers Online Practice Test is free to share from Lead4Pass (Q1-Q13)

QUESTION 1
You are planning a big data solution in Azure.
You need to recommend a technology that meets the following requirements:
1. Be optimized for batch processing.
2. Support autoscaling.
3. Support per-cluster scaling.
Which technology should you recommend?
A. Azure Data Warehouse
B. Azure HDInsight with Spark
C. Azure Analysis Services
D. Azure Databricks
Correct Answer: D
Azure Databricks is an Apache Spark-based analytics platform. Azure Databricks supports autoscaling and manages
the Spark cluster for you.
Incorrect Answers: A, B:

[2021.1] lead4pass dp-201 practice test q1

 

QUESTION 2
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains
a unique solution that might meet the stated goals. Some question sets might have more than one correct solution,
while
others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not
appear on the review screen.
You are designing an Azure SQL Database that will use elastic pools. You plan to store data about customers in a table.
Each record uses a value for CustomerID.
You need to recommend a strategy to partition data based on values in CustomerID.
Proposed Solution: Separate data into customer regions by using vertical partitioning.
Does the solution meet the goal?
A. Yes
B. No
Correct Answer: B
Vertical partitioning is used for cross-database queries. Instead, we should use horizontal partitioning, which is also called sharding.
References:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-elastic-query-overview
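The sharding strategy the answer refers to can be sketched in a few lines: each CustomerID is hashed to one shard database, so all rows for a customer live together. This is a minimal plain-Python illustration; the shard count and database names are made-up assumptions, not part of the exam scenario, and a real elastic-pool deployment would use a shard map manager.

```python
# Minimal sketch of horizontal partitioning (sharding) by CustomerID.
# Shard count and names are illustrative assumptions only.
import hashlib

SHARDS = ["customers_db_0", "customers_db_1", "customers_db_2", "customers_db_3"]

def shard_for_customer(customer_id: int) -> str:
    """Map a CustomerID to one shard; every row for that customer
    routes to the same database."""
    digest = hashlib.sha256(str(customer_id).encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# The mapping is deterministic, so lookups and writes agree.
assert shard_for_customer(42) == shard_for_customer(42)
```

The key property is that routing is a pure function of the shard key, so any client can locate a customer's data without a central lookup on the hot path.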

 

QUESTION 3
You plan to use an Azure SQL data warehouse to store the customer data. You need to recommend a disaster recovery
solution for the data warehouse. What should you include in the recommendation?
A. AzCopy
B. Read-only replicas
C. AdlCopy
D. Geo-Redundant backups
Correct Answer: D
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/backup-and-restore

 

QUESTION 4
You need to ensure that emergency road response vehicles are dispatched automatically.
How should you design the processing system? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

[2021.1] lead4pass dp-201 practice test q4

Correct Answer:

[2021.1] lead4pass dp-201 practice test q4-1

Explanation:
Box 1: API App

[2021.1] lead4pass dp-201 practice test q4-2

Events generated from the IoT data sources are sent to the stream ingestion layer through Azure HDInsight Kafka as a stream of messages. HDInsight Kafka stores the stream of data in topics for a configurable period of time.
The Kafka consumer, Azure Databricks, picks up the messages in real time from the Kafka topic, processes the data based on the business logic, and can then send it to the serving layer for storage.
Downstream storage services, like Azure Cosmos DB, Azure SQL Data Warehouse, or Azure SQL Database, will then serve as a data source for the presentation and action layer.
Business analysts can use Microsoft Power BI to analyze warehoused data. Other applications can be built upon the
serving layer as well. For example, we can expose APIs based on the service layer data for third party uses.
Box 2: Cosmos DB Change Feed
Change feed support in Azure Cosmos DB works by listening to an Azure Cosmos DB container for any changes. It
then outputs the sorted list of documents that were changed in the order in which they were modified.
The change feed in Azure Cosmos DB enables you to build efficient and scalable solutions for each of these patterns,
as shown in the following image:

[2021.1] lead4pass dp-201 practice test q4-3
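The change-feed behavior described above can be simulated in plain Python: given documents carrying a modification timestamp (`_ts`, as Cosmos DB documents do), the feed emits the documents changed since a checkpoint, sorted by modification order. This is a conceptual sketch only, not the Cosmos DB SDK; the sample documents and timestamps are invented.

```python
# Plain-Python simulation of the change-feed idea: emit documents
# modified after a checkpoint, in the order they were modified.
# Not the Cosmos DB SDK; documents and timestamps are made up.

def read_change_feed(container, since_ts):
    """Return documents with _ts greater than since_ts, oldest first."""
    changed = [doc for doc in container if doc["_ts"] > since_ts]
    return sorted(changed, key=lambda doc: doc["_ts"])

container = [
    {"id": "vehicle-1", "status": "idle", "_ts": 100},
    {"id": "vehicle-2", "status": "dispatched", "_ts": 130},
    {"id": "vehicle-3", "status": "en-route", "_ts": 120},
]

# Only changes after the checkpoint appear, sorted by modification time.
feed = read_change_feed(container, since_ts=110)
assert [d["id"] for d in feed] == ["vehicle-3", "vehicle-2"]
```

A dispatch process would persist the last `_ts` it handled as its checkpoint and poll the feed from there, which is the pattern Azure Functions' Cosmos DB trigger automates.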

 

QUESTION 5
HOTSPOT
You are designing a new application that uses Azure Cosmos DB. The application will support a variety of data patterns
including log records and social media mentions.
You need to recommend which Cosmos DB API to use for each data pattern. The solution must minimize resource
utilization.
Which API should you recommend for each data pattern? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

[2021.1] lead4pass dp-201 practice test q5

Correct Answer:

[2021.1] lead4pass dp-201 practice test q5-1

Log records: SQL
Social media mentions: Gremlin

You can store the actual graph of followers using the Azure Cosmos DB Gremlin API to create vertices for each user and edges that maintain the "A-follows-B" relationships. With the Gremlin API, you can get the followers of a certain user and create more complex queries to suggest people in common. If you add to the graph the content categories that people like or enjoy, you can start weaving experiences that include smart content discovery, suggesting content that the people you follow like, or finding people that you might have much in common with.
References: https://docs.microsoft.com/en-us/azure/cosmos-db/social-media-apps
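The "A-follows-B" traversal the explanation describes can be sketched with a tiny in-memory graph. In the Gremlin API this would be user vertices joined by "follows" edges; the function below mirrors the "suggest people my follows also follow" query. All names and the graph itself are invented for illustration.

```python
# In-memory sketch of the "A-follows-B" follower graph. A real
# deployment would model this as Gremlin vertices and edges; the
# users here are made-up examples.

follows = {
    "alice": {"bob", "carol"},
    "bob": {"carol", "dave"},
    "carol": {"dave"},
}

def suggestions(user):
    """People followed by the user's follows, excluding people the
    user already follows and the user themselves (a two-hop traversal)."""
    direct = follows.get(user, set())
    second_hop = set().union(*(follows.get(f, set()) for f in direct))
    return second_hop - direct - {user}

# alice follows bob and carol; both follow dave, whom alice does not.
assert suggestions("alice") == {"dave"}
```

The equivalent Gremlin traversal would walk two `out('follows')` hops and filter out the starting vertex and its direct follows; the set arithmetic above is the same idea without a graph engine.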

 

QUESTION 6
You need to design the SensorData collection.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

[2021.1] lead4pass dp-201 practice test q6

Correct Answer:

[2021.1] lead4pass dp-201 practice test q6-1

Box 1: Eventual
Traffic data insertion rate must be maximized.
Sensor data must be stored in a Cosmos DB database named treydata in a collection named SensorData.
With Azure Cosmos DB, developers can choose from five well-defined consistency models on the consistency spectrum. From strongest to most relaxed, the models include strong, bounded staleness, session, consistent prefix, and eventual consistency.
Box 2: License plate
This solution reports on all data related to a specific vehicle license plate. The report must use data from the
SensorData collection.
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/consistency-levels

 

QUESTION 7
You need to recommend a solution for storing the image tagging data. What should you recommend?
A. Azure File Storage
B. Azure Cosmos DB
C. Azure Blob Storage
D. Azure SQL Database
E. Azure SQL Data Warehouse
Correct Answer: C
Image data must be stored in a single data store at a minimum cost.
Note: Azure Blob storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that does not adhere to a particular data model or definition, such as text or binary data.
Blob storage is designed for:
1. Serving images or documents directly to a browser.
2. Storing files for distributed access.
3. Streaming video and audio.
4. Writing to log files.
5. Storing data for backup and restore, disaster recovery, and archiving.
6. Storing data for analysis by an on-premises or Azure-hosted service.
References: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction

 

QUESTION 8
You need to design a telemetry data solution that supports the analysis of log files in real-time.
Which two Azure services should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Azure Databricks
B. Azure Data Factory
C. Azure Event Hubs
D. Azure Data Lake Storage Gen2
E. Azure IoT Hub
Correct Answer: AC
You connect a data ingestion system with Azure Databricks to stream data into an Apache Spark cluster in near real-time. You set up a data ingestion system using Azure Event Hubs and then connect it to Azure Databricks to process the messages coming through.
Note: Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of receiving
and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by
distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics
provider or batching/storage adapters.
References: https://docs.microsoft.com/en-us/azure/azure-databricks/databricks-stream-from-eventhubs
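The ingestion idea behind Event Hubs can be sketched without the SDK: events carry a partition key, events with the same key land on the same partition, and a consumer reads each partition in send order. This is a plain-Python simulation of that routing behavior; the partition count, hash, and event fields are illustrative assumptions, and the real service uses its own hashing.

```python
# Simulation of partition-key routing as used by Event Hubs.
# Not the azure-eventhub SDK; the hash and partition count are
# made-up stand-ins for the service's internals.
from collections import defaultdict

NUM_PARTITIONS = 4

def partition_for(key: str) -> int:
    """Stable toy hash mapping a partition key to a partition index."""
    return sum(key.encode()) % NUM_PARTITIONS

partitions = defaultdict(list)

def send(event: dict, partition_key: str) -> None:
    """Append the event to the partition its key hashes to."""
    partitions[partition_for(partition_key)].append(event)

for i in range(3):
    send({"device": "sensor-7", "seq": i}, partition_key="sensor-7")

# All events for one key sit in a single partition, in send order,
# which is what lets a downstream consumer process them in sequence.
part = partitions[partition_for("sensor-7")]
assert [e["seq"] for e in part] == [0, 1, 2]
```

This per-key ordering is why telemetry pipelines key events by device ID: each device's log lines arrive at the Spark consumer in the order they were produced.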

 

QUESTION 9
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains
a unique solution that might meet the stated goals. Some question sets might have more than one correct solution,
while
others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not
appear on the review screen.
You plan to store delimited text files in an Azure Data Lake Storage account that will be organized into department folders.
You need to configure data access so that users see only the files in their respective department folders.
Solution: From the storage account, you disable the hierarchical namespace, and you use access control lists (ACLs).
Does this meet the goal?
A. Yes
B. No
Correct Answer: A
Azure Data Lake Storage implements an access control model that derives from HDFS, which in turn derives from the
POSIX access control model.
Blob container ACLs do not support the hierarchical namespace, so it must be disabled.
References:
https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-known-issues
https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-access-control
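The POSIX-derived access model the explanation cites boils down to: each folder carries access-control entries, and a user can read a file only if an entry grants their identity or one of their groups read permission. The sketch below is a simplified plain-Python illustration of that check; the folder names, groups, and permission strings are invented, and real ADLS ACLs also involve default ACLs, masks, and execute bits on parent folders.

```python
# Simplified POSIX-style ACL check, the model ADLS access control
# derives from. Folders, groups, and permissions are illustrative.

acls = {
    "/sales":   [("group:sales", "r-x")],
    "/finance": [("group:finance", "r-x")],
}

def can_read(user_groups: set, folder: str) -> bool:
    """True if any access-control entry on the folder grants read
    permission to one of the user's groups."""
    for principal, perms in acls.get(folder, []):
        kind, name = principal.split(":", 1)
        if kind == "group" and name in user_groups and "r" in perms:
            return True
    return False

# A sales user sees the sales folder but not finance.
assert can_read({"sales"}, "/sales") is True
assert can_read({"sales"}, "/finance") is False
```

Department-scoped access then reduces to assigning each user to their department's group and granting that group's ACE on the matching folder.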

 

QUESTION 10
You need to recommend a storage solution for a sales system that will receive thousands of small files per minute. The
files will be in JSON, text, and CSV formats. The files will be processed and transformed before they are loaded into an
Azure data warehouse. The files must be stored and secured in folders.
Which storage solution should you recommend?
A. Azure Data Lake Storage Gen2
B. Azure Cosmos DB
C. Azure SQL Database
D. Azure Blob storage
Correct Answer: A
Azure provides several solutions for working with CSV and JSON files, depending on your needs. The primary landing place for these files is either Azure Storage or Azure Data Lake Store.
Azure Data Lake Storage is optimized storage for big data analytics workloads.
Incorrect Answers:
D: Azure Blob storage is a general-purpose object store for a wide variety of storage scenarios. Blobs are stored in containers, which are similar to folders.
References: https://docs.microsoft.com/en-us/azure/architecture/data-guide/scenarios/csv-and-json

 

QUESTION 11
You need to design the Planning Assistance database.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Hot Area:

[2021.1] lead4pass dp-201 practice test q11

Correct Answer:

[2021.1] lead4pass dp-201 practice test q11-1

Box 1: No
Data used for Planning Assistance must be stored in a sharded Azure SQL Database.
Box 2: Yes
Box 3: Yes
The Planning Assistance database will include reports tracking the travel of a single vehicle.


QUESTION 12
What should you recommend using to secure sensitive customer contact information?
A. data labels
B. column-level security
C. row-level security
D. Transparent Data Encryption (TDE)
Correct Answer: B
Scenario: All cloud data must be encrypted at rest and in transit.
Always Encrypted is a feature designed to protect sensitive data stored in specific database columns from access (for
example, credit card numbers, national identification numbers, or data on a need to know basis). This includes database
administrators or other privileged users who are authorized to access the database to perform management tasks but
have no business need to access the particular data in the encrypted columns. The data is always encrypted, which
means the encrypted data is decrypted only for processing by client applications with access to the encryption key.
Incorrect Answers:
D: Transparent Data Encryption (TDE) encrypts SQL Server, Azure SQL Database, and Azure SQL Data Warehouse data files, known as encrypting data at rest. TDE does not provide encryption across communication channels.
References: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-overview

 

QUESTION 13
You need to design the unauthorized data usage detection system. What Azure service should you include in the
design?
A. Azure Analysis Services
B. Azure SQL Data Warehouse
C. Azure Databricks
D. Azure Data Factory
Correct Answer: B
SQL Database and SQL Data Warehouse
SQL threat detection identifies anomalous activities indicating unusual and potentially harmful attempts to access or
exploit databases.
Advanced Threat Protection for Azure SQL Database and SQL Data Warehouse detects anomalous activities indicating
unusual and potentially harmful attempts to access or exploit databases.
Scenario:
Requirements. Security
The solution must meet the following security requirements:
Unauthorized usage of data must be detected in real-time. Unauthorized usage is determined by looking for unusual
usage patterns.
Reference:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-threat-detection-overview


Fulldumps shares the latest updated Microsoft DP-201 exam exercise questions, DP-201 dumps PDF, and YouTube video learning for free.
All exam questions and answers come from the shared portion of the Lead4Pass exam dumps! Lead4Pass updates throughout the year and shares a portion of the exam questions for free to help you understand the exam content and enhance your exam experience!
Get the full Microsoft DP-201 exam dumps questions at https://www.leads4pass.com/dp-201.html (pdf&vce)

P.S.
Get free Microsoft DP-201 dumps PDF online: https://drive.google.com/file/d/1RbM5w1o-muSk75DYtTlx2J0JyoU00se1/