This exam measures your ability to accomplish the following technical tasks: implement data storage solutions; manage and develop data processing; and monitor and optimize data solutions.
Study Areas
- Implement data storage solutions (40-45%)
- Manage and develop data processing (25-30%)
- Monitor and optimize data solutions (30-35%)
Mentorship Experience

Learn By Doing
Immersive training that combines theoretical learning with hands-on exercises, group discussions, assignments, and intensive Q&A sessions.

Live & Interactive
Ask questions, get clarifications, and engage in discussions with instructors and other participants.

Mentored By Industry Experts
Get mentored by industry practitioners with more than 10 years of experience.

Reason Based Learning
Go beyond purely theoretical or practical knowledge: understand the WHAT, WHY, and HOW of a subject, simplify the subject matter, and gain in-depth comprehension.
Contents
Module 01 - Implement non-relational data stores
- Implement a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage (see the sketch after this list)
- Implement data distribution and partitions
- Implement a consistency model in Cosmos DB
- Provision a non-relational data store
- Provide access to data to meet security requirements
- Implement for high availability, disaster recovery, and global distribution
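As a hands-on taste of this module, here is a minimal sketch of provisioning a Cosmos DB container with the azure-cosmos Python SDK, choosing a partition key and requesting Session consistency. The account URI and key are placeholders, and the database, container, and partition-key names ("retail", "orders", "/customerId") are made up for illustration.

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholder endpoint and key for an existing Cosmos DB (SQL API) account.
ACCOUNT_URI = "https://<your-account>.documents.azure.com:443/"
ACCOUNT_KEY = "<your-account-key>"

# Request Session consistency for this client; the account-level default
# consistency model is configured on the Cosmos DB account itself.
client = CosmosClient(ACCOUNT_URI, credential=ACCOUNT_KEY, consistency_level="Session")

# Create (or reuse) a database and a container partitioned on /customerId.
database = client.create_database_if_not_exists(id="retail")
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),
    offer_throughput=400,  # manual throughput in RU/s
)

# Writes and point reads are routed by the partition key value.
container.upsert_item({"id": "order-1", "customerId": "c-42", "total": 19.99})
item = container.read_item(item="order-1", partition_key="c-42")
print(item["total"])
```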
Module 02 - Implement relational data stores
- Configure elastic pools
- Configure geo-replication
- Provide access to data to meet security requirements
- Implement for high availability, disaster recovery, and global distribution
- Implement data distribution and partitions for Azure Synapse Analytics (see the sketch after this list)
- Implement PolyBase
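The sketch below illustrates the data-distribution topic: using pyodbc to create a hash-distributed fact table in a Synapse Analytics dedicated SQL pool. The connection string, table, and column names are placeholders, not a prescribed design.

```python
import pyodbc

# Placeholder ODBC connection string for a Synapse dedicated SQL pool.
CONNECTION_STRING = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-workspace>.sql.azuresynapse.net,1433;"
    "Database=<your-pool>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

conn = pyodbc.connect(CONNECTION_STRING, autocommit=True)

# Hash-distribute the fact table on the join key so rows for the same
# customer land on the same distribution, reducing data movement at query time.
conn.execute("""
CREATE TABLE dbo.FactSales
(
    SaleId     BIGINT        NOT NULL,
    CustomerId INT           NOT NULL,
    SaleDate   DATE          NOT NULL,
    Amount     DECIMAL(18,2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(CustomerId),
    CLUSTERED COLUMNSTORE INDEX
);
""")
conn.close()
```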
Module 03 - Manage data security
- Implement data masking (see the sketch after this list)
- Encrypt data at rest and in motion
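As a small sketch of dynamic data masking in Azure SQL Database, the statement below (sent via pyodbc) masks an email column so that users without UNMASK permission see a masked value instead of the real address. The connection string, table, and column names are illustrative assumptions.

```python
import pyodbc

# Placeholder ODBC connection string for an Azure SQL Database.
CONNECTION_STRING = "<odbc-connection-string-for-your-azure-sql-database>"

conn = pyodbc.connect(CONNECTION_STRING, autocommit=True)

# Apply the built-in email masking function to an existing column.
conn.execute("""
ALTER TABLE dbo.Customers
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
""")
conn.close()
```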
Module 04 - Develop batch processing solutions
- Develop batch processing solutions by using Data Factory and Azure Databricks
- Ingest data by using PolyBase
- Implement the integration runtime for Data Factory
- Implement Copy Activity within Azure Data Factory
- Create linked services and datasets
- Create pipelines and activities
- Implement Mapping Data Flows in Azure Data Factory
- Create and schedule triggers
- Implement Azure Databricks clusters, notebooks, jobs, and autoscaling
- Ingest data into Azure Databricks
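To give a flavour of the Databricks side of a batch pipeline, here is a minimal sketch that reads raw CSV files from Data Lake Storage Gen2 and writes them out as a Delta table. It assumes it runs in an Azure Databricks notebook where `spark` is already defined and the storage account is accessible; the paths and the `SaleId` column are placeholders.

```python
# Runs inside an Azure Databricks notebook, where `spark` (SparkSession) exists.
raw_path = "abfss://raw@<storageaccount>.dfs.core.windows.net/sales/2021/"
curated_path = "abfss://curated@<storageaccount>.dfs.core.windows.net/sales_delta/"

# Batch-read the raw CSV drop with header and schema inference.
df = (
    spark.read
    .format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load(raw_path)
)

# Basic cleanup, then persist as a Delta table for downstream consumers.
cleaned = df.dropDuplicates().na.drop(subset=["SaleId"])
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .save(curated_path)
)
```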
Module 05 - Develop streaming solutions
- Configure input and output
- Select the appropriate windowing functions
- Implement event processing by using Stream Analytics
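As a taste of the streaming module, the snippet below sends a few test events to the Event Hub that a Stream Analytics job would read as input, and includes a tumbling-window job query as a reference string. The connection string, hub name, device fields, and the input/output aliases in the query are all illustrative assumptions.

```python
import json
import time
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder Event Hubs connection details.
CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "telemetry"

# Example Stream Analytics query (for reference only): count events per device
# over a 60-second tumbling window. The aliases match whatever input/output
# names you configure on the job.
ASA_QUERY = """
SELECT DeviceId, COUNT(*) AS EventCount, System.Timestamp() AS WindowEnd
INTO [output-blob]
FROM [input-eventhub] TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(second, 60)
"""

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

with producer:
    batch = producer.create_batch()
    for i in range(10):
        event = {
            "DeviceId": f"dev-{i % 3}",
            "Temperature": 20 + i,
            "EventTime": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        }
        batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)
```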
Module 06 - Monitor data storage
- Monitor relational and non-relational data sources
- Implement Blob storage monitoring
- Implement Data Lake Storage monitoring
- Implement SQL Database monitoring
- Implement Azure Synapse Analytics monitoring
- Implement Cosmos DB monitoring
- Configure Azure Monitor alerts
- Implement auditing by using Azure Log Analytics
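To illustrate the monitoring topics, here is a sketch that queries a Log Analytics workspace with the azure-monitor-query SDK. The workspace ID and the Kusto query are assumptions, and the AzureMetrics table only contains data if the monitored resources' diagnostic settings route metrics to that workspace.

```python
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

client = LogsQueryClient(DefaultAzureCredential())

# Illustrative Kusto query: hourly averages of SQL Database metrics that were
# routed to the workspace via diagnostic settings.
query = """
AzureMetrics
| where ResourceProvider == "MICROSOFT.SQL"
| summarize AvgValue = avg(Average) by MetricName, bin(TimeGenerated, 1h)
| order by TimeGenerated desc
"""

response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))
for table in response.tables:
    for row in table.rows:
        print(row)
```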
Module 07 - Monitor data processing
- Monitor Data Factory pipelines (see the sketch after this list)
- Monitor Azure Databricks
- Monitor Stream Analytics
- Configure Azure Monitor alerts
- Implement auditing by using Azure Log Analytics
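The sketch below lists recent Data Factory pipeline runs with the azure-mgmt-datafactory SDK, one way to monitor pipelines outside the portal. The subscription, resource group, and factory names are placeholders.

```python
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Query pipeline runs from the last 24 hours and print their status.
now = datetime.now(timezone.utc)
filter_params = RunFilterParameters(
    last_updated_after=now - timedelta(days=1),
    last_updated_before=now,
)
runs = adf_client.pipeline_runs.query_by_factory(
    RESOURCE_GROUP, FACTORY_NAME, filter_params
)
for run in runs.value:
    print(run.pipeline_name, run.status, run.run_start, run.run_end)
```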
Module 08 - Optimize Azure data solutions
- Troubleshoot data partitioning bottlenecks
- Optimize Data Lake Storage
- Optimize Stream Analytics
- Optimize Azure Synapse Analytics
- Optimize SQL Database
- Manage the data lifecycle
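As one example from this module, the snippet below builds a Blob storage lifecycle-management policy that tiers aging data to Cool and eventually deletes it, then writes it to a JSON file. The rule name, prefix, and day thresholds are illustrative; the resulting file could be applied with the Azure CLI (`az storage account management-policy create --policy @policy.json ...`) or through the portal.

```python
import json

# Illustrative lifecycle rule: block blobs under "raw/" move to the Cool tier
# after 30 days without modification and are deleted after 365 days.
lifecycle_policy = {
    "rules": [
        {
            "enabled": True,
            "name": "tier-then-delete-raw-data",
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["raw/"],
                },
                "actions": {
                    "baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 30},
                        "delete": {"daysAfterModificationGreaterThan": 365},
                    }
                },
            },
        }
    ]
}

with open("policy.json", "w") as f:
    json.dump(lifecycle_policy, f, indent=2)
```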
Looking for customized training content? Please get in touch.
Contact Us