• 40 hours of live sessions

  • Industry case studies

  • 200-level training

This exam measures your ability to accomplish the following technical tasks: implement data storage solutions; manage and develop data processing; and monitor and optimize data solutions.

Study Areas

  • Implement data storage solutions (40-45%)
  • Manage and develop data processing (25-30%)
  • Monitor and optimize data solutions (30-35%)

Try our Azure Practice Labs

  • 15th May to 16th June
  • 2 PM to 6 PM (4 hours per day)
  • FREE (Limited Registrations)
  • Microsoft Official Courseware
  • Proven Certification Plan
  • 1-on-1 Mentor Call

Mentorstag Experience

Learn By Doing

Immersive training that combines theoretical learning, hands-on exercises, group discussions, assignments, and intensive Q&A sessions.

Live & Interactive

Ask questions, get clarifications, and engage in discussions with instructors and other participants.

Mentored By Industry Experts

Get mentored by industry practitioners with more than 10 years of experience.

Reason-Based Learning

Don’t settle for purely theoretical or practical knowledge. Understand the WHAT, WHY, and HOW of a subject. Simplify the subject matter and gain in-depth comprehension.

Contents

Implement Data Storage Solutions

  • Implement a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage
  • Implement data distribution and partitions
  • Implement a consistency model in Cosmos DB (see the sketch after this list)
  • Provision a non-relational data store
  • Provide access to data to meet security requirements
  • Implement for high availability, disaster recovery, and global distribution
  • Configure elastic pools
  • Configure geo-replication
  • Implement data distribution and partitions for Azure Synapse Analytics
  • Implement PolyBase
  • Implement data masking
  • Encrypt data at rest and in motion
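
To illustrate the provisioning, partitioning, and consistency topics above, here is a minimal sketch using the azure-cosmos Python SDK. The endpoint, key, database, container, and item values are placeholders for illustration, not part of the course material.

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholder endpoint and key: substitute your own account values.
client = CosmosClient(
    "https://<account>.documents.azure.com:443/",
    credential="<primary-key>",
    consistency_level="Session",  # one of the five Cosmos DB consistency models
)

# Provision a database and a container partitioned on /deviceId,
# which controls how Cosmos DB distributes the data.
db = client.create_database_if_not_exists("telemetry")
container = db.create_container_if_not_exists(
    id="readings",
    partition_key=PartitionKey(path="/deviceId"),
    offer_throughput=400,  # manual throughput in RU/s
)

# Write a sample item; "id" plus the partition key uniquely identify it.
container.upsert_item({"id": "r1", "deviceId": "dev-42", "temperature": 21.5})
```
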
Manage and Develop Data Processing

  • Develop batch processing solutions by using Data Factory and Azure Databricks
  • Ingest data by using PolyBase
  • Implement the integration runtime for Data Factory
  • Implement Copy Activity within Azure Data Factory
  • Create linked services and datasets
  • Create pipelines and activities
  • Implement Mapping Data Flows in Azure Data Factory
  • Create and schedule triggers
  • Implement Azure Databricks clusters, notebooks, jobs, and autoscaling
  • Ingest data into Azure Databricks (see the sketch after this list)
  • Configure input and output
  • Select the appropriate windowing functions
  • Implement event processing by using Stream Analytics
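
To give a flavor of the Databricks ingestion topic, here is a minimal PySpark sketch that reads raw CSV files from Data Lake Storage Gen2 and persists them as a Delta table. The storage account, container, path, and column names are hypothetical, and `spark` is the SparkSession a Databricks notebook provides.

```python
# Read raw CSV files from ADLS Gen2 into a DataFrame.
df = (
    spark.read
    .format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load("abfss://raw@<storage-account>.dfs.core.windows.net/events/")
)

# Basic shaping before landing the data in the curated zone
# ("eventId" is an assumed column name).
cleaned = df.dropDuplicates().na.drop(subset=["eventId"])

# Persist as Delta so downstream jobs get ACID reads and writes.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@<storage-account>.dfs.core.windows.net/events_delta/")
)
```
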
Monitor and Optimize Data Solutions

  • Monitor relational and non-relational data sources
  • Implement Blob storage monitoring
  • Implement Data Lake Storage monitoring
  • Implement SQL Database monitoring
  • Implement Azure Synapse Analytics monitoring
  • Implement Cosmos DB monitoring
  • Configure Azure Monitor alerts
  • Implement auditing by using Azure Log Analytics (see the sketch after this list)
  • Monitor Data Factory pipelines
  • Monitor Azure Databricks
  • Monitor Stream Analytics
  • Troubleshoot data partitioning bottlenecks
  • Optimize Data Lake Storage
  • Optimize Stream Analytics
  • Optimize Azure Synapse Analytics
  • Optimize SQL Database
  • Manage the data lifecycle
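
As a taste of the auditing topic, here is a minimal sketch that runs a Kusto query against a Log Analytics workspace using the azure-monitor-query Python SDK. The workspace ID, diagnostic category, and lookback window are placeholder assumptions.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

# Authenticate with whatever credential the environment provides
# (CLI login, managed identity, service principal, ...).
client = LogsQueryClient(DefaultAzureCredential())

# Hypothetical audit query: recent SQL security audit events
# routed to the workspace via diagnostic settings.
query = """
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| project TimeGenerated, Resource, OperationName
| take 20
"""

response = client.query_workspace(
    workspace_id="<workspace-id>",  # placeholder
    query=query,
    timespan=timedelta(days=1),     # last 24 hours
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(row)
```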

DP-200 Virtual Training

  • 19th April to 20th May
  • 2 PM to 6 PM (4 hours per day)
  • FREE (Limited Registrations)
  • Microsoft Official Courseware
  • Proven Certification Plan
View Schedules

Looking for customized training content? Get in touch.

Contact Us