Azure Data Engineers design and implement the management, monitoring, security, and privacy of data using the full stack of Azure data services to satisfy business needs.

The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about the data platform technologies that exist on Microsoft Azure.

The secondary audience for this course is individuals who develop applications that deliver content from the data platform technologies that exist on Microsoft Azure.

During this course you will learn how to design and implement data platform technologies in solutions that align with business and technical requirements, covering on-premises, cloud, and hybrid scenarios that incorporate relational, NoSQL, and data warehouse data. You will also learn how to design processing architectures for both streaming and batch data using a range of technologies. In addition, you will explore how to design and implement data security, including authentication, authorization, data access, and data policies and standards.

Learn in a Classroom

  • Anytime access
  • Anywhere access to recorded lectures
  • Microsoft official training content
  • In-depth training
  • Hands-on labs
  • Industry case studies
  • Ask instructor questions in person
  • Attend live class in person
  • Attend live class remotely
  • Time commitment
Mentorstag Experience
LEARN BY DOING

Immersive training that combines theory, hands-on exercises, group discussions, assignments, and intensive Q&A sessions.

LIVE & INTERACTIVE

Ask questions, get clarifications, and engage in discussions with instructors and other participants.

MENTORED BY INDUSTRY EXPERTS

Get mentored by industry practitioners with more than 10 years of experience.

REASON BASED LEARNING

Go beyond purely theoretical or practical knowledge: understand the WHAT, WHY, and HOW of a subject, simplify the subject matter, and gain in-depth comprehension.

CODE REVIEW BY PROFESSIONALS

Get reviews and timely feedback on your assignments and projects from professional developers.

BUILD PROJECTS

We emphasize learning concepts through examples and help you build a portfolio of projects over the course of the training.

LIFETIME ENROLMENT

Enrol once and keep lifetime access to the course content and future updates.

CURRICULUM DESIGNED BY EXPERTS

The curriculum goes through multiple levels of design and preparation by experts to keep the topics and modules relevant as technology evolves.

STUDY EVEN FROM REMOTE LOCATIONS

Use collaborative tools to share ideas and improve your coding skills with assistance from the instructors and other participants.

Curriculum

Implement non-relational data stores
  • Implement a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage
  • Implement data distribution and partitions
  • Implement a consistency model in Cosmos DB (see the provisioning sketch after this module's topics)
  • Provision a non-relational data store
  • Provide access to data to meet security requirements
  • Implement for high availability, disaster recovery, and global distribution
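
As a taste of the hands-on work, here is a minimal sketch of provisioning a partitioned Cosmos DB container with Session consistency using the azure-cosmos Python SDK; the endpoint, key, database, container, and partition-key names are placeholders, not values from the course.

```python
from azure.cosmos import CosmosClient, PartitionKey

ENDPOINT = "https://<your-account>.documents.azure.com:443/"  # placeholder account URI
KEY = "<your-account-key>"                                    # placeholder primary key

# The client can only relax the account's default consistency, never strengthen it.
client = CosmosClient(ENDPOINT, credential=KEY, consistency_level="Session")

database = client.create_database_if_not_exists(id="retail")
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),  # pick a high-cardinality key
    offer_throughput=400,                            # provisioned RU/s for the container
)

container.upsert_item({"id": "1", "customerId": "c-42", "total": 18.50})
```
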
Implement relational data stores
  • Configure elastic pools
  • Configure geo-replication
  • Provide access to data to meet security requirements
  • Implement for high availability, disaster recovery, and global distribution
  • Implement data distribution and partitions for SQL Data Warehouse (sketched below)
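
A minimal sketch of hash-distributing a fact table in SQL Data Warehouse (now an Azure Synapse dedicated SQL pool) from Python via pyodbc; the server, database, credentials, and table are placeholders.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<your-server>.database.windows.net;DATABASE=<your-dw>;"
    "UID=<user>;PWD=<password>"
)

# Hash-distribute on a high-cardinality join key so rows for the same key
# land on the same distribution, minimizing data movement during joins.
conn.execute("""
CREATE TABLE dbo.FactSales
(
    SaleId     BIGINT NOT NULL,
    CustomerId INT    NOT NULL,
    Amount     DECIMAL(18, 2)
)
WITH
(
    DISTRIBUTION = HASH(CustomerId),
    CLUSTERED COLUMNSTORE INDEX
);
""")
conn.commit()
```
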
Manage data security
  • Implement data masking (sketched below)
  • Encrypt data at rest and in motion
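
A minimal sketch of dynamic data masking on Azure SQL Database via pyodbc; the connection details, table, and columns are illustrative.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<your-server>.database.windows.net;DATABASE=<your-db>;"
    "UID=<user>;PWD=<password>"
)

# Mask the email column for non-privileged users; privileged users see full values.
conn.execute(
    "ALTER TABLE dbo.Members ALTER COLUMN Email "
    "ADD MASKED WITH (FUNCTION = 'email()');"
)
# Expose only the last four digits of the phone number.
conn.execute(
    "ALTER TABLE dbo.Members ALTER COLUMN Phone "
    "ADD MASKED WITH (FUNCTION = 'partial(0,\"XXX-XXX-\",4)');"
)
conn.commit()
```
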
Develop batch processing solutions
  • Develop batch processing solutions by using Data Factory and Azure Databricks
  • Implement the integration runtime for Data Factory
  • Create linked services and datasets
  • Create pipelines and activities
  • Create and schedule triggers
  • Implement Azure Databricks clusters, notebooks, jobs, and autoscaling
  • Ingest data into Azure Databricks (see the sketch below)
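
A minimal PySpark sketch of ingesting CSV files from Data Lake Storage Gen2 into a Delta table on Azure Databricks; the storage account, container, path, and key are placeholders, and key-based auth is assumed purely for brevity (a service principal or credential passthrough is preferable in practice).

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Assumption: authenticating to ADLS Gen2 with an account key.
spark.conf.set(
    "fs.azure.account.key.<account>.dfs.core.windows.net",
    "<storage-account-key>",
)

raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@<account>.dfs.core.windows.net/sales/2024/*.csv"))

# Land the batch as a Delta table for downstream jobs.
raw.write.format("delta").mode("overwrite").saveAsTable("bronze_sales")
```
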
Develop streaming solutions
  • Configure input and output
  • Select the appropriate windowing functions (illustrated below)
  • Implement event processing using Stream Analytics
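
To make the windowing choice concrete, here is a pure-Python illustration of tumbling-window semantics, the fixed, contiguous, non-overlapping windows that Stream Analytics' TumblingWindow produces (hopping and sliding windows, by contrast, can overlap); the events are invented.

```python
from collections import defaultdict

events = [  # (timestamp_seconds, sensor_id, reading)
    (1, "A", 20.1), (4, "A", 20.4), (9, "B", 18.0),
    (11, "A", 21.0), (14, "B", 18.3), (27, "A", 22.5),
]

WINDOW = 10  # seconds; windows are fixed, contiguous, and non-overlapping

counts = defaultdict(int)
for ts, sensor, _ in events:
    window_start = (ts // WINDOW) * WINDOW  # each event belongs to exactly one window
    counts[(window_start, sensor)] += 1

for (start, sensor), n in sorted(counts.items()):
    print(f"window [{start}, {start + WINDOW}) sensor {sensor}: {n} events")
```
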
Monitor data storage
  • Monitor relational and non-relational data sources
  • Implement Blob storage monitoring
  • Implement Data Lake Store monitoring
  • Implement SQL Database monitoring
  • Implement SQL Data Warehouse monitoring
  • Implement Cosmos DB monitoring
  • Configure Azure Monitor alerts
  • Implement auditing by using Azure Log Analytics (see the query sketch below)
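
A minimal sketch of querying diagnostic data in Log Analytics with the azure-monitor-query and azure-identity packages; the workspace ID is a placeholder and the KQL query is one illustrative example, not taken from the course.

```python
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# KQL: count throttled (429) Cosmos DB requests in 5-minute bins.
query = """
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.DOCUMENTDB" and statusCode_s == "429"
| summarize throttled = count() by bin(TimeGenerated, 5m)
"""

response = client.query_workspace(
    workspace_id="<workspace-id>",  # placeholder Log Analytics workspace ID
    query=query,
    timespan=timedelta(hours=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```
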
Monitor data processing
  • Design and implement Data Factory monitoring
  • Monitor Azure Databricks
  • Monitor HDInsight processing
  • Monitor Stream Analytics
Optimize Azure data solutions
  • Troubleshoot data partitioning bottlenecks
  • Optimize Data Lake Storage
  • Optimize Stream Analytics
  • Optimize SQL Data Warehouse
  • Optimize SQL Database
  • Manage data life cycle
Recommend an Azure Data solution based on requirements
  • Choose the correct data storage solution to meet the technical and business requirements
  • Choose the partition distribution type
Design non-relational cloud data stores
  • Design data distribution and partitions
  • Design for scale, including multi-region, latency, and throughput
  • Design a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage
  • Select the appropriate Cosmos DB API
  • Design a disaster recovery strategy
  • Design for high availability
Design relational cloud data stores
  • Design data distribution and partitions
  • Design for scale, including multi-region, latency, and throughput
  • Design a solution that uses SQL Database and SQL Data Warehouse
  • Design a disaster recovery strategy
  • Design for high availability
Design batch processing solutions
  • Design batch processing solutions by using Data Factory and Azure Databricks
  • Identify the optimal data ingestion method for a batch processing solution
  • Identify where processing should take place, such as at the source, at the destination, or in transit
Design real-time processing solutions
  • Design for real-time processing by using Stream Analytics and Azure Databricks
  • Design and provision compute resources
Design security for source data access
  • Plan for secure endpoints
  • Choose the appropriate authentication mechanism, such as access keys, shared access signatures (SAS), and Azure Active Directory (Azure AD); generating a SAS is sketched below
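
A minimal sketch contrasting handing out the account key with issuing a scoped, short-lived SAS, using the azure-storage-blob package; the account, container, and blob names are placeholders.

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

sas_token = generate_blob_sas(
    account_name="<account>",
    container_name="reports",
    blob_name="q1.csv",
    account_key="<account-key>",                 # assumption: signing with the account key
    permission=BlobSasPermissions(read=True),    # grant read access only
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),  # short-lived token
)

# Hand this URL to the client instead of the account key itself.
url = f"https://<account>.blob.core.windows.net/reports/q1.csv?{sas_token}"
print(url)
```
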
Design security for data policies and standards
  • Design for data encryption for data at rest and in transit
  • Design for data auditing and data masking
  • Design for data privacy and data classification
  • Design a data retention policy
  • Plan an archiving strategy
  • Plan to purge data based on business requirements (see the lifecycle-policy sketch below)
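
A minimal sketch of a Blob storage lifecycle-management policy that encodes a tier-then-purge retention plan as JSON; the rule name, prefix, and day thresholds are illustrative. The policy can be applied through the portal, the Azure CLI, or the azure-mgmt-storage SDK.

```python
import json

policy = {
    "rules": [{
        "name": "retain-then-purge-logs",
        "enabled": True,
        "type": "Lifecycle",
        "definition": {
            "filters": {"blobTypes": ["blockBlob"], "prefixMatch": ["logs/"]},
            "actions": {
                "baseBlob": {
                    # Move to the cool tier after 30 days, archive after 180,
                    # and delete (purge) once the retention period expires.
                    "tierToCool":    {"daysAfterModificationGreaterThan": 30},
                    "tierToArchive": {"daysAfterModificationGreaterThan": 180},
                    "delete":        {"daysAfterModificationGreaterThan": 365},
                }
            },
        },
    }]
}
print(json.dumps(policy, indent=2))
```
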
ExpressRoute connections
  • ExpressRoute
  • ExpressRoute capabilities
  • ExpressRoute connections
  • Coexisting Site-to-Site and ExpressRoute connections
Lab and review questions
  • Lab: VNet peering and service chaining
  • Module review questions and knowledge check