DP-200: Implementing an Azure Data Solution

Course Duration: 3 days
Course Fee: Contact Us
Training Mode: On-Demand


About this course:

In this course, DP-200: Implementing an Azure Data Solution, students will implement various data platform technologies into solutions that align with business and technical requirements, covering on-premises, cloud, and hybrid data scenarios and incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.

Students will also explore how to implement data security, including authentication, authorization, and data policies and standards. They will define and implement monitoring for both data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.

Course Outline:

Module 01: Azure for the Data Engineer

  • L01 – Explain the evolving world of data
  • L02 – Survey the services in the Azure Data Platform
  • L03 – Identify the tasks that are performed by a Data Engineer
  • L04 – Describe the use cases for the cloud in a Case Study

Module 02: Working with Data Storage

  • L01 – Choose a data storage approach in Azure
  • L02 – Create an Azure Storage Account
  • L03 – Explain Azure Data Lake Storage
  • L04 – Upload data into Azure Data Lake
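
As a taste of the "Upload data into Azure Data Lake" lesson, here is a minimal sketch (not part of the official courseware) that uses the azure-storage-file-datalake Python package; the storage account, key, file system, and file paths are placeholder values:

  from azure.storage.filedatalake import DataLakeServiceClient

  # Placeholder account details -- replace with your own storage account and key.
  ACCOUNT_URL = "https://mystorageacct.dfs.core.windows.net"
  ACCOUNT_KEY = "<storage-account-key>"

  # Connect to the ADLS Gen2 endpoint and pick an existing file system (container).
  service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=ACCOUNT_KEY)
  file_system = service.get_file_system_client("raw")

  # Upload a local CSV file to a folder path inside the file system.
  file_client = file_system.get_file_client("sales/2024/sales.csv")
  with open("sales.csv", "rb") as data:
      file_client.upload_data(data, overwrite=True)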

Module 03: Enabling Team Based Data Science with Azure Databricks

  • L01 – Explain Azure Databricks
  • L02 – Work with Azure Databricks
  • L03 – Read data with Azure Databricks
  • L04 – Perform transformations with Azure Databricks
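
To give a flavour of the reading and transformation lessons, here is a minimal PySpark sketch that is not drawn from the courseware; it assumes it runs inside a Databricks notebook where a SparkSession named spark is already available, and the mount paths are placeholders:

  from pyspark.sql.functions import col

  # Read a mounted CSV file into a DataFrame (path is a placeholder).
  df = spark.read.option("header", "true").csv("/mnt/raw/sales/sales.csv")

  # A simple transformation: drop rows with no amount and cast it to a number.
  cleaned = (df
             .filter(col("amount").isNotNull())
             .withColumn("amount", col("amount").cast("double")))

  # Write the result back out in Parquet format for downstream use.
  cleaned.write.mode("overwrite").parquet("/mnt/curated/sales")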

Module 04: Building Globally Distributed Databases with Cosmos DB

  • L01 – Create an Azure Cosmos DB database built to scale
  • L02 – Insert and query data in your Azure Cosmos DB database
  • L03 – Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
  • L04 – Distribute your data globally with Azure Cosmos DB
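
The sketch below, which is not part of the official labs, shows roughly what inserting and querying data looks like with the azure-cosmos Python package; the endpoint, key, and the database, container, and item values are placeholders:

  from azure.cosmos import CosmosClient, PartitionKey

  # Placeholder endpoint and key for a Cosmos DB account.
  client = CosmosClient("https://myaccount.documents.azure.com:443/",
                        credential="<primary-key>")

  # Create (or reuse) a database and a container partitioned on customerId.
  database = client.create_database_if_not_exists("retail")
  container = database.create_container_if_not_exists(
      id="orders",
      partition_key=PartitionKey(path="/customerId"),
      offer_throughput=400)

  # Insert an item, then query it back with the SQL API.
  container.upsert_item({"id": "order-1", "customerId": "c-100", "total": 42.50})
  for item in container.query_items(
          query="SELECT * FROM c WHERE c.customerId = 'c-100'",
          enable_cross_partition_query=True):
      print(item["id"], item["total"])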

Module 05: Working with Relational Data Stores in the Cloud

  • L01 – Explain SQL Database
  • L02 – Explain SQL Data Warehouse
  • L03 – Provision and load data in Azure SQL Data Warehouse
  • L04 – Import data into Azure SQL Data Warehouse using PolyBase
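
As an illustration of the PolyBase loading pattern, the sketch below (not taken from the courseware) uses the pyodbc package to run a CREATE TABLE AS SELECT against the data warehouse; it assumes an external table named ext.StageSales has already been defined over files in storage, and the server, database, and credentials are placeholders:

  import pyodbc

  # Placeholder connection details for an Azure SQL Data Warehouse.
  conn = pyodbc.connect(
      "DRIVER={ODBC Driver 17 for SQL Server};"
      "SERVER=myserver.database.windows.net;"
      "DATABASE=mydw;UID=loader;PWD=<password>",
      autocommit=True)

  # Load the staged external data (exposed through a PolyBase external table)
  # into a distributed, columnstore-backed internal table.
  conn.execute("""
      CREATE TABLE dbo.FactSales
      WITH (DISTRIBUTION = HASH(CustomerId), CLUSTERED COLUMNSTORE INDEX)
      AS SELECT * FROM ext.StageSales;
  """)
  conn.close()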

Module 06: Performing Real-Time Analytics with Stream Analytics

  • L01 – Explain data streams and event processing
  • L02 – Data Ingestion with Event Hubs
  • L03 – Processing Data with Stream Analytics Jobs
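
As a small taste of the Event Hubs ingestion lesson, here is a sketch, not from the courseware, that sends a batch of JSON events with the azure-eventhub Python package; the connection string and event hub name are placeholders:

  import json
  from azure.eventhub import EventHubProducerClient, EventData

  # Placeholder namespace connection string and event hub name.
  producer = EventHubProducerClient.from_connection_string(
      "<event-hubs-namespace-connection-string>", eventhub_name="telemetry")

  # Build a batch of JSON events and send it in one call.
  batch = producer.create_batch()
  for reading in ({"device": "d1", "temp": 21.3}, {"device": "d2", "temp": 19.8}):
      batch.add(EventData(json.dumps(reading)))
  producer.send_batch(batch)
  producer.close()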

Module 07: Orchestrating Data Movement with Azure Data Factory

  • L01 – Explain how Azure Data Factory works
  • L02 – Create Linked Services and Datasets
  • L03 – Create Pipelines and Activities
  • L04 – Azure Data Factory pipeline execution and triggers
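
To illustrate pipeline execution, the sketch below, which is not part of the labs, starts an on-demand pipeline run with the azure-mgmt-datafactory and azure-identity Python packages; the subscription, resource group, factory, and pipeline names are placeholders:

  from azure.identity import DefaultAzureCredential
  from azure.mgmt.datafactory import DataFactoryManagementClient

  # Placeholder Azure subscription and Data Factory details.
  adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

  # Trigger an on-demand run of an existing pipeline, then check its status.
  run = adf.pipelines.create_run("my-rg", "my-data-factory", "CopySalesPipeline",
                                 parameters={})
  status = adf.pipeline_runs.get("my-rg", "my-data-factory", run.run_id)
  print(status.status)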

Module 08: Securing Azure Data Platforms

  • L01 – Introduction to Security
  • L02 – Key Security Components
  • L03 – Securing Storage Accounts and Data Lake Storage
  • L04 – Securing Data Stores
  • L05 – Securing Streaming Data
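
As an example of the kind of control covered in the storage security lessons, the sketch below (not from the courseware) generates a short-lived, read-only SAS token for a single blob with the azure-storage-blob Python package; the account, key, container, and blob names are placeholders:

  from datetime import datetime, timedelta
  from azure.storage.blob import generate_blob_sas, BlobSasPermissions

  # Placeholder storage account details.
  ACCOUNT = "mystorageacct"
  KEY = "<storage-account-key>"

  # Grant read-only access to one blob for one hour instead of sharing the account key.
  sas_token = generate_blob_sas(
      account_name=ACCOUNT,
      container_name="raw",
      blob_name="sales/2024/sales.csv",
      account_key=KEY,
      permission=BlobSasPermissions(read=True),
      expiry=datetime.utcnow() + timedelta(hours=1))

  print(f"https://{ACCOUNT}.blob.core.windows.net/raw/sales/2024/sales.csv?{sas_token}")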

Module 09: Monitoring and Troubleshooting Data Storage and Processing

  • L01 – Explain the monitoring capabilities that are available
  • L02 – Troubleshoot common data storage issues
  • L03 – Troubleshoot common data processing issues
  • L04 – Manage disaster recovery
