
DP-200T01: Implementing an Azure Data Solution Associate (Data Engineer) Training


What is the DP-200T01: Implementing an Azure Data Solution Associate (Data Engineer) training all about?

DP-200T01: Implementing an Azure Data Solution Associate (Data Engineer) Training provides individuals with comprehensive, in-depth coverage of ingesting, transforming, and egressing data from multiple sources using different tools and services. This technical course teaches you to design and implement the management, monitoring, privacy, and security of data using the full stack of Azure services. It is suitable for Azure data engineers who want to improve their skills and build credibility. The course covers working with data storage and enabling team-based data science with Azure Databricks. By the end of the course, you will know how to build globally distributed databases with Cosmos DB and perform real-time analytics with Stream Analytics.

This training is designed based on the objectives of the course variant DP-200T01-A.

Schedule
  • Nov 23, 2020 | 9:00 am - 5:00 pm EST | Delivery Format: Online | $1725 USD
  • Dec 02, 2020 | 9:00 am - 5:30 pm EST | Delivery Format: Online | $1725 USD
  • Dec 08, 2020 | 9:00 am - 5:30 pm EST | Delivery Format: Online | $1725 USD
  • Dec 16, 2020 | 9:00 am - 5:30 pm EST | Delivery Format: Online | $1725 USD
  • Dec 21, 2020 | 9:00 am - 5:30 pm EST | Delivery Format: Online | $1725 USD
  • Dec 28, 2020 | 9:00 am - 5:30 pm EST | Delivery Format: Online | $1725 USD
What are the course objectives for DP-200T01: Implementing an Azure Data Solution Associate (Data Engineer) training?
  • Working with Data Storage
  • Enabling Team-Based Data Science with Azure Databricks
  • Building Globally Distributed Databases with Cosmos DB
  • Working with Relational Data Stores in the Cloud
  • Performing Real-Time Analytics with Stream Analytics
  • Orchestrating Data Movement with Azure Data Factory
  • Securing Azure Data Platforms
  • Monitoring and Troubleshooting Data Storage and Processing
  • Integrating and Optimizing Data Platforms
Who should attend DP-200T01: Implementing an Azure Data Solution Associate (Data Engineer) training?

The Azure Data Solution training is aimed at business intelligence professionals, data professionals, and data architects who want to learn about the data platform technologies that exist on Microsoft Azure. It also suits individuals who develop applications that deliver content from those data platform technologies.

What are the prerequisites for DP-200T01: Implementing an Azure Data Solution Associate (Data Engineer) training?

You must have the Microsoft Azure Fundamentals (AZ-900) certification to take the DP-200 exam, and at least one year of experience as a data professional, business intelligence professional, or data architect.

What is the course outline for DP-200T01: Implementing an Azure Data Solution Associate (Data Engineer) training?
  • 1. Azure for the Data Engineer
  • This module explores how the world of data has evolved and how cloud data platform technologies are providing new opportunities for businesses to explore their data in different ways. Students will gain an overview of the various data platform technologies that are available, and learn how a Data Engineer's role and responsibilities have evolved to work in this new world to an organization's benefit.

  • Explain the evolving world of data
  • Survey the services in the Azure Data Platform
  • Identify the tasks that are performed by a Data Engineer
  • Describe the use cases for the cloud in a Case Study
  • Lab: Azure for the Data Engineer

  • Identify the evolving world of data
  • Determine the Azure Data Platform Services
  • Identify tasks to be performed by a Data Engineer
  • Finalize the data engineering deliverables
  • After completing this module, students will be able to:

  • Explain the evolving world of data
  • Survey the services in the Azure Data Platform
  • Identify the tasks that are performed by a Data Engineer
  • Describe the use cases for the cloud in a Case Study
  • 2. Working with Data Storage
  • This module teaches the variety of ways to store data in Azure. Students will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for the data they want to store in the cloud. They will also understand how Data Lake storage can be created to support a wide variety of big data analytics solutions with minimal effort.

  • Choose a data storage approach in Azure
  • Create an Azure Storage Account
  • Explain Azure Data Lake storage
  • Upload data into Azure Data Lake
  • Lab: Working with Data Storage

  • Choose a data storage approach in Azure
  • Create a Storage Account
  • Explain Data Lake Storage
  • Upload data into Data Lake Store
  • After completing this module, students will be able to:

  • Choose a data storage approach in Azure
  • Create an Azure Storage Account
  • Explain Azure Data Lake Storage
  • Upload data into Azure Data Lake
  • 3. Enabling Team Based Data Science with Azure Databricks
  • This module introduces students to Azure Databricks and how a Data Engineer works with it to enable an organization to perform team data science projects. They will learn the fundamentals of Azure Databricks and Apache Spark notebooks, how to provision the service and workspaces, and how to perform data preparation tasks that contribute to a data science project.

  • Explain Azure Databricks
  • Work with Azure Databricks
  • Read data with Azure Databricks
  • Perform transformations with Azure Databricks
  • Lab: Enabling Team Based Data Science with Azure Databricks

  • Explain Azure Databricks
  • Work with Azure Databricks
  • Read data with Azure Databricks
  • Perform transformations with Azure Databricks
  • After completing this module, students will be able to:

  • Explain Azure Databricks
  • Work with Azure Databricks
  • Read data with Azure Databricks
  • Perform transformations with Azure Databricks
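To make the "perform transformations" objective concrete: a Databricks notebook typically reads records, filters them, and aggregates the result. The sketch below shows that same read-filter-aggregate pattern in plain Python (the record fields and values are invented for illustration; a real notebook would express this with Spark DataFrame operations such as `filter` and `groupBy`).

```python
# Illustrative sketch only: the read -> filter -> aggregate pattern an
# Azure Databricks (Apache Spark) notebook typically expresses.
# Pure-Python stand-in; field names ("region", "status", "amount") are made up.
from collections import defaultdict

def transform(records):
    """Keep completed orders and total the revenue per region."""
    totals = defaultdict(float)
    for rec in records:                           # "read" each record
        if rec["status"] == "completed":          # "filter" transformation
            totals[rec["region"]] += rec["amount"]  # "aggregate" step
    return dict(totals)

orders = [
    {"region": "east", "status": "completed", "amount": 10.0},
    {"region": "east", "status": "pending",   "amount": 99.0},
    {"region": "west", "status": "completed", "amount": 5.0},
]
print(transform(orders))  # {'east': 10.0, 'west': 5.0}
```

In an actual Databricks notebook, roughly the same logic would read as `df.filter(df.status == "completed").groupBy("region").sum("amount")`.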
  • 4. Building Globally Distributed Databases with Cosmos DB
  • In this module, students will learn how to work with NoSQL data using Azure Cosmos DB. They will learn how to provision the service, and how to load and interrogate data in it using Visual Studio Code extensions and the Azure Cosmos DB .NET Core SDK. They will also learn how to configure the availability options so that users can access the data from anywhere in the world.

  • Create an Azure Cosmos DB database built to scale
  • Insert and query data in your Azure Cosmos DB database
  • Build a .NET Core app for Cosmos DB in Visual Studio Code
  • Distribute your data globally with Azure Cosmos DB
  • Lab: Building Globally Distributed Databases with Cosmos DB

  • Create an Azure Cosmos DB
  • Insert and query data in Azure Cosmos DB
  • Distribute data globally with Azure Cosmos DB
  • After completing this module, students will be able to:

  • Create an Azure Cosmos DB database built to scale
  • Insert and query data in your Azure Cosmos DB database
  • Distribute your data globally with Azure Cosmos DB
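A "database built to scale" in Cosmos DB rests on partitioning: items are routed to partitions by a hash of their partition key, so all items sharing a key stay together. The toy router below illustrates that idea only; the partition count and key values are invented, and Cosmos DB manages its physical partitions itself.

```python
# Conceptual sketch (not the Cosmos DB SDK): Azure Cosmos DB scales by
# hash-partitioning items on their partition key. Partition count here
# is arbitrary for illustration.
import hashlib

NUM_PARTITIONS = 4  # illustrative; the real service sizes partitions itself

def partition_for(partition_key: str) -> int:
    """Map a partition key to a partition via a stable hash."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % NUM_PARTITIONS

# Items sharing a partition key always land on the same partition, so a
# query scoped to one key can be served by a single partition (cheaply).
assert partition_for("customer-42") == partition_for("customer-42")
print(partition_for("customer-42"), partition_for("customer-7"))
```

This is why choosing a high-cardinality partition key matters: it spreads load evenly while keeping per-key queries single-partition.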
  • 5. Working with Relational Data Stores in the Cloud
  • In this module, students will explore the Azure relational data platform options, including SQL Database and SQL Data Warehouse. Students will be able to explain why they would choose one service over another, and how to provision, connect to, and manage each of the services.

  • Use Azure SQL Database
  • Describe Azure SQL Data Warehouse
  • Creating and Querying an Azure SQL Data Warehouse
  • Use PolyBase to Load Data into Azure SQL Data Warehouse
  • Lab: Working with Relational Data Stores in the Cloud

  • Use Azure SQL Database
  • Describe Azure SQL Data Warehouse
  • Creating and Querying an Azure SQL Data Warehouse
  • Use PolyBase to Load Data into Azure SQL Data Warehouse
  • After completing this module, students will be able to:

  • Use Azure SQL Database
  • Describe Azure SQL Data Warehouse
  • Create and query an Azure SQL Data Warehouse
  • Use PolyBase to load data into Azure SQL Data Warehouse
  • 6. Performing Real-Time Analytics with Stream Analytics
  • In this module, students will learn the concepts of event processing and streaming data and how they apply to Event Hubs and Azure Stream Analytics. Students will then set up a Stream Analytics job to stream data and learn how to query the incoming data to perform analysis. Finally, they will learn how to manage and monitor running jobs.

  • Explain data streams and event processing
  • Data Ingestion with Event Hubs
  • Processing Data with Stream Analytics Jobs
  • Lab: Performing Real-Time Analytics with Stream Analytics

  • Explain data streams and event processing
  • Data Ingestion with Event Hubs
  • Processing Data with Stream Analytics Jobs
  • After completing this module, students will be able to:

  • Explain data streams and event processing
  • Ingest data with Event Hubs
  • Process data with Stream Analytics jobs
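The core Stream Analytics technique the module builds toward is windowed aggregation: bucketing a stream of timestamped events into fixed, non-overlapping windows and aggregating each one. The sketch below shows that tumbling-window logic in plain Python; the event shape and 10-second window size are invented for the example.

```python
# Illustrative sketch of what a Stream Analytics tumbling-window query does:
# assign each event to a fixed-size, non-overlapping time window and
# aggregate per window. Timestamps are plain integers for simplicity.
from collections import defaultdict

def tumbling_count(events, window_seconds):
    """Count events per window, keyed by the window's start time."""
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

events = [(1, "a"), (4, "b"), (9, "c"), (12, "d"), (27, "e")]
print(tumbling_count(events, 10))  # {0: 3, 10: 1, 20: 1}
```

This is roughly what a Stream Analytics query with a `GROUP BY TumblingWindow(second, 10)` clause computes over an Event Hubs input, with the service handling arrival order and late events for you.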
  • 7. Orchestrating Data Movement with Azure Data Factory
  • In this module, students will learn how Azure Data Factory can be used to orchestrate data movement and transformation from a wide range of data platform technologies. They will be able to explain the capabilities of the technology and set up an end-to-end data pipeline that ingests and transforms data.

  • Explain how Azure Data Factory works
  • Azure Data Factory Components
  • Azure Data Factory and Databricks
  • Lab: Orchestrating Data Movement with Azure Data Factory

  • Explain how Azure Data Factory works
  • Azure Data Factory Components
  • Azure Data Factory and Databricks
  • After completing this module, students will be able to:

  • Azure Data Factory and Databricks
  • Azure Data Factory Components
  • Explain how Azure Data Factory works
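The orchestration idea behind a Data Factory pipeline is a set of activities linked by depends-on edges, where each activity runs once its predecessors succeed. The sketch below imitates that scheduling in plain Python; the activity names (`copy_raw`, `transform`, `load_dw`) are invented for illustration, not Data Factory API names.

```python
# Conceptual sketch: an Azure Data Factory pipeline is a set of activities
# with "depends-on" edges; the service runs each activity once all of its
# predecessors have succeeded. Activity names here are made up.
def run_pipeline(activities):
    """activities: {name: (dependencies, callable)}. Returns the run order."""
    done, order = set(), []
    while len(done) < len(activities):
        progressed = False
        for name, (deps, action) in activities.items():
            if name not in done and all(d in done for d in deps):
                action()          # run the activity
                done.add(name)
                order.append(name)
                progressed = True
        if not progressed:
            raise ValueError("cycle in pipeline dependencies")
    return order

log = []
pipeline = {
    "copy_raw":  ([], lambda: log.append("copied")),
    "transform": (["copy_raw"], lambda: log.append("transformed")),
    "load_dw":   (["transform"], lambda: log.append("loaded")),
}
print(run_pipeline(pipeline))  # ['copy_raw', 'transform', 'load_dw']
```

In the real service the same structure is declared as pipeline JSON, and a copy activity or a Databricks notebook activity plays the role of each callable.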
  • 8. Securing Azure Data Platforms
  • In this module, students will learn how Azure provides a multi-layered security model to protect your data. Students will explore how security ranges from setting up secure networks and access keys to defining permissions and monitoring across a range of data stores.

  • An introduction to security
  • Key security components
  • Securing Storage Accounts and Data Lake Storage
  • Securing Data Stores
  • Securing Streaming Data
  • Lab: Securing Azure Data Platforms

  • An introduction to security
  • Key security components
  • Securing Storage Accounts and Data Lake Storage
  • Securing Data Stores
  • Securing Streaming Data
  • After completing this module, students will be able to:

  • An introduction to security
  • Key security components
  • Securing Storage Accounts and Data Lake Storage
  • Securing Data Stores
  • Securing Streaming Data
  • 9. Monitoring and Troubleshooting Data Storage and Processing
  • In this module, students will get an overview of the range of monitoring capabilities available to provide operational support should issues arise in a data platform architecture. They will explore common data storage and data processing issues. Finally, disaster recovery options are covered to ensure business continuity.

  • Explain the monitoring capabilities that are available
  • Troubleshoot common data storage issues
  • Troubleshoot common data processing issues
  • Manage disaster recovery
  • Lab: Monitoring and Troubleshooting Data Storage and Processing

  • Explain the monitoring capabilities that are available
  • Troubleshoot common data storage issues
  • Troubleshoot common data processing issues
  • Manage disaster recovery
  • After completing this module, students will be able to:

  • Explain the monitoring capabilities that are available
  • Troubleshoot common data storage issues
  • Troubleshoot common data processing issues
  • Manage disaster recovery
FAQs
  • Single-choice questions based on a scenario
  • Multiple-choice questions
  • Arrange-in-the-correct-sequence questions
  • Case studies with multiple questions
  • There will be at least three questions in a sequence with Yes/No answers; you can't skip these questions
  • You need to score 700 out of 1000 to pass the DP-200 exam. The total duration of the DP-200 exam is 210 minutes, of which 180 minutes are reserved for answering the 46 questions.

    According to Microsoft, the skills measured in the DP-200 exam are weighted as follows:

  • Implement data storage solutions (40-45%)
  • Manage and develop data processing (25-30%)
  • Monitor and optimize data solutions (30-35%)
  • Our highly experienced, Microsoft-certified Azure trainers cover the following topics to help you master optimizing Azure data solutions:

  • Troubleshoot data partitioning bottlenecks
  • Optimize data lake storage
  • Optimize Stream Analytics
  • Optimize SQL data warehouse and database
  • Manage the data life cycle
  • 3 Days | $1725 USD
    4.7 (274 Ratings) | 1741 Learners