
DP-750T00: Implement data engineering solutions using Azure Databricks Training

DP-750 covers end-to-end data engineering with Azure Databricks and Unity Catalog. You'll learn to design, build, secure, and optimize lakehouse solutions that work at enterprise scale. It's a practical course focused on the skills data engineers actually use on the job.

📘 Azure 🎓 Certification: YES 👥 256 Enrolled ⏱️ 4 Days 💼 Intermediate Level ⭐ 4.9 | 39 Reviews

Why Microtek Learning?

500+

Courses

10+ Years

Experience

95K+

Global Learners

Virtual Instructor-Led Training

$2199

Course Overview

This course teaches data engineers how to build and manage modern data platforms on Azure Databricks. You'll start with environment setup and architecture fundamentals, then move into implementing governance with Unity Catalog, building ingestion and transformation pipelines, and deploying production workloads. It follows the natural progression of how you'd build a data platform from scratch.

Mode of Training

🏫 Classroom 💻 Live Online 🧪 Blended 👨‍👩‍👧‍👦 Private Group

What you will learn

  • Configure and manage Azure Databricks environments and compute resources
  • Implement data governance and security using Unity Catalog
  • Design efficient data models for lakehouse architectures
  • Build scalable batch and streaming data ingestion pipelines
  • Transform, validate, and load data into analytics-ready formats
  • Deploy, monitor, and optimize production-grade data pipelines

Who Should Attend This Course?

  • Data Engineer
  • Cloud Engineer
  • Data Scientist

Prerequisites

Required

  • Basic understanding of data analytics concepts
  • Familiarity with SQL
  • Experience with Python (notebooks)
  • Understanding of data organization principles

Recommended

  • Knowledge of Azure cloud fundamentals
  • Familiarity with Azure Databricks workspace
  • Understanding of data engineering and data warehouse concepts
  • Basic knowledge of Microsoft Entra ID and Azure security
  • Familiarity with Git version control

📞 Talk to a Learning Advisor


📘 DP-750T00: Implement data engineering solutions using Azure Databricks Outline

Explore Azure Databricks

  • Introduction
  • Get started with Azure Databricks
  • Identify Azure Databricks workloads
  • Understand key concepts
  • Data governance using Unity Catalog and Microsoft Purview
  • Exercise - Explore Azure Databricks

Understand Azure Databricks Architecture

  • Introduction
  • Understand Azure Databricks architecture
  • Understand Unity Catalog managed storage
  • Understand external storage
  • Understand default storage (serverless compute)

Understand Azure Databricks Integrations

  • Introduction
  • Integration with Microsoft Fabric
  • Integration with Power BI
  • Integration with Visual Studio Code
  • Integration with Power Platform
  • Integration with Copilot Studio
  • Integration with Microsoft Purview
  • Integration with Microsoft Foundry

Select and Configure Compute in Azure Databricks

  • Introduction
  • Choose appropriate compute type
  • Configure compute performance
  • Configure compute features
  • Install libraries
  • Configure compute access
  • Exercise - Select and Configure Compute

Create and Organize Objects in Unity Catalog

  • Introduction
  • Apply naming conventions
  • Create catalog
  • Create schema
  • Create tables and views
  • Create volumes
  • Implement DDL operations
  • Implement foreign catalog
  • Configure AI/BI Genie instructions
  • Exercise - Create and Organize Objects
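Unity Catalog addresses every object through a three-level namespace (`catalog.schema.table`), and the module above pairs that with naming conventions. As a rough sketch of what "apply naming conventions" can mean in practice, a helper like the following (the snake_case rule and all names are illustrative assumptions, not Unity Catalog requirements) builds and validates fully qualified identifiers:

```python
import re

# Assumed convention: lowercase snake_case starting with a letter.
# This is an example house rule, not a Unity Catalog restriction.
NAME_PATTERN = re.compile(r"^[a-z][a-z0-9_]*$")

def qualified_name(catalog: str, schema: str, table: str) -> str:
    """Build a three-level Unity Catalog identifier, enforcing the convention."""
    parts = (catalog, schema, table)
    for part in parts:
        if not NAME_PATTERN.match(part):
            raise ValueError(f"name {part!r} violates the snake_case convention")
    return ".".join(parts)

print(qualified_name("sales_prod", "bronze", "orders_raw"))
# -> sales_prod.bronze.orders_raw
```

In a notebook, the resulting string would be used directly in DDL such as `CREATE TABLE sales_prod.bronze.orders_raw (...)`.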

Secure Unity Catalog Objects

  • Introduction
  • Understand query lifecycle
  • Implement access control strategies
  • Understand fine-grained access control
  • Implement row filtering and column masking
  • Access Azure Key Vault secrets
  • Authenticate data access with service principals
  • Authenticate resource access with managed identities
  • Exercise - Secure Unity Catalog Objects
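In Unity Catalog, column masking is implemented with SQL functions attached to a column, typically branching on group membership. The masking logic itself can be sketched in plain Python (the email format and privilege flag here are illustrative assumptions; a real mask would call something like `is_account_group_member(...)` in SQL):

```python
def mask_email(value: str, is_privileged: bool) -> str:
    """Return the raw value for privileged readers, a redacted form otherwise.

    Mirrors the shape of a Unity Catalog column-mask function, which would
    branch on group membership in SQL rather than take a boolean flag.
    """
    if is_privileged:
        return value
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain

print(mask_email("alice@example.com", is_privileged=False))  # a***@example.com
print(mask_email("alice@example.com", is_privileged=True))   # alice@example.com
```

Row filtering follows the same pattern: a predicate function attached to the table decides, per row and per caller, whether the row is visible at all.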

Govern Unity Catalog Objects

  • Introduction
  • Create and preserve table definitions
  • Configure ABAC with tags and policies
  • Apply data retention policies
  • Set up and manage data lineage
  • Configure audit logging
  • Design secure Delta Sharing strategy
  • Exercise - Govern Unity Catalog Objects

Design and Implement Data Modeling with Azure Databricks

  • Introduction
  • Design ingestion logic and data source configuration
  • Choose a data ingestion tool
  • Choose a data table format
  • Design and implement a data partitioning scheme
  • Choose a slowly changing dimension (SCD) type
  • Implement a slowly changing dimension (SCD) type 2
  • Design and implement a temporal (history) table
  • Choose granularity based on requirements
  • Choose managed vs unmanaged tables
  • Design and implement a clustering strategy
  • Exercise - Design and Implement Data Modeling
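SCD type 2 preserves history by closing out the current dimension row and inserting a new current version. On Databricks this is usually a Delta `MERGE`; the row-versioning logic can be sketched in plain Python (column names like `customer_id` and `city` are illustrative):

```python
from datetime import date

def apply_scd2(dim_rows, change, today):
    """Close the current row for a changed key and append a new current row.

    dim_rows: list of dicts with customer_id, city, start_date,
    end_date (None marks the current version). Names are illustrative.
    """
    for row in dim_rows:
        if row["customer_id"] == change["customer_id"] and row["end_date"] is None:
            if row["city"] == change["city"]:
                return dim_rows          # attribute unchanged: nothing to do
            row["end_date"] = today      # close out the old version
    dim_rows.append({"customer_id": change["customer_id"],
                     "city": change["city"],
                     "start_date": today,
                     "end_date": None})  # new current version
    return dim_rows

dim = [{"customer_id": 1, "city": "Oslo",
        "start_date": date(2023, 1, 1), "end_date": None}]
dim = apply_scd2(dim, {"customer_id": 1, "city": "Bergen"}, date(2024, 6, 1))
# dim now holds two versions: the closed Oslo row and the current Bergen row
```

The same close-then-insert shape maps onto a Delta `MERGE` with a `WHEN MATCHED ... UPDATE` clause for the expiry and an insert for the new version.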

Ingest Data into Unity Catalog

  • Introduction
  • Ingest data with Lakeflow Connect
  • Ingest data with notebooks
  • Ingest data with SQL methods
  • Ingest data with CDC feed
  • Ingest data with Spark Structured Streaming
  • Ingest data with Auto Loader
  • Ingest data with Lakeflow Spark Declarative Pipelines
  • Exercise - Ingest Data
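The ingestion tools above (Auto Loader, CDC feeds, Structured Streaming) share one core idea: track how far you have read so each batch pulls only new data. That high-watermark bookkeeping, which those services manage for you, can be sketched in plain Python (the `offset` field and state dict are illustrative assumptions):

```python
def incremental_ingest(source_rows, state):
    """Return only rows newer than the stored high-watermark, then advance it.

    A simplified stand-in for the checkpointing that Auto Loader and
    Structured Streaming perform automatically.
    """
    watermark = state.get("watermark", -1)
    new_rows = [r for r in source_rows if r["offset"] > watermark]
    if new_rows:
        state["watermark"] = max(r["offset"] for r in new_rows)
    return new_rows

state = {}
batch1 = incremental_ingest([{"offset": 0, "v": "a"}, {"offset": 1, "v": "b"}], state)
batch2 = incremental_ingest([{"offset": 0, "v": "a"}, {"offset": 1, "v": "b"},
                             {"offset": 2, "v": "c"}], state)
print(len(batch1), len(batch2))  # 2 1  -- only the unseen row in the second batch
```

In the real services, the "state" lives in a checkpoint location, so a restarted stream resumes exactly where it left off.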

Cleanse, Transform, and Load Data into Unity Catalog

  • Introduction
  • Profile data
  • Choose column data types
  • Resolve duplicates and nulls
  • Transform data with filters and aggregations
  • Transform data with joins and set operations
  • Transform data with denormalization and pivots
  • Load data with merge, insert, and append
  • Exercise - Cleanse, Transform, and Load Data
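On Databricks the load step above usually ends in a Delta `MERGE`, `INSERT`, or append, fed by deduplicated and null-cleaned data. The dedup-and-null logic can be sketched in plain Python (the `id` / `updated_at` columns and latest-wins rule are illustrative assumptions):

```python
def cleanse(records):
    """Drop rows with a null key, then keep only the latest row per key.

    records: list of dicts with 'id', 'updated_at', and payload fields
    (illustrative names).
    """
    latest = {}
    for rec in records:
        if rec["id"] is None:           # resolve nulls: reject rows missing a key
            continue
        prev = latest.get(rec["id"])
        if prev is None or rec["updated_at"] > prev["updated_at"]:
            latest[rec["id"]] = rec     # resolve duplicates: latest wins
    return list(latest.values())

rows = [
    {"id": 1, "updated_at": 1, "amount": 10},
    {"id": 1, "updated_at": 2, "amount": 12},   # newer duplicate of id 1
    {"id": None, "updated_at": 3, "amount": 99},  # null key, dropped
]
print(cleanse(rows))  # [{'id': 1, 'updated_at': 2, 'amount': 12}]
```

In Spark the same effect comes from a window over the key ordered by the timestamp, keeping row number 1 before the merge.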

Implement and Manage Data Quality Constraints with Azure Databricks

  • Introduction
  • Implement validation checks
  • Implement data type checks
  • Detect and manage schema drift
  • Manage data quality with pipeline expectations
  • Exercise - Implement Data Quality Constraints
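Schema drift detection boils down to comparing the schema you expect against the schema a new batch actually delivers. The comparison can be sketched in plain Python (schemas as `{column: type_name}` dicts are an illustrative simplification; on Databricks you would inspect a DataFrame schema or rely on Auto Loader's schema evolution):

```python
def detect_schema_drift(expected, incoming):
    """Report added columns, missing columns, and type changes between
    an expected schema and an incoming batch's schema."""
    added = sorted(set(incoming) - set(expected))
    missing = sorted(set(expected) - set(incoming))
    changed = sorted(c for c in set(expected) & set(incoming)
                     if expected[c] != incoming[c])
    return {"added": added, "missing": missing, "type_changed": changed}

expected = {"id": "bigint", "amount": "double", "ts": "timestamp"}
incoming = {"id": "bigint", "amount": "string", "country": "string"}
print(detect_schema_drift(expected, incoming))
# {'added': ['country'], 'missing': ['ts'], 'type_changed': ['amount']}
```

Pipeline expectations handle the per-row side of quality: a declared constraint (for example, `id IS NOT NULL`) decides whether offending rows are warned on, dropped, or fail the update.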

Design and Implement Data Pipelines with Azure Databricks

  • Introduction
  • Design order of operations for a pipeline
  • Choose notebook vs Lakeflow Pipelines
  • Design Lakeflow job logic
  • Design error handling in pipelines and jobs
  • Create pipeline with notebook
  • Create pipeline with Lakeflow Spark Declarative Pipelines
  • Exercise - Design and Implement Data Pipelines

Implement Lakeflow Jobs with Azure Databricks

  • Introduction
  • Create job setup and configuration
  • Configure job triggers
  • Schedule a job
  • Configure job alerts
  • Configure automatic restarts
  • Exercise - Implement Lakeflow Jobs
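Automatic restarts in a job boil down to a retry policy: re-run the failed task a bounded number of times before surfacing the error. Lakeflow Jobs configure this declaratively; the control flow can be sketched in plain Python (the function names and backoff are illustrative):

```python
import time

def run_with_restarts(task, max_retries=2, backoff_seconds=0.0):
    """Re-run a failing task up to max_retries extra times, then re-raise.

    A sketch of a job-level restart policy; a real job declares retries
    in its configuration rather than in code.
    """
    attempt = 0
    while True:
        try:
            return task()
        except Exception:
            attempt += 1
            if attempt > max_retries:
                raise                    # retries exhausted: surface the failure
            time.sleep(backoff_seconds)  # back off before restarting

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_restarts(flaky))  # ok (succeeds on the third attempt)
```

Alerts pair naturally with this: notify on the final failure (retries exhausted), not on each transient one.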

Implement Development Lifecycle Processes in Azure Databricks

  • Introduction
  • Apply Git version control best practices
  • Manage branching and pull requests
  • Implement testing strategy
  • Configure and package Databricks Asset Bundles (DABs)
  • Deploy bundle with Databricks CLI
  • Exercise - Implement Development Lifecycle Processes

Monitor, Troubleshoot, and Optimize Workloads in Azure Databricks

  • Introduction
  • Monitor and manage cluster consumption
  • Troubleshoot and repair Lakeflow Jobs
  • Troubleshoot Spark jobs and notebooks
  • Investigate caching, skewing, spilling, and shuffle
  • Implement log streaming with Azure Log Analytics
  • Exercise - Monitor, Troubleshoot, and Optimize Workloads
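Skew investigation usually starts with a simple question: does one key hold a disproportionate share of the rows? That diagnostic can be sketched in plain Python (the threshold is an illustrative choice, not a Spark metric):

```python
from collections import Counter

def find_skewed_keys(keys, threshold=0.5):
    """Flag keys holding more than `threshold` of all rows; such keys are
    candidates for salting or skew hints. An illustrative diagnostic, not
    a Spark API."""
    counts = Counter(keys)
    total = len(keys)
    return [k for k, c in counts.items() if c / total > threshold]

keys = ["US"] * 8 + ["NO", "SE"]
print(find_skewed_keys(keys))  # ['US']
```

In Spark itself the same evidence shows up in the UI as one long-running task per stage; the fix (salting, adaptive query execution, or repartitioning) targets the keys a check like this surfaces.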

Still have questions?

Reach out to our learning advisors for personalized guidance on choosing the right course, group training, or enterprise packages.

📞 Talk to an Advisor

What You Get with Microtek Learning

Instructor-Led Excellence

  • Certified Instructor-led Training
  • Top Industry Trainers
  • Official Student Handbooks

Measurable Learning Outcomes

  • Pre- & Post-Training Assessments
  • Practice Tests
  • Exam-Oriented Curriculum

Real-World Skill Building

  • Hands-on Activities & Scenarios
  • Interactive Online Courses
  • Peer Collaboration (Not in self-paced)

Full Support & Perks

  • Exam Scheduling Support *
  • Learn & Earn Program *
  • Support from Certified Experts
  • Gov. & Private Pricing *

Our Clients

For over 10 years, Microtek Learning has helped organizations, leaders, students, and professionals reach their full potential. We lead the way by addressing their challenges and advancing their performance.

Actemium
US Dept of Defense
Education Advisory Board
GE Digital
Department of Homeland Security
Pacific Life
MetLife
AIG
Chase
DC Gov
Johnson & Johnson
William Osler Health System
Google

Our Awards

Microsoft Award

Microsoft Learning
Partner of the Year

Inc 5000

Named to the Inc. 5000 list of the fastest-growing private companies in America

Top IT Training

Top IT Training Companies
(Multiple Years)

Why Choose Us?

Team Support

Professional Team Support

Our expert counseling team provides round-the-clock assistance along with the best-value offers.

Experienced Trainers

Experienced Trainers

Certified trainers with 5–15 years of real-world industry experience guide your learning.

Satisfaction Guarantee

100% Satisfaction Guarantee

We guarantee satisfaction with top-quality content and instructor delivery.

Real-World Experience

Real-World Experience

Train with industry projects and curricula aligned to current standards.

Best Price Guarantee

Best Price Guarantee

We promise the lowest pricing and best offers in the market.

Guaranteed to Run

Guaranteed to Run

All courses are guaranteed to run on their scheduled dates across all delivery methods.

Azure Learning Resources

Explore our collection of free resources to boost your Azure learning journey

Blogs

Azure Expert Blogs

Explore insights from industry experts to stay ahead in tech. Dive into our Expert Blogs now!

Read Blogs
Talk to Advisor