Databricks (Azure)
Course description
Course objectives and scope
The Databricks course (delivered primarily on Azure) provides comprehensive preparation for working with the Databricks platform, from fundamentals to more advanced features covering data management, job orchestration, and administration. Participants will learn Databricks architecture, cost models, and compute types, as well as SQL Warehouse and Unity Catalog functionality. Through hands-on exercises, participants will gain practical skills for using Databricks efficiently in everyday data analysis and integration work.
Course goals
- Understanding Databricks architecture and cost models.
- Mastering the basics of working with notebooks, DBFS, and the dbutils module.
- Learning the data catalog and the Delta format (partitioning, OPTIMIZE, VACUUM, Time Travel).
- Ability to create and parameterize SQL scripts in SQL Warehouse.
- Understanding the Catalyst engine and table statistics.
- Learning job orchestration and Delta Live Tables (ELT).
- Integration with tools such as Fivetran and configuration of jobs and pipelines.
- Managing permissions, resource pools, and policies.
- Practical knowledge of Unity Catalog and access administration.
Book the course
- Format: Remote
- Language: Polish
- Type: Public course, guaranteed
- Date: 24-27.02.2026
- Duration: 4 days (4h/day)
- Trainer: Piotr Chudzik
Net price per participant. Guaranteed courses run with as few as one participant.
Benefits for participants
- Ability to effectively use Databricks with large datasets.
- Hands-on experience with notebooks, SQL Warehouse, and Delta Lake.
- Understanding cost and performance optimization mechanisms.
- Knowledge of advanced data management and process orchestration features.
- Experience with Databricks integration with other tools (e.g. Fivetran).
- Ability to configure permissions and work with Unity Catalog.
- Preparation for daily work and administration of Databricks in Azure.
Target audience
- Data analysts and data engineers.
- ETL specialists and BI developers.
- System administrators working in cloud environments.
- Professionals responsible for data integration and large-scale data management.
- Teams handling ELT orchestration and Databricks administration.
Course agenda
Introduction
- What Databricks is
- Cost breakdown: DBU vs cloud
- Compute types: job compute, all-purpose compute, SQL Warehouse
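The DBU-versus-cloud cost split above can be sketched with simple arithmetic: Databricks bills DBUs consumed per hour of compute, while the cloud provider bills the underlying VMs separately. The rates below are hypothetical placeholders, not real Azure or Databricks prices, which vary by region, tier, and compute type.

```python
# Hypothetical rates -- NOT real prices; real DBU and VM rates vary
# by region, pricing tier, and compute type (job, all-purpose, SQL Warehouse).
DBU_RATE_USD = 0.40          # assumed price per DBU (placeholder)
VM_RATE_USD_PER_HOUR = 0.50  # assumed Azure VM price per hour (placeholder)

def estimate_hourly_cost(nodes: int, dbu_per_node_hour: float) -> float:
    """Total hourly cost = Databricks DBU charge + cloud VM charge."""
    dbu_cost = nodes * dbu_per_node_hour * DBU_RATE_USD
    vm_cost = nodes * VM_RATE_USD_PER_HOUR
    return round(dbu_cost + vm_cost, 2)

# A 4-node cluster consuming 1.5 DBU per node-hour:
print(estimate_hourly_cost(nodes=4, dbu_per_node_hour=1.5))  # 2.40 DBU + 2.00 VM = 4.4
```

The point of the exercise is that the two invoices are independent: an idle all-purpose cluster still accrues both charges, which is why the course contrasts job compute (terminates when the job ends) with all-purpose compute.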
Core concepts
- Notebook introduction
- DBFS overview
- dbutils.fs module
- Widgets
Data Catalog
- Introduction to Delta format
- Time Travel
- Data partitioning
- OPTIMIZE and VACUUM
- Data management models
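Delta's Time Travel keeps prior table versions addressable by version number or timestamp. The toy model below is a pure-Python illustration of that idea only (it is not the Delta API): each write commits a new immutable snapshot, and reads can target any retained version, much like `SELECT ... VERSION AS OF n`.

```python
class ToyDeltaTable:
    """Pure-Python illustration of Delta-style versioning (not the real Delta API)."""

    def __init__(self):
        self._versions = []  # each entry is a full snapshot of the table's rows

    def write(self, rows):
        # Every write creates a new immutable version, like a Delta commit.
        self._versions.append(list(rows))

    def read(self, version=None):
        # version=None reads the latest snapshot; an int mimics "VERSION AS OF n".
        if version is None:
            version = len(self._versions) - 1
        return self._versions[version]

t = ToyDeltaTable()
t.write([("alice", 1)])
t.write([("alice", 1), ("bob", 2)])
print(t.read())           # latest version
print(t.read(version=0))  # "time travel" to the first commit
```

In real Delta tables, VACUUM eventually removes files backing old versions, so Time Travel only reaches versions within the retention window.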
SQL Warehouse
- Creating SQL scripts
- Script parameterization
- Catalyst engine and table statistics
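Script parameterization in SQL Warehouse is typically done with named markers that are bound at execution time. The sketch below only illustrates the shape of such a parameterized statement: `render_sql` is a hypothetical helper doing naive client-side substitution for demonstration, whereas real SQL Warehouse clients bind parameters server-side with proper escaping.

```python
import re

# Hypothetical helper: naive client-side substitution, for illustration only.
# Real SQL Warehouse clients bind :named parameters server-side.
def render_sql(template: str, params: dict) -> str:
    def sub(match):
        value = params[match.group(1)]
        # Quote strings, leave numbers bare. (Real drivers escape values safely.)
        return f"'{value}'" if isinstance(value, str) else str(value)
    return re.sub(r":(\w+)", sub, template)

query = "SELECT * FROM sales WHERE region = :region AND amount > :min_amount"
print(render_sql(query, {"region": "EMEA", "min_amount": 100}))
# SELECT * FROM sales WHERE region = 'EMEA' AND amount > 100
```

One template can then drive many runs of the same script, which is the workflow the course practices in SQL Warehouse.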
Jobs & Pipelines
- Module overview
- Job orchestration
- Delta Live Tables (ELT)
- Fivetran connections
Permissions and access
- Pools and policies
- Admin panel
- Introduction to Unity Catalog
- Granting permissions
- Creating connections and assigning access
No budget available? Get funding!
A program that allows you to quickly and easily obtain funding for courses for individual participants.
Why a guaranteed course?
- Guaranteed delivery: the course takes place regardless of the number of participants.
- Knowledge and experience exchange with specialists from other industries.
- Interactive, live-led sessions: not just theory, but also practical exercises and discussions.
- Flexible remote format: join from anywhere.
Need Help?
Reach out to learn more about our team and the kinds of tailored solutions we can offer your organization.
Get in Touch