Databricks is one of the most popular platforms for processing data with Apache Spark and for building modern lakehouse data warehouses. In this course, you will learn everything you need for a confident start with Databricks, from the basics of the platform to building your own pipelines and connecting BI tools.
You will learn how Databricks works and why to use it, create your own notebooks, set up a compute cluster, and get acquainted with Databricks SQL Warehouse.
1. Installation and Data Preparation
Before starting the practical work, you will set up Databricks on AWS, create an S3 bucket for data storage, and configure a workspace. You will also examine the AWS CloudFormation template that Databricks uses, to understand how the infrastructure is deployed automatically.
You will review the created cluster and become familiar with the dataset on which you will build your ETL process.
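The ETL process you will build follows the usual extract → transform → load shape. As a rough local sketch in plain Python (in the course itself the data lives in S3 and is processed with Spark on the cluster; the dataset and field names here are hypothetical):

```python
import csv
import io

# Hypothetical raw dataset: in Databricks this would sit in S3
# and be read into a Spark DataFrame instead of a string.
RAW_CSV = """order_id,amount,country
1,19.99,US
2,,DE
3,5.50,US
"""

def extract(text):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and cast types."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "country": r["country"]}
        for r in rows
        if r["amount"]
    ]

def load(rows, table):
    """Load: here we just collect rows into an in-memory 'table';
    in Databricks you would save a managed table instead."""
    table.extend(rows)
    return table

table = load(transform(extract(RAW_CSV)), [])
```

The same three stages reappear in the course pipeline, only expressed as Spark DataFrame operations instead of list comprehensions.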
2. Practice: Data Processing
You will learn two ways to load data into Databricks: uploading it directly or staging it in S3 and then integrating the bucket. You will also learn how to create code repositories, which can be done in two ways: by connecting a GitHub repository or by creating a repository manually right in Databricks.
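The two loading routes differ mainly in where the files live, which you can read off the path itself. A small illustrative helper (not a Databricks API, just a sketch of the distinction):

```python
from urllib.parse import urlparse

def resolve_source(path):
    """Return which loading route a path implies:
    'direct' for a local/workspace upload, 's3' for bucket integration."""
    scheme = urlparse(path).scheme
    if scheme in ("s3", "s3a"):
        return "s3"
    return "direct"

# Hypothetical paths for each route:
# resolve_source("s3://my-bucket/raw/orders.csv")  -> "s3"
# resolve_source("/FileStore/uploads/orders.csv")  -> "direct"
```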
During the project, you will complete two key tasks:
- ETL Data Processing: run the pipeline, perform transformations, create tables, and save them in Databricks.
- Data Visualization: perform analysis with Spark SQL in a separate notebook and create visualizations.
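The analysis step boils down to SQL aggregations over the processed tables. A local sketch with sqlite3 standing in for Spark SQL (the table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (country TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("US", 19.99), ("US", 5.50), ("DE", 7.25)],
)

# The same kind of GROUP BY you would run with spark.sql(...)
# in a Databricks notebook before charting the result.
rows = conn.execute(
    "SELECT country, ROUND(SUM(amount), 2) AS revenue "
    "FROM orders GROUP BY country ORDER BY revenue DESC"
).fetchall()
# rows -> [('US', 25.49), ('DE', 7.25)]
```

In the course the query result feeds a notebook visualization; the SQL itself is what carries over.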
You will also learn how data is stored within Databricks.
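Under the hood, Databricks tables are stored in the Delta Lake format: Parquet data files plus a `_delta_log` directory of JSON commit files. A toy sketch of that on-disk layout (placeholder file contents, not real Parquet):

```python
import json
import os
import tempfile

# Mimic the on-disk layout of a Delta table: data files + commit log.
root = tempfile.mkdtemp(prefix="orders_table_")
os.makedirs(os.path.join(root, "_delta_log"))

# A data file (real tables store columnar Parquet here, not text).
with open(os.path.join(root, "part-00000.snappy.parquet"), "w") as f:
    f.write("placeholder for Parquet-encoded rows")

# The first commit: JSON actions recording which data files are live.
commit = {"add": {"path": "part-00000.snappy.parquet"}}
with open(os.path.join(root, "_delta_log",
                       "00000000000000000000.json"), "w") as f:
    json.dump(commit, f)

print(sorted(os.listdir(root)))
# -> ['_delta_log', 'part-00000.snappy.parquet']
```

Seeing this layout makes the course material on table storage much less mysterious: a table is just data files plus a log describing them.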
3. Data Warehouse and External Integrations
Finally, you will connect Power BI to Databricks and try both integration methods: through a compute cluster and through SQL Warehouse. This way, you will learn how to integrate Databricks with external analytics tools.
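Both connection routes ask Power BI for the same kind of coordinates: a server hostname, an HTTP path (pointing at either the compute cluster or the SQL Warehouse), and a personal access token. A sketch of assembling them (all hostnames, IDs, and tokens below are hypothetical):

```python
def databricks_connection(server_hostname, http_path, token):
    """Collect the fields Power BI's Databricks connector asks for.
    The HTTP path is what selects the route: a cluster path
    versus a SQL Warehouse path."""
    return {
        "server_hostname": server_hostname,
        "http_path": http_path,
        "token": token,
    }

# Cluster route (hypothetical workspace and cluster IDs):
cluster = databricks_connection(
    "dbc-example.cloud.databricks.com",
    "sql/protocolv1/o/123456789/0101-abcdef",
    "dapiXXXXXXXX",
)

# SQL Warehouse route (hypothetical warehouse ID):
warehouse = databricks_connection(
    "dbc-example.cloud.databricks.com",
    "/sql/1.0/warehouses/abc123",
    "dapiXXXXXXXX",
)
```

Swapping one HTTP path for the other is the whole difference between the two integration methods you will try.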
Recommendations Before Starting
Before starting this course, it is recommended to complete the "Apache Spark Basics" course. With these foundational skills, you will be able to work effectively in Databricks.
Requirements:
- AWS account
- Databricks account
- Knowledge of basic Spark (at the level of the "Apache Spark Basics" course)
- Willingness to incur minimal AWS costs (much of the work fits within the free tier)