Data Engineering on Azure
Course description
Microsoft Azure is a cloud platform offering more than 200 products and services for data storage and management, virtual machine deployment, and application development. Azure supports a wide range of frameworks and tools, allowing applications to run in a multi-cloud environment, on-premises, or at the network edge.
What you will learn in the course
In this course, led by Kristian Bakarich, you will build a working streaming data processing pipeline in Azure. As part of the project, you will learn to use key Azure services to process a stream of Twitter data in JSON format, including:
- APIM (API Management) - for data intake,
- Blob Storage - for storage,
- Azure Functions - for processing,
- Cosmos DB - for storing processed data,
- Power BI - for data visualization.
Project Structure
- Introduction and Architecture: get acquainted with the overall solution architecture and the key components of the pipeline.
- Data Creation and Sending: write a JSON file with messages and create a Python script that sends the JSON objects via HTTP requests to Azure API Management (see the sending-script sketch after this list).
- Development and Deployment of Azure Functions: learn to create and deploy Azure Functions in Python using Visual Studio Code, and build a function project with basic logic (see the HTTP-triggered function sketch after this list).
- Service Integration: set up and integrate Event Hubs, Azure Functions, and Cosmos DB, and learn to write messages from Event Hubs to Cosmos DB (see the Event Hubs to Cosmos DB sketch after this list).
- Data Visualization in Power BI: connect Power BI Desktop to Cosmos DB for real-time data visualization.
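A minimal sketch of the sending script from the Data Creation and Sending step. It assumes the messages live in a local JSON file and that API Management checks the standard Ocp-Apim-Subscription-Key header; the gateway URL, route, subscription key, and file name below are placeholders rather than values from the course.

```python
import json

import requests

# Placeholders - substitute your own APIM gateway URL, route, and subscription key.
APIM_URL = "https://<your-apim-instance>.azure-api.net/tweets/ingest"
SUBSCRIPTION_KEY = "<your-subscription-key>"


def send_messages(path: str = "tweets.json") -> None:
    """Read a JSON array of messages and POST them one by one to APIM."""
    with open(path, "r", encoding="utf-8") as f:
        messages = json.load(f)

    headers = {
        "Content-Type": "application/json",
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    }

    for message in messages:
        response = requests.post(APIM_URL, json=message, headers=headers, timeout=10)
        print(response.status_code, response.text)


if __name__ == "__main__":
    send_messages()
```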
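For the Development and Deployment of Azure Functions step, the function project boils down to an HTTP-triggered function with basic logic. Below is a minimal sketch in the classic Python programming model (an __init__.py paired with a function.json that declares an httpTrigger input and an http output); the validation logic is illustrative, not the course's exact code.

```python
import json
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered function: accept one JSON message and acknowledge it."""
    logging.info("Received a request on the ingest endpoint")

    try:
        message = req.get_json()  # raises ValueError if the body is not valid JSON
    except ValueError:
        return func.HttpResponse("Request body must be valid JSON", status_code=400)

    # Placeholder processing logic: report how many top-level fields arrived.
    summary = {"status": "accepted", "fields": len(message)}
    return func.HttpResponse(
        json.dumps(summary),
        status_code=200,
        mimetype="application/json",
    )
```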
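For the Service Integration step, one way to move messages from Event Hubs into Cosmos DB is an Event Hub-triggered function with a Cosmos DB output binding. The sketch below uses the same classic programming model and assumes a function.json that binds an eventHubTrigger input to the event parameter and a cosmosDB output to the doc parameter; the binding names are placeholders.

```python
import json

import azure.functions as func


def main(event: func.EventHubEvent, doc: func.Out[func.Document]) -> None:
    """Event Hub trigger: parse one incoming message and hand it to Cosmos DB.

    The bindings themselves live in function.json (not shown here):
      - an eventHubTrigger input bound to the 'event' parameter
      - a cosmosDB output bound to the 'doc' parameter
    """
    body = json.loads(event.get_body().decode("utf-8"))

    # The output binding performs the actual write to Cosmos DB.
    doc.set(func.Document.from_dict(body))
```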
Required Knowledge and Prerequisites
- An Azure account
- Basic programming skills (Python)
- Basic knowledge of working with data storage
- API basics (recommended course: "Designing and Developing APIs with FastAPI")
- Basics of working with message queues
Watch Online
All Course Lessons (14)
| # | Lesson Title | Duration |
|---|---|---|
| 1 | Data Engineering in Azure - Streaming Data Pipelines Demo | 02:44 |
| 2 | Introduction to Datasets and Local Preprocessing | 07:07 |
| 3 | Deploying your Code on Visual Studio to Docker Containers | 05:28 |
| 4 | Develop Azure Functions via Python and VS Code | 05:53 |
| 5 | Deploy Azure Function to Azure Function App and Test It | 06:27 |
| 6 | Integrate Azure Function with Blob Storage via Bindings | 04:59 |
| 7 | Expose Azure Function as a Backend, and Test It from Insomnia | 07:06 |
| 8 | Securely Store Secrets in Azure Key Vault and Connect APIM to Key Vault | 04:42 |
| 9 | Add Basic Authentication in API Management using Key Vault and Named Values | 04:36 |
| 10 | Test APIM and Imported Azure Function App and Function via Local Python Program | 02:35 |
| 11 | Create Event Hubs and Test Capture Events Feature | 07:00 |
| 12 | Modify Existing Azure Function to Include Event Hubs Binding and Test It | 06:43 |
| 13 | Create a Cosmos DB (Core SQL) and Create a New Azure Function that Writes Messages to Cosmos DB | 09:04 |
| 14 | Connect Power BI Desktop via Connector and Create a Dashboard | 06:33 |