This course provides you with practical skills to build and manage data pipelines and Extract, Transform, Load (ETL) processes using shell scripts, Airflow and Kafka.
Well-designed and automated data pipelines and ETL processes are the foundation of a successful Business Intelligence platform. Defining your data workflows, pipelines, and processes early in the platform design ensures that the right raw data is collected, transformed, loaded into the desired storage layers, and made available for processing and analysis whenever required.
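To give a sense of the kind of pipeline the course works toward, here is a minimal, illustrative sketch of an Airflow DAG that chains extract, transform, and load steps. It is not taken from the course materials; the URL, file paths, and task logic are hypothetical placeholders, and it assumes Airflow 2.4 or later for the schedule parameter.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def transform(**context):
    # Placeholder transform step: a real pipeline would clean or reshape
    # the extracted data here before loading it.
    print("transforming /tmp/raw.csv")


with DAG(
    dag_id="example_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Extract: pull a CSV from a (hypothetical) source system.
    extract = BashOperator(
        task_id="extract",
        bash_command="curl -s https://example.com/data.csv -o /tmp/raw.csv",
    )
    # Transform: run the Python callable defined above.
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    # Load: copy the result into a (hypothetical) warehouse staging directory.
    load = BashOperator(
        task_id="load",
        bash_command="cp /tmp/raw.csv /tmp/warehouse/",
    )

    # Define the ETL ordering: extract -> transform -> load.
    extract >> transform_task >> load

The same extract-transform-load ordering applies whether the steps are shell scripts scheduled by Airflow, as here, or streaming consumers and producers built on Kafka.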
This course is designed to provide you with the critical knowledge and skills needed by Data Engineers and Data Warehousing specialists to create and manage ETL, ELT, and data pipeline processes.
The techniques and tools covered in Building ETL and Data Pipelines with Bash, Airflow and Kafka are most similar to the requirements found in Data Engineer job advertisements.
Building ETL and Data Pipelines with Bash, Airflow and Kafka is part of one structured learning path: the free Data Engineer path (17 courses).