Cloud Composer: Qwik Start - Command Line
Offered By: Google via Google Cloud Skills Boost
Course Description
Overview
In this lab, you use the Cloud Shell command line to create a Cloud Composer environment and set Airflow environment variables. You then use Composer to run a workflow that verifies a data file exists, creates and runs an Apache Hadoop wordcount job on a Dataproc cluster, and deletes the cluster.
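As a rough sketch of the kind of Cloud Shell commands involved, creating and checking a Composer environment might look like the following. The environment name and location are illustrative assumptions, not values taken from the lab:

```shell
# Create a Cloud Composer environment (provisioning can take 20+ minutes).
# "my-composer-env" and "us-central1" are example values, not lab-specified ones.
gcloud composer environments create my-composer-env \
    --location us-central1

# Confirm the environment reached the RUNNING state before continuing.
gcloud composer environments describe my-composer-env \
    --location us-central1 \
    --format "value(state)"
```

The lab itself supplies the exact names, region, and any extra flags (such as an image version); the commands above are only the general shape of the workflow.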
Syllabus
- GSP606
- Overview
- Setup and requirements
- What is Cloud Composer?
- What is Apache Airflow?
- What is Cloud Dataproc?
- Task 1. Sample workflow
- Task 2. Verify the Composer environment
- Task 3. Set up Apache Airflow environment variables
- Task 4. Upload Airflow files to Cloud Storage
- Task 5. Using the Airflow web interface
- Delete Cloud Composer Environment
- Congratulations!
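The variable-setting and upload tasks listed above can be sketched from Cloud Shell as follows. The variable names, project ID, and DAG filename here are assumptions for illustration; the lab provides the real ones (and on Airflow 2 environments the variables subcommand syntax differs slightly):

```shell
# Set an Airflow variable that the workflow DAG reads.
# "gcp_project" and "my-project-id" are example values (Airflow 1-style CLI).
gcloud composer environments run my-composer-env \
    --location us-central1 \
    variables -- --set gcp_project my-project-id

# Look up the environment's DAGs folder in Cloud Storage, then copy the
# workflow file into it ("my_workflow.py" is a placeholder filename).
DAGS_FOLDER=$(gcloud composer environments describe my-composer-env \
    --location us-central1 \
    --format "value(config.dagGcsPrefix)")
gsutil cp my_workflow.py "$DAGS_FOLDER/"
```

Once the DAG file lands in the bucket, Airflow picks it up automatically and the run can be monitored from the Airflow web interface, as Task 5 describes.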
Related Courses
- Introduction to Airflow in Python (DataCamp)
- Building Data Engineering Pipelines in Python (DataCamp)
- The Complete Hands-On Introduction to Apache Airflow (Udemy)
- Apache Airflow: The Hands-On Guide (Udemy)
- ETL and Data Pipelines with Shell, Airflow and Kafka (IBM via Coursera)