Cloud Composer: Qwik Start - Console
Offered By: Google via Google Cloud Skills Boost
Course Description
Overview
In this lab, you create a Cloud Composer environment using the Google Cloud console. You then use the Airflow web interface to run a workflow that verifies a data file, creates a Dataproc cluster and runs an Apache Hadoop wordcount job on it, and then deletes the cluster.
Syllabus
- GSP261
- Overview
- Setup and requirements
- Task 1. Create Cloud Composer environment
- Task 2. Airflow and core concepts
- Task 3. Defining the workflow
- Task 4. Viewing environment information
- Task 5. Using the Airflow UI
- Task 6. Setting Airflow variables
- Task 7. Uploading the DAG to Cloud Storage
- Task 8. Test your knowledge
- Delete Cloud Composer Environment
- Congratulations!
- Next steps
Related Courses
- Introduction to Airflow in Python (DataCamp)
- Building Data Engineering Pipelines in Python (DataCamp)
- The Complete Hands-On Introduction to Apache Airflow (Udemy)
- Apache Airflow: The Hands-On Guide (Udemy)
- ETL and Data Pipelines with Shell, Airflow and Kafka (IBM via Coursera)