Cloud Composer: Copying BigQuery Tables Across Different Locations

Offered By: Google via Google Cloud Skills Boost

Tags

  • BigQuery Courses
  • Data Warehousing Courses
  • Apache Airflow Courses
  • Cloud Storage Courses
  • Cloud Composer Courses

Course Description

Overview

In this advanced lab, you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a BigQuery dataset located in the US to Cloud Storage buckets, copies the exported tables from the US buckets to buckets in Europe, and then imports those tables into a BigQuery dataset in Europe.
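
The lab's own DAG drives the copy from a config file and Airflow variables; the stripped-down, single-table sketch below, written against the Google provider's transfer operators, only illustrates the shape of that workflow. Every project, dataset, table, and bucket name in it is a placeholder assumption, not the lab's code.

    # A stripped-down, single-table sketch of the lab's workflow. All project,
    # dataset, table, and bucket names are placeholders (assumptions).
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
    from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

    with DAG(
        dag_id="bq_copy_us_to_eu_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,  # trigger manually from the Airflow UI
        catchup=False,
    ) as dag:
        # 1. Export the US table to Avro files in a US Cloud Storage bucket.
        export_us = BigQueryToGCSOperator(
            task_id="export_us_table",
            source_project_dataset_table="my-project.us_dataset.sales",
            destination_cloud_storage_uris=["gs://example-us-bucket/sales/*.avro"],
            export_format="AVRO",
        )

        # 2. Copy the exported files from the US bucket to the EU bucket.
        copy_to_eu = GCSToGCSOperator(
            task_id="copy_us_to_eu",
            source_bucket="example-us-bucket",
            source_object="sales/*.avro",
            destination_bucket="example-eu-bucket",
        )

        # 3. Load the copied files into the destination dataset in the EU.
        import_eu = GCSToBigQueryOperator(
            task_id="import_eu_table",
            bucket="example-eu-bucket",
            source_objects=["sales/*.avro"],
            destination_project_dataset_table="my-project.eu_dataset.sales",
            source_format="AVRO",
            write_disposition="WRITE_TRUNCATE",
        )

        export_us >> copy_to_eu >> import_eu

The three tasks map one-to-one onto the stages in the description: export, cross-region copy, import.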

Syllabus

  • GSP283
  • Overview
  • Setup
  • Task 1. Create Cloud Composer environment
  • Task 2. Create Cloud Storage buckets
  • Task 3. BigQuery destination dataset
  • Task 4. Airflow and core concepts, a brief introduction
  • Task 5. Defining the workflow
  • Task 6. Viewing environment information
  • Task 7. Setting DAGs Cloud Storage bucket
  • Task 8. Setting Airflow variables (see the sketch after this syllabus)
  • Task 9. Uploading the DAG and dependencies to Cloud Storage
  • Task 10. Using the Airflow UI
  • Task 11. Validate the results
  • Delete Cloud Composer Environment
  • Congratulations!
  • Next steps
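
Tasks 5 and 8 meet in the middle: the DAG you define in Task 5 reads its bucket names and table-list path from the Airflow variables you set in Task 8. As a minimal read-side sketch, where the variable names and defaults are assumptions for illustration:

    # Sketch of how a DAG picks up the values set in Task 8. The variable
    # names and defaults below are illustrative assumptions.
    from airflow.models import Variable

    # default_var lets the file parse even before the variables are set.
    source_bucket = Variable.get("gcs_source_bucket", default_var="example-us-bucket")
    dest_bucket = Variable.get("gcs_dest_bucket", default_var="example-eu-bucket")
    table_list_file = Variable.get(
        "table_list_file_path",
        # Cloud Composer mounts the DAGs bucket at /home/airflow/gcs/dags.
        default_var="/home/airflow/gcs/dags/table_list.csv",
    )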

Related Courses

  • Advanced Data Engineering (Duke University via Coursera)
  • BI Foundations with SQL, ETL and Data Warehousing (IBM via Coursera)
  • Cloud Composer: Qwik Start - Command Line (Google via Google Cloud Skills Boost)
  • Cloud Composer: Qwik Start - Console (Google via Google Cloud Skills Boost)
  • Advanced Data Engineering (Pragmatic AI Labs via edX)