YoVDO

Cloud Composer: Copying BigQuery Tables Across Different Locations

Offered By: Google via Google Cloud Skills Boost

Tags

  • BigQuery Courses
  • Data Warehousing Courses
  • Apache Airflow Courses
  • Cloud Storage Courses
  • Cloud Composer Courses

Course Description

Overview

In this advanced lab you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a BigQuery dataset to Cloud Storage buckets in the US, copies those tables to buckets in Europe, then imports them into a BigQuery dataset in Europe.
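The three-step workflow described above (export, cross-region copy, import) can be sketched as an Airflow DAG. This is a minimal illustration, not the lab's actual DAG: it assumes Airflow 2.4+ with the `apache-airflow-providers-google` package installed, and every project, bucket, dataset, and table name below is a hypothetical placeholder.

```python
"""Hypothetical sketch of the lab's workflow: export a US BigQuery table
to a US Cloud Storage bucket, copy the files to an EU bucket, then load
them into an EU BigQuery dataset. All resource names are placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import (
    BigQueryToGCSOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_gcs import (
    GCSToGCSOperator,
)

with DAG(
    dag_id="bq_copy_across_locations",
    start_date=datetime(2024, 1, 1),
    schedule=None,  # trigger manually from the Airflow UI
    catchup=False,
) as dag:
    # Step 1: export the source table to Avro files in a US bucket.
    export_to_us_bucket = BigQueryToGCSOperator(
        task_id="export_to_us_bucket",
        source_project_dataset_table="my-project.us_dataset.my_table",
        destination_cloud_storage_uris=["gs://my-us-bucket/my_table-*.avro"],
        export_format="AVRO",
    )

    # Step 2: copy the exported files from the US bucket to the EU bucket.
    copy_to_eu_bucket = GCSToGCSOperator(
        task_id="copy_to_eu_bucket",
        source_bucket="my-us-bucket",
        source_object="my_table-*.avro",
        destination_bucket="my-eu-bucket",
    )

    # Step 3: load the copied files into the EU destination dataset.
    load_into_eu_dataset = GCSToBigQueryOperator(
        task_id="load_into_eu_dataset",
        bucket="my-eu-bucket",
        source_objects=["my_table-*.avro"],
        destination_project_dataset_table="my-project.eu_dataset.my_table",
        source_format="AVRO",
        write_disposition="WRITE_TRUNCATE",
    )

    export_to_us_bucket >> copy_to_eu_bucket >> load_into_eu_dataset
```

In the lab itself, values like the bucket names are read from Airflow variables (Task 8) rather than hard-coded, which lets the same DAG file run against different projects.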

Syllabus

  • GSP283
  • Overview
  • Setup
  • Task 1. Create Cloud Composer environment
  • Task 2. Create Cloud Storage buckets
  • Task 3. BigQuery destination dataset
  • Task 4. Airflow and core concepts, a brief introduction
  • Task 5. Defining the workflow
  • Task 6. Viewing environment information
  • Task 7. Setting DAGs Cloud Storage bucket
  • Task 8. Setting Airflow variables
  • Task 9. Uploading the DAG and dependencies to Cloud Storage
  • Task 10. Using the Airflow UI
  • Task 11. Validate the results
  • Delete Cloud Composer Environment
  • Congratulations!
  • Next steps

Related Courses

  • SAP Business Warehouse powered by SAP HANA (SAP Learning)
  • Relational Database Support for Data Warehouses (University of Colorado System via Coursera)
  • Data Warehouse Concepts, Design, and Data Integration (University of Colorado System via Coursera)
  • Business Intelligence Concepts, Tools, and Applications (University of Colorado System via Coursera)
  • Design and Build a Data Warehouse for Business Intelligence Implementation (University of Colorado System via Coursera)