YoVDO

Cloud Composer: Copying BigQuery Tables Across Different Locations

Offered By: Google via Google Cloud Skills Boost

Tags

  • BigQuery Courses
  • Data Warehousing Courses
  • Apache Airflow Courses
  • Cloud Storage Courses
  • Cloud Composer Courses

Course Description

Overview

In this advanced lab, you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a BigQuery dataset in the US to Cloud Storage buckets, copies those exports to buckets in Europe, and then imports the tables into a BigQuery dataset in Europe.
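
The lab supplies its own DAG file; the sketch below is only an illustration of the export, copy, and import pattern that workflow follows, written against the current apache-airflow-providers-google operators. Every project, dataset, bucket, and table name in it is a placeholder, not a value from the lab.

"""Illustrative Airflow DAG: export a BigQuery table in the US to Cloud Storage,
copy the export to a bucket in the EU, then load it into an EU dataset.
All names below (project, datasets, buckets, table) are placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="bq_copy_us_to_eu_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,   # trigger manually from the Airflow UI
    catchup=False,
) as dag:
    # 1. Export the US table to an Avro file in a US bucket.
    export_to_gcs = BigQueryToGCSOperator(
        task_id="export_to_gcs",
        source_project_dataset_table="my-project.dataset_us.my_table",
        destination_cloud_storage_uris=["gs://my-us-bucket/my_table.avro"],
        export_format="AVRO",
    )

    # 2. Copy the exported file from the US bucket to an EU bucket.
    copy_to_eu = GCSToGCSOperator(
        task_id="copy_to_eu",
        source_bucket="my-us-bucket",
        source_object="my_table.avro",
        destination_bucket="my-eu-bucket",
        destination_object="my_table.avro",
    )

    # 3. Load the copied file into the destination dataset in the EU.
    load_into_eu_dataset = GCSToBigQueryOperator(
        task_id="load_into_eu_dataset",
        bucket="my-eu-bucket",
        source_objects=["my_table.avro"],
        destination_project_dataset_table="my-project.dataset_eu.my_table",
        source_format="AVRO",
        write_disposition="WRITE_TRUNCATE",
    )

    export_to_gcs >> copy_to_eu >> load_into_eu_dataset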

Syllabus

  • GSP283
  • Overview
  • Setup
  • Task 1. Create Cloud Composer environment
  • Task 2. Create Cloud Storage buckets
  • Task 3. BigQuery destination dataset
  • Task 4. Airflow and core concepts, a brief introduction
  • Task 5. Defining the workflow
  • Task 6. Viewing environment information
  • Task 7. Setting DAGs Cloud Storage bucket
  • Task 8. Setting Airflow variables (see the sketch after this list)
  • Task 9. Uploading the DAG and dependencies to Cloud Storage
  • Task 10. Using the Airflow UI
  • Task 11. Validate the results
  • Delete Cloud Composer Environment
  • Congratulations!
  • Next steps
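
For Task 8, the DAG reads its bucket names and table list from Airflow variables instead of hard-coding them. The snippet below is a minimal sketch of that pattern; the variable names shown are assumptions for illustration and should match whatever names the DAG file you upload actually expects. Variables can be set in the Airflow web UI or through the gcloud composer CLI.

# Illustration only: reading workflow configuration from Airflow variables.
# The variable names here are assumptions for this sketch, not guaranteed to
# match the lab's DAG file.
from airflow.models import Variable

source_bucket = Variable.get("gcs_source_bucket")        # e.g. the US bucket
dest_bucket = Variable.get("gcs_dest_bucket")            # e.g. the EU bucket
table_list_file = Variable.get("table_list_file_path")   # file listing the tables to copy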

Related Courses

Architecting Microsoft Azure Solutions
Microsoft via edX
Computing, Storage and Security with Google Cloud Platform
Google via Coursera
Windows Server 2016: Azure for On-Premises Administrators
Microsoft via edX
Microsoft Professional Orientation : Cloud Administration
Microsoft via edX
IT Support: Troubleshooting Microsoft Office
Microsoft via edX