Cloud Composer: Copying BigQuery Tables Across Different Locations
Offered By: Google via Google Cloud Skills Boost
Course Description
Overview
In this advanced lab you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a BigQuery dataset in the US to Cloud Storage buckets, copies those tables from US buckets to buckets in Europe, then imports them into a BigQuery dataset in Europe.
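Per table, the workflow boils down to three steps: export from the US dataset to a US bucket, copy the exported file across regions, and load it into the European dataset. The lab orchestrates these steps with Airflow operators inside Cloud Composer; as a rough illustration of the same flow, the sketch below builds the equivalent `bq`/`gsutil` commands (all bucket, dataset, and table names here are hypothetical placeholders):

```python
def build_copy_steps(table, source_bucket, dest_bucket, source_dataset, dest_dataset):
    """Return the ordered CLI commands mirroring the lab's per-table workflow:
    export from BigQuery (US), copy the file to an EU bucket, load into BigQuery (EU).
    Illustrative only -- the lab itself runs these steps as Airflow tasks."""
    export_uri = f"gs://{source_bucket}/{table}.avro"
    import_uri = f"gs://{dest_bucket}/{table}.avro"
    return [
        # Step 1: export the source table to Cloud Storage in the same region.
        f"bq extract --destination_format=AVRO {source_dataset}.{table} {export_uri}",
        # Step 2: copy the exported file from the US bucket to the EU bucket.
        f"gsutil cp {export_uri} {import_uri}",
        # Step 3: load the copied file into the destination dataset in Europe.
        f"bq load --source_format=AVRO {dest_dataset}.{table} {import_uri}",
    ]


steps = build_copy_steps("sales", "my-us-bucket", "my-eu-bucket", "us_ds", "eu_ds")
for cmd in steps:
    print(cmd)
```

Because BigQuery cannot query or load data across regions directly, the intermediate Cloud Storage copy is what actually moves the bytes between locations.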
Syllabus
- GSP283
- Overview
- Setup
- Task 1. Create Cloud Composer environment
- Task 2. Create Cloud Storage buckets
- Task 3. BigQuery destination dataset
- Task 4. Airflow and core concepts, a brief introduction
- Task 5. Defining the workflow
- Task 6. Viewing environment information
- Task 7. Setting DAGs Cloud Storage bucket
- Task 8. Setting Airflow variables
- Task 9. Uploading the DAG and dependencies to Cloud Storage
- Task 10. Using the Airflow UI
- Task 11. Validate the results
- Delete Cloud Composer Environment
- Congratulations!
- Next steps
Related Courses
- Serverless Data Analysis with Google BigQuery and Cloud Dataflow en Français (Google Cloud via Coursera)
- Google Cloud Big Data and Machine Learning Fundamentals en Español (Google Cloud via Coursera)
- Google Cloud Big Data and Machine Learning Fundamentals 日本語版 (Google Cloud via Coursera)
- Industrial IoT on Google Cloud (Google Cloud via Coursera)
- Google Cloud Platform Big Data and Machine Learning Fundamentals em Português Brasileiro (Google Cloud via Coursera)