Engineer Data in Google Cloud
Offered By: Google via Qwiklabs
Course Description
Overview
Earn a skill badge by completing the Engineer Data in Google Cloud quest, where you will learn how to:

- Build data pipelines using Cloud Dataprep by Trifacta, Pub/Sub, and Dataflow
- Use Cloud IoT Core to collect and manage MQTT-based devices
- Use Cloud Storage, Dataflow, and BigQuery to perform ETL
- Build a machine learning model using BigQuery ML
- Use Cloud Composer to copy data across multiple locations

This quest is a great resource for understanding topics that appear in the Google Cloud Certified Professional Data Engineer certification. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services; earning one tests your ability to apply your knowledge in an interactive, hands-on environment. Complete the skill badge quest and the final assessment challenge lab to receive a digital badge that you can share with your network.
Syllabus
- Creating a Data Transformation Pipeline with Cloud Dataprep
- Cloud Dataprep by Trifacta is an intelligent data service for visually exploring, cleaning, and preparing structured and unstructured data for analysis. In this lab you will explore the Cloud Dataprep UI to build a data transformation pipeline.
- Building an IoT Analytics Pipeline on Google Cloud
- This lab shows you how to connect and manage devices using Cloud IoT Core, ingest the stream of information using Cloud Pub/Sub, process the IoT data using Cloud Dataflow, and analyze it with BigQuery. Watch the short video Easily Build an IoT Analytics Pipeline. A minimal Pub/Sub publishing sketch appears after this syllabus.
- ETL Processing on Google Cloud Using Dataflow and BigQuery
- In this lab you will build several data pipelines that ingest data from a publicly available dataset into BigQuery; see the Apache Beam sketch after this syllabus.
- Predict Visitor Purchases with a Classification Model in BQML
- In this lab you will use a newly available ecommerce dataset to run typical queries that answer questions businesses have about their customers' purchasing habits; see the BigQuery ML sketch after this syllabus.
- Cloud Composer: Copying BigQuery Tables Across Different Locations
- In this advanced lab you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a BigQuery dataset located in the US to Cloud Storage buckets in Europe, then imports those tables into a BigQuery dataset in Europe; see the Airflow DAG sketch after this syllabus.
- Engineer Data in Google Cloud: Challenge Lab
- This challenge lab tests your skills and knowledge from the labs in the Engineer Data in Google Cloud quest. You should be familiar with the content of the labs before attempting this lab.
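For the IoT analytics lab, the ingestion step lands device telemetry in a Cloud Pub/Sub topic. Below is a minimal sketch of that step using the google-cloud-pubsub client, with a direct publish standing in for IoT Core's MQTT bridge; the project and topic names are placeholders, not values from the lab.

```python
# Minimal sketch: publish one JSON telemetry reading to a Pub/Sub topic.
# "my-project" and "iot-telemetry" are hypothetical placeholder names.
import json
import time

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "iot-telemetry")

reading = {
    "device_id": "temp-sensor-1",
    "temperature_c": 21.7,
    "timestamp": time.time(),
}

# Pub/Sub messages are raw bytes; publish() returns a future whose
# result() is the server-assigned message ID.
future = publisher.publish(topic_path, json.dumps(reading).encode("utf-8"))
print(f"Published message ID: {future.result()}")
```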
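For the ETL lab, the core pattern is a Dataflow (Apache Beam) pipeline that reads files from Cloud Storage, transforms each record, and writes rows to BigQuery. The sketch below assumes a simple three-column CSV; the bucket, dataset, table, and schema names are illustrative placeholders rather than the lab's actual values.

```python
# Minimal Apache Beam sketch of the extract-transform-load pattern:
# read CSV lines from Cloud Storage, parse them, write rows to BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv(line):
    """Transform step: turn one CSV line into a BigQuery row dict."""
    name, state, year = line.split(",")
    return {"name": name, "state": state, "year": int(year)}


# Placeholder project, region, and bucket; DataflowRunner submits the
# job to the Dataflow service (use DirectRunner to test locally).
options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/data.csv",
                                         skip_header_lines=1)
        | "Parse" >> beam.Map(parse_csv)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:demo_dataset.names",
            schema="name:STRING,state:STRING,year:INTEGER",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```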
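For the BigQuery ML lab, a classification model is trained with a single CREATE MODEL statement run inside BigQuery. The sketch below submits such a statement from Python and then scores new rows with ML.PREDICT; it assumes a prepared training table with a binary label column, which is a simplification of the lab's queries against a public ecommerce dataset.

```python
# Minimal sketch: train and apply a BQML logistic regression classifier.
# Dataset, table, and column names are assumptions for illustration.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# CREATE MODEL trains the classifier entirely inside BigQuery; the label
# column must be listed in input_label_cols.
client.query("""
    CREATE OR REPLACE MODEL `ecommerce.classification_model`
    OPTIONS (model_type = 'logistic_reg',
             input_label_cols = ['will_buy_on_return_visit']) AS
    SELECT bounces, time_on_site, will_buy_on_return_visit
    FROM `ecommerce.training_data`  -- assumed: a prepared training table
""").result()  # result() blocks until training completes

# ML.PREDICT adds a predicted_<label> column to each scored row.
rows = client.query("""
    SELECT predicted_will_buy_on_return_visit, bounces, time_on_site
    FROM ML.PREDICT(MODEL `ecommerce.classification_model`,
                    TABLE `ecommerce.scoring_data`)  -- assumed table
""").result()
for row in rows:
    print(dict(row))
```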
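For the Cloud Composer lab, the workflow is an Apache Airflow DAG. One way to structure the cross-location move is an export, copy, load chain using the Google provider's transfer operators, sketched below for Airflow 2; every bucket, dataset, and table name is a placeholder, and the lab's actual DAG differs in its details.

```python
# Minimal Airflow 2 sketch: export a BigQuery table to Cloud Storage,
# copy the export from a US bucket to an EU bucket, then load it into a
# BigQuery dataset in Europe. All resource names are placeholders.
import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

with DAG(
    dag_id="bq_copy_across_locations",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,  # trigger manually
) as dag:
    export = BigQueryToGCSOperator(
        task_id="export_to_gcs",
        source_project_dataset_table="my-project.us_dataset.sales",
        destination_cloud_storage_uris=["gs://us-bucket/sales-*.avro"],
        export_format="AVRO",
    )
    copy = GCSToGCSOperator(
        task_id="copy_us_to_eu",
        source_bucket="us-bucket",
        source_object="sales-*.avro",
        destination_bucket="eu-bucket",
    )
    load = GCSToBigQueryOperator(
        task_id="load_into_eu_dataset",
        bucket="eu-bucket",
        source_objects=["sales-*.avro"],
        destination_project_dataset_table="my-project.eu_dataset.sales",
        source_format="AVRO",
        write_disposition="WRITE_TRUNCATE",
    )

    export >> copy >> load
```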
Related Courses
- Cloud Composer: Copying BigQuery Tables Across Different Locations (Google Cloud via Coursera)
- Cloud Composer: Qwik Start - Command Line (Google Cloud via Coursera)
- ML Pipelines on Google Cloud (Google Cloud via Coursera)
- ML Pipelines on Google Cloud (Pluralsight)
- Data Engineering (Google via Qwiklabs)