Data Engineering
Offered By: Google via Qwiklabs
Course Description
Overview
This advanced-level quest is unique amongst the other Qwiklabs offerings. The labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Data Engineer certification. From BigQuery, to Dataprep, to Cloud Composer, this quest is composed of specific labs that will put your Google Cloud data engineering knowledge to the test. Be aware that while practice with these labs will increase your skills and abilities, you will need other preparation, too. The exam is quite challenging, and external studying, experience, and/or background in cloud data engineering is recommended. Looking for a hands-on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end of the Engineer Data in Google Cloud quest to receive an exclusive Google Cloud digital badge.
Syllabus
- Creating a Data Transformation Pipeline with Cloud Dataprep
- Cloud Dataprep by Trifacta is an intelligent data service for visually exploring, cleaning, and preparing structured and unstructured data for analysis. In this lab you will explore the Cloud Dataprep UI to build a data transformation pipeline.
- Building an IoT Analytics Pipeline on Google Cloud
- This lab shows you how to connect and manage devices using Cloud IoT Core; ingest the stream of information using Cloud Pub/Sub; process the IoT data using Cloud Dataflow; use BigQuery to analyze the IoT data. Watch this short video, Easily Build an IoT Analytics Pipeline.
- ETL Processing on Google Cloud Using Dataflow and BigQuery
- In this lab you will build several data pipelines that ingest data from a publicly available dataset into BigQuery.
- Predict Visitor Purchases with a Classification Model in BQML
- In this lab you will use a newly available ecommerce dataset to run typical queries that answer business questions about customers' purchasing habits.
- Cloud Composer: Copying BigQuery Tables Across Different Locations
- In this advanced lab you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a US-based BigQuery dataset to Cloud Storage buckets in the US, copies those buckets to Europe, then imports those tables into a BigQuery dataset in Europe.
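The heart of the Dataflow/BigQuery ETL lab above is a per-record transform that parses raw text into structured rows for BigQuery. As a rough sketch of that idea in plain Python (the field names and the cast below are illustrative assumptions, not the lab's actual schema; in the lab itself this logic would run inside a Dataflow pipeline step):

```python
import csv
import io

def parse_csv_line(line):
    """Parse one CSV record into a dict ready for BigQuery ingestion.

    The field names here are illustrative assumptions, not the lab's
    actual schema. In a real Dataflow job, logic like this would live
    inside a pipeline transform applied to every ingested record.
    """
    fields = ["state", "gender", "year", "name", "number"]
    values = next(csv.reader(io.StringIO(line)))
    row = dict(zip(fields, values))
    row["number"] = int(row["number"])  # cast for a numeric BigQuery column
    return row
```

For example, `parse_csv_line("KS,F,1923,Dorothy,654")` yields a dict whose `number` value is the integer `654`, ready to load as a BigQuery row.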
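The BQML lab above centers on defining a classification model directly in SQL with a `CREATE MODEL` statement. A hedged sketch of what such a statement can look like, held as a query string you would submit through the BigQuery console or client library (the model name and feature columns below are placeholders, not the lab's exact ones):

```python
# Sketch of a BQML logistic-regression model definition. The model name
# and the selected feature columns are placeholder assumptions; the
# public Google Analytics sample dataset is the kind of ecommerce data
# the lab works with.
create_model_sql = """
CREATE OR REPLACE MODEL `my_dataset.purchase_model`
OPTIONS(model_type='logistic_reg') AS
SELECT
  IF(totals.transactions IS NULL, 0, 1) AS label,
  IFNULL(device.operatingSystem, '') AS os,
  device.isMobile AS is_mobile,
  IFNULL(geoNetwork.country, '') AS country
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_*`
"""
```

Running this query trains the model inside BigQuery itself; predictions are then made with `ML.PREDICT` over the trained model, with no data leaving the warehouse.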
Related Courses
- Advanced SQL (Kaggle)
- Building Batch Data Pipelines on Google Cloud (Google Cloud via Coursera)
- Building Batch Data Pipelines on GCP em Português Brasileiro (Google Cloud via Coursera)
- Building Batch Data Pipelines on GCP auf Deutsch (Google Cloud via Coursera)
- Building Batch Data Pipelines on GCP en Español (Google Cloud via Coursera)