YoVDO

Creating and Submitting Scala Jobs to Spark Clusters Using Airflow

Offered By: CodeWithYu via YouTube

Tags

Apache Airflow Courses, Python Courses, Java Courses, Scala Courses, Docker Courses, Apache Spark Courses, Data Engineering Courses, Cluster Computing Courses

Course Description

Overview

Learn to create and submit Scala jobs to Spark clusters using Airflow in this comprehensive tutorial. Develop an end-to-end data engineering project combining Apache Airflow, Docker, Spark Clusters, Scala, Python, and Java. Create basic jobs in multiple programming languages, submit them to the Spark cluster for processing, and observe live results. Explore topics such as setting up Spark clusters and Airflow on Docker, creating Spark jobs in Python, Scala, and Java, building and compiling Scala and Java jobs, and analyzing cluster computation results. Gain hands-on experience in big data processing, workflow automation, and data engineering techniques using popular tools and frameworks.
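The submission workflow described above can be sketched as a minimal Airflow DAG that hands a compiled Scala jar to the Spark cluster. This is an illustrative sketch, not the tutorial's actual code: the DAG id, jar path, main class, and connection id (`spark_default`) are all assumptions, and it relies on the `apache-airflow-providers-apache-spark` package being installed alongside a running Airflow/Spark stack.

```python
# Hypothetical sketch: an Airflow DAG that submits a compiled Scala Spark job.
# Assumes the apache-airflow-providers-apache-spark package is installed and a
# Spark connection named "spark_default" points at the Docker-hosted cluster.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="submit_scala_spark_job",   # assumed name
    start_date=datetime(2024, 1, 1),
    schedule=None,                     # trigger manually from the Airflow UI
    catchup=False,
) as dag:
    submit_scala_job = SparkSubmitOperator(
        task_id="submit_scala_job",
        conn_id="spark_default",
        # Placeholder path to the jar produced by the Scala build step.
        application="/opt/airflow/jobs/scala/target/scala-2.12/wordcount_2.12-0.1.jar",
        java_class="com.example.WordCount",  # hypothetical main class
        verbose=True,
    )
```

Once the DAG file is placed in Airflow's `dags/` folder, triggering it from the Airflow UI runs `spark-submit` against the cluster, and the job's output can be observed in the task logs.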

Syllabus

Introduction
Creating the Spark Cluster and Airflow on Docker
Creating Spark Job with Python
Creating Spark Job with Scala
Building and Compiling Scala Jobs
Creating Spark Job with Java
Building and Compiling Java Jobs
Cluster Computation Results
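As a hedged illustration of the "Creating Spark Job with Python" step above, a minimal PySpark job might look like the following. The input path and application name are placeholders, not the tutorial's; the script assumes a Spark runtime with `pyspark` available.

```python
# Hypothetical sketch of a minimal PySpark job of the kind submitted to the
# cluster in the tutorial; paths and names are placeholders.
from pyspark.sql import SparkSession


def main() -> None:
    spark = (
        SparkSession.builder
        .appName("python-wordcount")  # assumed application name
        .getOrCreate()
    )
    # Classic word count: read lines, split on whitespace, count each word.
    lines = spark.read.text("/opt/spark/data/input.txt").rdd.map(lambda r: r[0])
    counts = (
        lines.flatMap(lambda line: line.split())
        .map(lambda word: (word, 1))
        .reduceByKey(lambda a, b: a + b)
    )
    for word, count in counts.collect():
        print(word, count)
    spark.stop()


if __name__ == "__main__":
    main()
```

Such a script would typically be submitted with `spark-submit` against the cluster's master URL, or, as in the tutorial, from an Airflow task.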


Taught by

CodeWithYu

Related Courses

CS115x: Advanced Apache Spark for Data Science and Data Engineering
University of California, Berkeley via edX
Big Data Analytics
University of Adelaide via edX
Big Data Essentials: HDFS, MapReduce and Spark RDD
Yandex via Coursera
Big Data Analysis: Hive, Spark SQL, DataFrames and GraphFrames
Yandex via Coursera
Introduction to Apache Spark and AWS
University of London International Programmes via Coursera