YoVDO

Creating and Submitting Java Spark Jobs to Spark Clusters

Offered By: CodeWithYu via YouTube

Tags

Apache Spark Courses, Big Data Courses, Python Courses, Java Courses, Scala Courses, Docker Courses, Apache Airflow Courses, Data Engineering Courses

Course Description

Overview

Dive into an end-to-end data engineering project that combines Apache Airflow, Docker, Spark clusters, Scala, Python, and Java in this comprehensive video tutorial. Learn to create and submit Java Spark jobs to a Spark cluster, set up the development environment, and build basic jobs in multiple programming languages. Follow along as the instructor demonstrates how to process data on a Spark cluster and view the results in real time. Gain hands-on experience with essential tools in modern data engineering, including Docker containerization, Airflow workflow orchestration, and Spark distributed computing. By the end of this tutorial, you'll have practical knowledge of creating, compiling, and submitting Spark jobs in different programming languages, preparing you for real-world data engineering challenges.
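As a rough idea of the kind of Java Spark job covered in the tutorial, here is a minimal word-count sketch. This is not the instructor's code: the package, class name, and input path are illustrative assumptions, and it assumes Spark 3.x is on the classpath and that the job will be packaged as a jar and run on a cluster.

```java
// Minimal Java Spark job sketch (illustrative; not the tutorial's actual code).
// Assumes Spark 3.x dependencies on the classpath; runs on a Spark cluster.
package com.example.spark;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class WordCountJob {
    public static void main(String[] args) {
        // The master URL is normally supplied by spark-submit, not hard-coded here.
        SparkSession spark = SparkSession.builder()
                .appName("WordCountJob")
                .getOrCreate();

        // Read a text file (path passed as the first argument),
        // split each line into words, and count occurrences.
        Dataset<Row> counts = spark.read().text(args[0])
                .selectExpr("explode(split(value, ' ')) as word")
                .groupBy("word")
                .count();

        counts.show();
        spark.stop();
    }
}
```

A job like this is typically built into a jar with Maven or Gradle and then handed to the cluster with spark-submit, which is the workflow the video walks through.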

Syllabus

Introduction
Creating The Spark Cluster and Airflow on Docker
Creating Spark Job with Python
Creating Spark Job with Scala
Building and Compiling Scala Jobs
Creating Spark Job with Java
Building and Compiling Java Jobs
Cluster computation results
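The build-and-submit steps in the syllabus generally end with a spark-submit call against the Dockerized cluster. A hedged sketch of what that command might look like (the master URL, jar path, and class name are illustrative assumptions, not taken from the video):

```shell
# Submit a compiled Java job to a standalone Spark master running in Docker.
# Hostname, port, jar, and class are placeholders for this sketch.
spark-submit \
  --master spark://spark-master:7077 \
  --class com.example.spark.WordCountJob \
  target/wordcount-job-1.0.jar \
  /opt/data/input.txt
```

Python jobs are submitted the same way, but with a .py file in place of the jar and no --class flag; Scala jobs follow the Java pattern after being compiled into a jar.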


Taught by

CodeWithYu

Related Courses

CS115x: Advanced Apache Spark for Data Science and Data Engineering
University of California, Berkeley via edX
Big Data Analytics
University of Adelaide via edX
Big Data Essentials: HDFS, MapReduce and Spark RDD
Yandex via Coursera
Big Data Analysis: Hive, Spark SQL, DataFrames and GraphFrames
Yandex via Coursera
Introduction to Apache Spark and AWS
University of London International Programmes via Coursera