Creating and Submitting Java Spark Jobs to Spark Clusters

Offered By: CodeWithYu via YouTube

Tags

Apache Spark Courses, Big Data Courses, Python Courses, Java Courses, Scala Courses, Docker Courses, Apache Airflow Courses, Data Engineering Courses, Distributed Computing Courses

Course Description

Overview

Dive into an end-to-end data engineering project combining Apache Airflow, Docker, Spark clusters, Scala, Python, and Java in this comprehensive video tutorial. Learn to create and submit Java Spark jobs to Spark clusters, set up the development environment, and build basic jobs in multiple programming languages. Follow along as the instructor demonstrates how to process data on a Spark cluster and view the results in real time. Gain hands-on experience with essential tools and technologies in modern data engineering, including Docker containerization, Airflow workflow management, and Spark distributed computing. By the end of this tutorial, you'll have practical knowledge of creating, compiling, and executing Spark jobs across different programming languages, preparing you for real-world data engineering challenges.
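
For orientation, a minimal Java Spark job of the kind the tutorial builds might look like the sketch below. This is an illustrative example only, not code from the course; the class name, application name, and sample data are assumptions.

    import org.apache.spark.api.java.function.FlatMapFunction;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Encoders;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    import java.util.Arrays;
    import java.util.List;

    // Hypothetical example; the class name and sample data are not taken from the course.
    public class WordCountJob {
        public static void main(String[] args) {
            // The cluster master URL is normally supplied by spark-submit rather than hard-coded here.
            SparkSession spark = SparkSession.builder()
                    .appName("JavaWordCount")
                    .getOrCreate();

            // A tiny in-memory dataset so the job runs without external files.
            List<String> lines = Arrays.asList("hello spark", "hello airflow", "hello docker");
            Dataset<String> data = spark.createDataset(lines, Encoders.STRING());

            // Split each line into words, then count occurrences of each word.
            Dataset<Row> counts = data
                    .flatMap((FlatMapFunction<String, String>) line ->
                            Arrays.asList(line.split(" ")).iterator(), Encoders.STRING())
                    .groupBy("value")
                    .count();

            counts.show();
            spark.stop();
        }
    }

A job like this would typically be packaged into a jar (for example with Maven) and handed to the cluster with spark-submit, e.g. spark-submit --class WordCountJob --master spark://spark-master:7077 word-count.jar, where the master URL and jar path are assumptions that depend on how the Docker-based cluster is configured.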

Syllabus

Introduction
Creating the Spark Cluster and Airflow on Docker
Creating Spark Job with Python
Creating Spark Job with Scala
Building and Compiling Scala Jobs
Creating Spark Job with Java
Building and Compiling Java Jobs
Cluster Computation Results


Taught by

CodeWithYu

Related Courses

Web Intelligence and Big Data
Indian Institute of Technology Delhi via Coursera
Big Data for Better Performance
Open2Study
Big Data and Education
Columbia University via edX
Big Data Analytics in Healthcare
Georgia Institute of Technology via Udacity
Data Mining with Weka
University of Waikato via Independent