Creating and Submitting Scala Jobs to Spark Clusters Using Airflow
Offered By: CodeWithYu via YouTube
Course Description
Overview
Learn to create and submit Scala jobs to Spark clusters using Airflow in this comprehensive tutorial. Develop an end-to-end data engineering project combining Apache Airflow, Docker, Spark clusters, Scala, Python, and Java. Create basic jobs in each language, submit them to the Spark cluster for processing, and observe live results. Topics include setting up a Spark cluster and Airflow on Docker, creating Spark jobs in Python, Scala, and Java, building and compiling the Scala and Java jobs, and analyzing the cluster's computation results. Gain hands-on experience in big data processing, workflow automation, and data engineering techniques using popular tools and frameworks.
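Submitting a compiled Scala job to a Spark cluster ultimately comes down to assembling a spark-submit command. A minimal Python sketch of that step follows; the jar path, class name, and master URL are illustrative assumptions, not values from the course:

```python
# Hypothetical sketch: jar path, entry-point class, and master URL are
# placeholders, not values taken from the course.
def build_spark_submit_args(jar_path, main_class,
                            master="spark://spark-master:7077"):
    """Assemble the spark-submit command line for a compiled Scala job."""
    return [
        "spark-submit",
        "--master", master,     # address of the Spark cluster's master node
        "--class", main_class,  # fully qualified entry point inside the jar
        jar_path,               # fat jar produced by the build step
    ]

args = build_spark_submit_args(
    "jobs/scala/target/wordcount.jar", "com.example.WordCount"
)
```

The same argument list works for the Java jobs covered later; only the jar and class name change.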
Syllabus
Introduction
Creating The Spark Cluster and Airflow on Docker
Creating Spark Job with Python
Creating Spark Job with Scala
Building and Compiling Scala Jobs
Creating Spark Job with Java
Building and Compiling Java Jobs
Cluster Computation Results
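The workflow-automation side of the syllabus wires this submission into Airflow. A sketch of such a DAG using SparkSubmitOperator (from the apache-airflow-providers-apache-spark package) is shown below; the DAG id, connection id, jar path, and class name are illustrative assumptions, not values from the course:

```python
# Sketch of an Airflow DAG that submits a compiled Scala job to the
# Dockerized Spark cluster. All identifiers here are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import (
    SparkSubmitOperator,
)

with DAG(
    dag_id="scala_spark_job",
    start_date=datetime(2024, 1, 1),
    schedule=None,   # trigger manually while developing
    catchup=False,
) as dag:
    submit_scala_job = SparkSubmitOperator(
        task_id="submit_scala_job",
        conn_id="spark_default",  # Airflow connection pointing at the Spark master
        application="jobs/scala/target/wordcount.jar",
        java_class="com.example.WordCount",  # entry point compiled into the jar
        verbose=True,
    )
```

Triggering the DAG from the Airflow UI then runs spark-submit against the cluster, and the task log shows the job's output.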
Taught by
CodeWithYu
Related Courses
Managing Big Data in Clusters and Cloud Storage (Cloudera via Coursera)
The Complete Apache Kafka Practical Guide (Udemy)
Dynamical Systems in Neuroscience (MITCBMM via YouTube)
Dimensionality Reduction II (MITCBMM via YouTube)
Optimizing Spark SQL Jobs with Parallel and Asynchronous IO (Databricks via YouTube)