Rust for Weld: High Performance Parallel JIT Compiler - RustConf 2019
Offered By: Rust via YouTube
Course Description
Overview
Explore a high-performance parallel JIT compiler in this RustConf 2019 talk by Shoumik Palkar. Dive into Weld, an open-source Rust project that accelerates data-intensive libraries and frameworks by up to 100x. Learn about JIT-compiling a custom parallel intermediate representation, optimizing across functions and libraries, and achieving near-bare-metal performance without compromising code modularity. Discover the challenges of building an extensible, performance-sensitive compiler in Rust, including native code generation with LLVM, reducing JIT compilation times, and leveraging Rust's trait system for extensibility and cross-language portability. Gain insights into developing a parallel runtime in Rust and addressing the unique challenges of calling unsafe Rust from JIT'd code. Follow the evolution of Weld's implementation, from its initial version to the latest iteration featuring a Rust parallel runtime, and understand how it tackles performance optimization, memory management, and system architecture.
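
The approach the overview describes, a custom intermediate representation that is optimized and then JIT-compiled, maps naturally onto Rust. As a rough sketch (the Expr enum and the simplify pass below are hypothetical illustrations, not Weld's actual IR), an expression tree can be modeled as a closed enum and a transformation pass as a recursive pattern match:

// Hypothetical miniature of a Weld-style IR: a closed enum forms the
// expression tree, and an optimization pass is a pattern match over it.
#[derive(Debug, PartialEq)]
enum Expr {
    Literal(i64),
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

// A tiny transformation pass: constant-fold additions and multiplications.
fn simplify(expr: Expr) -> Expr {
    match expr {
        Expr::Add(l, r) => match (simplify(*l), simplify(*r)) {
            (Expr::Literal(a), Expr::Literal(b)) => Expr::Literal(a + b),
            (l, r) => Expr::Add(Box::new(l), Box::new(r)),
        },
        Expr::Mul(l, r) => match (simplify(*l), simplify(*r)) {
            (Expr::Literal(a), Expr::Literal(b)) => Expr::Literal(a * b),
            (l, r) => Expr::Mul(Box::new(l), Box::new(r)),
        },
        leaf => leaf,
    }
}

fn main() {
    // (2 + 3) * 4 folds down to a single literal 20.
    let tree = Expr::Mul(
        Box::new(Expr::Add(Box::new(Expr::Literal(2)), Box::new(Expr::Literal(3)))),
        Box::new(Expr::Literal(4)),
    );
    assert_eq!(simplify(tree), Expr::Literal(20));
}

Because the enum is closed, the compiler enforces that every pass handles every node kind, which is one reason the talk highlights enums and pattern matching for compiler work.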
Syllabus
Intro
Motivation for the Weld Project
How bad is this problem?
Weld: a common runtime for data libraries
Life of a Weld Program
Weld for building high performance systems
First Weld compiler implementation
Requirements
The search for a new language
Weld in Rust, v1.0: native compiler
IR implemented as tree with closed enum
Transformations with pattern matching
Performance note: living without clone
Unsafe LLVM API for code generation
Cargo to manage...everything
Weld in Rust, v2.0: Rust parallel runtime
Parallel runtime in Rust
Conclusion
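
As a loose illustration of the "Parallel runtime in Rust" item above and the overview's point about calling unsafe Rust from JIT'd code, a runtime can export C-ABI entry points that LLVM-generated native code links against. The RunContext struct and weld_runtime_malloc function here are hypothetical stand-ins, not Weld's real runtime API:

// Hypothetical sketch of a Rust runtime entry point callable from JIT'd code.
use std::alloc::{alloc, Layout};
use std::os::raw::c_void;

// A run context that generated code threads through every runtime call.
#[repr(C)]
pub struct RunContext {
    pub bytes_allocated: u64,
}

// A stable C ABI and an unmangled symbol let JIT-compiled code reference
// this function like any other extern symbol.
#[no_mangle]
pub unsafe extern "C" fn weld_runtime_malloc(ctx: *mut RunContext, size: u64) -> *mut c_void {
    // Assumes size > 0 and a valid ctx pointer; a real runtime would
    // validate inputs and report allocation failures.
    let layout = Layout::from_size_align(size as usize, 8).expect("bad layout");
    let ptr = alloc(layout);
    (*ctx).bytes_allocated += size;
    ptr as *mut c_void
}

The #[repr(C)] and #[no_mangle] attributes keep the struct layout and symbol name predictable across the JIT boundary, where the Rust compiler's usual guarantees no longer apply.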
Taught by
Rust
Related Courses
Rust for Programmers - Codecademy
Python and Rust with Linux Command Line Tools - Pragmatic AI Labs via edX
Rust Data Engineering - Pragmatic AI Labs via edX
Rust for DevOps - Pragmatic AI Labs via edX
Rust for Large Language Model Operations (LLMOps) - Pragmatic AI Labs via edX