YoVDO

Interplay of Linear Algebra, Machine Learning, and HPC - JuliaCon 2021 Keynote

Offered By: The Julia Programming Language via YouTube

Tags

Linear Algebra Courses, Machine Learning Courses, High Performance Computing Courses, Sketching Courses, Kernel Methods Courses

Course Description

Overview

Explore the interplay between linear algebra, machine learning, and high-performance computing in this keynote address from JuliaCon 2021. Delve into the use of hierarchical matrix algebra for constructing low-complexity linear solvers and preconditioners, and learn how these fast solvers can accelerate large-scale PDE-based simulations and AI algorithms. Discover how statistical and machine learning methods can optimize solver selection and configuration. Examine recent developments in fast algebraic and geometric algorithms, including sketching and approximate nearest neighbor techniques. Gain insights into the STRUMPACK library and its applications. Investigate the use of Bayesian optimization in improving linear algebra computations. Engage with a Q&A session covering topics such as linear algebra code development, performance optimization using machine learning, and the potential of Julia in high-performance computing.
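As an illustration of the Kernel Ridge Regression (KRR) method named in the overview, here is a minimal NumPy sketch. This is not code from the keynote; the function names, the RBF kernel choice, and all parameters are illustrative. The point of the example is the dense linear solve, which is exactly the bottleneck that the hierarchical/low-rank solvers discussed in the talk aim to accelerate.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam=1e-3, gamma=1.0):
    """Solve (K + lam * I) alpha = y.

    This dense O(n^3) solve is what structured (e.g. hierarchical/HSS)
    solvers replace with low-complexity approximate factorizations."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, X_test, alpha, gamma=1.0):
    """Evaluate the fitted kernel expansion at new points."""
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Toy 1-D regression problem: noisy samples of sin(3x).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)

alpha = krr_fit(X, y, lam=1e-3, gamma=10.0)
pred = krr_predict(X, np.array([[0.5]]), alpha, gamma=10.0)
```

Even at this toy scale the kernel matrix is fully dense; at the problem sizes discussed in the keynote, forming and solving it directly is infeasible, which motivates the low-rank and hierarchical techniques in the syllabus below.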

Syllabus

Welcome!
Introduction by the speaker.
Acknowledgments.
Algebraic solvers are fundamental tools.
Mathematical libraries in whose development we were involved.
Two main themes of the talk.
Kernel methods in ML.
Kernel Ridge Regression (KRR).
Solving large dense linear systems.
Low-rank compression.
Classes of low-rank structured matrices.
Cluster tree of a matrix.
Fast algebraic algorithm: sketching.
Problem: we don't know the target rank.
Stochastic norm estimation.
Example: compression of HSS matrix.
Fast geometric algorithm: approximate nearest neighbor.
Approximate nearest neighbor with iterative merging.
Comparison of algebraic and geometric algorithms.
STRUMPACK (STRUctured Matrix PACKage).
Linear algebra and machine learning.
Bayesian optimization.
Modeling phase.
Search phase.
Parallelization of code execution.
Examples of ML-improved linear algebra computations.
Summary.
Q&A: What do we need more: linear algebra code for new architectures or for new applications?
Q&A: How can we give users the ability to use ML to get performance?
Q&A: What developments do you want to see in the Julia ecosystem?
Q&A: What high-performance algorithms can make use of specific code generation?
Q&A: Do you think that Julia can replace C++ as the language for linear algebra?
Q&A: Do you search for rank-revealing LU?
Announcements.
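The syllabus item "Fast algebraic algorithm: sketching" can be illustrated with a standard randomized range finder, sketched below in NumPy under stated assumptions: the test matrix, the fixed target rank, and the oversampling value are all illustrative, and the rank-adaptive variant discussed in the talk (where the target rank is unknown and estimated stochastically) is more elaborate than this fixed-rank version.

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a synthetic matrix with geometrically decaying singular values,
# i.e. one that is numerically low rank: A = U diag(s) V^T.
n = 300
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 2.0 ** -np.arange(n)
A = (U * s) @ V.T

def sketch_low_rank(A, rank, oversample=10):
    """Randomized range finder: approximate A ~= Q @ (Q.T @ A).

    Omega is a random Gaussian sketching matrix; Y = A @ Omega touches A
    only through matrix-vector products, which is what makes sketching
    attractive for matrices that are expensive to form entry by entry."""
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Y = A @ Omega                     # the sketch: one pass over A
    Q, _ = np.linalg.qr(Y)            # orthonormal basis for range(Y)
    return Q, Q.T @ A                 # rank-(rank+oversample) factors

Q, B = sketch_low_rank(A, rank=30)
err = np.linalg.norm(A - Q @ B) / np.linalg.norm(A)
```

Because the singular values decay geometrically, the relative error of the rank-40 sketch is tiny; on matrices without such decay, choosing the rank is the hard part, which is the "we don't know the target rank" problem the syllabus addresses via stochastic norm estimation.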


Taught by

The Julia Programming Language

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent