Linear Algebra - Math for Machine Learning
Offered By: Weights & Biases via YouTube
Course Description
Overview
Dive into the fundamental concepts of linear algebra essential for machine learning in this 41-minute video lecture. Explore how linear algebra differs from traditional algebra and resembles programming, gaining insights into its crucial role in machine learning. Learn about arrays as optimizable function representations, linear functions, and the concept of "refactoring" in linear algebra. Discover the Singular Value Decomposition (SVD) as a generic matrix refactoring tool and its applications in machine learning. Access accompanying slides and exercise notebooks to reinforce your understanding. Perfect for those seeking to strengthen their mathematical foundation for machine learning applications.
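As a concrete taste of the SVD idea the lecture covers, here is a minimal NumPy sketch (the lecture's own exercise notebooks are the authoritative material; the matrix values below are made up for illustration):

```python
import numpy as np

# A small made-up matrix standing in for any linear map.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# The SVD "refactors" A into simpler pieces: orthogonal factors (U, Vt)
# and a diagonal scaling (S), so that A = U @ diag(S) @ Vt.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A from its factors to verify the decomposition.
assert np.allclose(A, U @ np.diag(S) @ Vt)

# Keeping only the largest singular value gives the best rank-1
# approximation of A -- the compression trick often used in ML.
A_rank1 = S[0] * np.outer(U[:, 0], Vt[0, :])
print("rank-1 approximation error:", np.linalg.norm(A - A_rank1))
```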
Syllabus
Introduction
Why care about linear algebra?
Linear algebra is not like algebra
Linear algebra is more like programming
Arrays are an optimizable representation of functions
Arrays represent linear functions
"Refactoring" shows up in linear algebra
Any function can be refactored
The SVD is the generic refactor applied to a matrix
Using the SVD in ML
Review of takeaways and more resources
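To make the syllabus point "Arrays represent linear functions" concrete, a short hedged sketch (the matrices below are illustrative assumptions, not taken from the lecture): applying a linear function is matrix-vector multiplication, and composing two linear functions is matrix multiplication.

```python
import numpy as np

# Two made-up matrices, each representing a linear function R^2 -> R^2.
f = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # scales x by 2 and y by 3
g = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # rotates 90 degrees counterclockwise

v = np.array([1.0, 1.0])

# Applying the function is matrix-vector multiplication.
print(f @ v)  # [2. 3.]

# Composing functions is matrix multiplication: (g after f)(v) == g(f(v)).
assert np.allclose((g @ f) @ v, g @ (f @ v))
```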
Taught by
Weights & Biases
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent