Probability - Math for Machine Learning

Offered By: Weights & Biases via YouTube

Tags

Statistics & Probability Courses
Machine Learning Courses
Linear Algebra Courses
Probability Theory Courses
Loss Functions Courses
Gaussian Distribution Courses

Course Description

Overview

Explore the fundamental concepts of probability essential for machine learning in this 45-minute video lecture. Delve into the challenges of mathematically rigorous probability theory and discover why negative logarithms of probabilities, known as "surprises," are prevalent in machine learning. Learn how probability behaves like mass, how surprises give rise to loss functions, and why they are preferable to densities. Examine the connection between Gaussians, probability, and linear algebra. Access the accompanying slides and exercise notebooks for hands-on practice. The lecture is part of the Math for Machine Learning (Math4ML) series, and timestamps are provided for easy navigation through the key topics.
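To make the "surprise" idea concrete: the surprise of an outcome with probability p is -log p, and averaging the surprises a model assigns to the true labels is the familiar negative log-likelihood (cross-entropy) loss. The following minimal NumPy sketch is not taken from the course materials; the probabilities and labels are made up purely for illustration:

import numpy as np

def surprise(p):
    # Surprise (information content) of an outcome with probability p.
    return -np.log(p)

# Hypothetical predicted class probabilities from a model, one row per example.
probs = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
])
labels = np.array([0, 1, 2])  # true class indices (made-up data)

# The average surprise of the true labels under the model's probabilities
# equals the negative log-likelihood (cross-entropy) loss.
nll = surprise(probs[np.arange(len(labels)), labels]).mean()
print(f"average surprise / NLL loss: {nll:.4f}")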

Syllabus

Introduction
Probability is subtle
Overview of takeaways
Probability is like mass
Surprises show up more often in ML
Surprises give rise to loss functions
Surprises are better than densities
Gaussians unite probability and linear algebra
Summary of the Math4ML ideas
Additional resources on Math4ML


Taught by

Weights & Biases

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent