YoVDO

The Key Equation Behind Probability - Entropy, Cross-Entropy, and KL Divergence

Offered By: Artem Kirsanov via YouTube

Tags

Probability Theory Courses, Machine Learning Courses, Neuroscience Courses, Bayesian Statistics Courses, Information Theory Courses, Entropy Courses, Probability Distributions Courses, Kullback-Leibler Divergence Courses

Course Description

Overview

Explore the fundamental concepts of probability theory and its applications in neuroscience and machine learning in this 26-minute video. Delve into the intuitive idea of surprise and its relation to probability through real-world examples. Examine advanced topics such as entropy, cross-entropy, and Kullback-Leibler (KL) divergence. Learn how to measure the average surprise in a probability distribution, understand the loss of information when approximating distributions, and quantify differences between probability distributions. Gain insights into Bayesian and Frequentist approaches to probability, probability distributions, and the role of objective functions in cross-entropy minimization.
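The overview describes surprisal, entropy, cross-entropy, and KL divergence in words; as a rough illustration only (not taken from the video), the short NumPy sketch below shows how these quantities relate for discrete distributions, with p playing the role of the true distribution and q an approximating model. All names and values here are illustrative assumptions.

import numpy as np

def entropy(p):
    # Average surprisal of a discrete distribution p: H(p) = -sum_i p_i * log(p_i).
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p_i = 0 contribute nothing
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    # Average surprisal when outcomes follow p but are scored by model q:
    # H(p, q) = -sum_i p_i * log(q_i). Requires q_i > 0 wherever p_i > 0.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log(q[mask]))

def kl_divergence(p, q):
    # Extra average surprise from using q in place of p:
    # D_KL(p || q) = H(p, q) - H(p), always >= 0.
    return cross_entropy(p, q) - entropy(p)

# Example: a biased coin (true distribution) approximated by a fair-coin model.
p = [0.9, 0.1]
q = [0.5, 0.5]
print(entropy(p))          # ~0.325 nats
print(cross_entropy(p, q)) # ~0.693 nats
print(kl_divergence(p, q)) # ~0.368 nats

Since cross-entropy equals entropy plus the KL divergence, minimizing cross-entropy with respect to q (with p fixed) is equivalent to minimizing the KL divergence, which is the link to the objective functions mentioned in the description.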

Syllabus

Introduction
Sponsor: NordVPN
What is probability? Bayesian vs. Frequentist
Probability Distributions
Entropy as average surprisal
Cross-Entropy and Internal models
Kullback–Leibler (KL) divergence
Objective functions and Cross-Entropy minimization
Conclusion & Outro


Taught by

Artem Kirsanov

Related Courses

Introduction to Statistics: Probability
University of California, Berkeley via edX
Aléatoire : une introduction aux probabilités - Partie 1
École Polytechnique via Coursera
Einführung in die Wahrscheinlichkeitstheorie
Johannes Gutenberg University Mainz via iversity
Combinatorics and Probability
Moscow Institute of Physics and Technology via Coursera
Probability
University of Pennsylvania via Coursera