Quantifying the Uncertainty in Model Predictions Using Conformal Prediction

Offered By: Toronto Machine Learning Series (TMLS) via YouTube

Tags

Uncertainty Quantification Courses
Data Science Courses
Machine Learning Courses
Neural Networks Courses
Classification Courses
Predictive Modeling Courses

Course Description

Overview

Explore the concept of conformal prediction in this 33-minute conference talk from the Toronto Machine Learning Series. Learn how to quantify uncertainty in neural network predictions and generate alternative outputs when models are unsure. Discover the versatility, statistical rigor, and simplicity of conformal prediction as a method applicable to both classification and regression tasks. Gain insights into its three-step implementation process and understand how it can be applied to real-world use cases. Presented by Jesse Cresswell, Senior Machine Learning Scientist at Layer 6 AI, this talk provides valuable knowledge for addressing the challenge of overconfident wrong predictions in neural networks.
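The three-step process mentioned above is commonly realized as "split conformal prediction": score a held-out calibration set, compute an adjusted quantile of those scores, then form prediction sets for new inputs. The sketch below illustrates this for classification with NumPy; the Dirichlet-sampled probabilities stand in for a trained model's softmax outputs, and the specific nonconformity score (one minus the true-class probability) is just one common choice, not necessarily the one used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model outputs": softmax-like probabilities for a 3-class problem.
# In practice these would come from a trained neural network.
n_cal, n_classes = 500, 3
cal_probs = rng.dirichlet(np.ones(n_classes) * 2.0, size=n_cal)
cal_labels = np.array([rng.choice(n_classes, p=p) for p in cal_probs])

alpha = 0.1  # target error rate: sets should contain the true label ~90% of the time

# Step 1: nonconformity scores on a held-out calibration set
# (here: one minus the probability assigned to the true class).
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Step 2: the finite-sample-adjusted (1 - alpha) quantile of those scores.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(scores, q_level, method="higher")

# Step 3: for a new input, include every class whose score falls below the threshold.
def prediction_set(probs):
    return np.where(1.0 - probs <= qhat)[0]

new_probs = rng.dirichlet(np.ones(n_classes) * 2.0)
print(prediction_set(new_probs))
```

When the model is confident, the set shrinks toward a single label; when it is unsure, the set grows to include the plausible alternatives, which is exactly the behavior the talk highlights. The coverage guarantee is distribution-free and marginal: averaged over exchangeable data, the set contains the true label with probability at least 1 - alpha.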

Syllabus

Quantifying the Uncertainty in Model Predictions


Taught by

Toronto Machine Learning Series (TMLS)

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent