Quantifying the Uncertainty in Model Predictions Using Conformal Prediction
Offered By: Toronto Machine Learning Series (TMLS) via YouTube
Course Description
Overview
Explore the concept of conformal prediction in this 33-minute conference talk from the Toronto Machine Learning Series. Learn how to quantify uncertainty in neural network predictions and generate alternative outputs when models are unsure. Discover the versatility, statistical rigor, and simplicity of conformal prediction as a method applicable to both classification and regression tasks. Gain insights into its three-step implementation process and understand how it can be applied to real-world use cases. Presented by Jesse Cresswell, Senior Machine Learning Scientist at Layer 6 AI, this talk provides valuable knowledge for addressing the challenge of overconfident wrong predictions in neural networks.
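The three-step process mentioned above is usually the "split conformal" recipe: score a held-out calibration set, take a corrected quantile of those scores, then emit every label that clears the threshold. The talk itself does not include code; the following is a minimal sketch under assumed details (synthetic 3-class data, `1 - softmax probability of the true class` as the nonconformity score), using only NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def make_data(n):
    # Hypothetical 3-class problem: the true class's logit gets a boost,
    # standing in for a trained classifier's (imperfect) outputs.
    labels = rng.integers(0, 3, size=n)
    logits = rng.normal(0.0, 1.0, size=(n, 3))
    logits[np.arange(n), labels] += 2.0
    return softmax(logits), labels

# Step 1: nonconformity scores on a held-out calibration set
# (score = 1 - predicted probability of the true class).
cal_probs, cal_labels = make_data(1000)
scores = 1.0 - cal_probs[np.arange(1000), cal_labels]

# Step 2: the (1 - alpha)-quantile of the scores, with the standard
# finite-sample correction (n + 1 in the numerator).
alpha = 0.1  # target: at least 90% coverage
q_level = np.ceil((1000 + 1) * (1 - alpha)) / 1000
qhat = np.quantile(scores, q_level, method="higher")

# Step 3: for each test point, return every class whose score clears
# the threshold -- a prediction *set* rather than a single label.
test_probs, test_labels = make_data(2000)
pred_sets = test_probs >= 1.0 - qhat  # boolean mask of included classes

coverage = pred_sets[np.arange(2000), test_labels].mean()
print(f"empirical coverage: {coverage:.3f} (target: {1 - alpha})")
```

When the model is confident, the set shrinks toward a single label; when it is unsure, the set grows, which is exactly the "alternative outputs" behavior the overview describes. The coverage guarantee is marginal and holds for any classifier, which is the model-agnostic appeal of the method.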
Syllabus
Quantifying the Uncertainty in Model Predictions
Taught by
Toronto Machine Learning Series (TMLS)
Related Courses
Sociology of Microbes (علم اجتماع المايكروبات)
King Saud University via Rwaq (رواق)
Statistical Learning with R
Stanford University via edX
More Data Mining with Weka
University of Waikato via Independent
The Caltech-JPL Summer School on Big Data Analytics
California Institute of Technology via Coursera
Machine Learning for Musicians and Artists
Goldsmiths University of London via Kadenze