The Quest for Adaptivity in Machine Learning - Comparing Popular Methods
Offered By: Institut des Hautes Etudes Scientifiques (IHES) via YouTube
Course Description
Overview
Explore the strengths and weaknesses of popular supervised learning algorithms in this 32-minute lecture by Francis Bach from INRIA, presented at the Institut des Hautes Etudes Scientifiques (IHES). Delve into "no free lunch" theorems and understand why no single algorithm performs well on all learning problems. Compare the performance of k-nearest-neighbor, kernel methods, and neural networks, examining their adaptivity, regularization, and optimization techniques. Investigate the curse of dimensionality, the smoothness of prediction functions, and the role of latent variables in machine learning. Gain insight into the simplicity bias and overfitting behavior of neural networks. Conclude with a comprehensive understanding of the trade-offs involved in choosing an appropriate learning method for a given problem domain.
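To make the comparison concrete, the following is a minimal, illustrative sketch (not taken from the lecture) that fits the three method families Bach compares, namely k-nearest-neighbor, a kernel method, and a small neural network, to the same synthetic regression problem using scikit-learn. All modeling choices (k = 5, an RBF kernel, one hidden layer of 64 units) are assumptions made purely for illustration.

import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic 1-D problem: a smooth target plus noise. Smoothness of the
# prediction function is one of the properties the lecture ties to adaptivity.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
X_test = np.linspace(-3, 3, 100).reshape(-1, 1)
y_test = np.sin(X_test).ravel()

# Hyperparameters below are illustrative assumptions, not values from the talk.
models = {
    "k-NN (local averaging)": KNeighborsRegressor(n_neighbors=5),
    "kernel ridge (RBF kernel)": KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0),
    "neural network (one hidden layer)": MLPRegressor(
        hidden_layer_sizes=(64,), max_iter=5000, random_state=0
    ),
}

for name, model in models.items():
    model.fit(X, y)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: test mean squared error = {mse:.4f}")

On a smooth, low-dimensional problem like this, all three methods perform comparably; the lecture's point is that their behavior diverges as dimension, smoothness, and latent structure change.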
Syllabus
Intro
Supervised machine learning: classical formalization
Local averaging
Curse of dimensionality on X = ℝ^d
Support of inputs
Smoothness of the prediction function
Latent variables
Need for adaptivity
From kernels to neural networks
Regularized empirical risk minimization (see the formulation after this syllabus)
Adaptivity of kernel methods
Adaptivity of neural networks
Comparison of kernel and neural network regimes
Optimization for neural networks
Simplicity bias
Overfitting with neural networks
Conclusion
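For reference, the syllabus item on regularized empirical risk minimization refers to the standard objective below; this is the textbook formulation, written here in LaTeX, rather than a transcription of the lecture slides. Given n training pairs (x_i, y_i), a loss ℓ, a function class F, and a regularization strength λ > 0:

\hat{f} \in \arg\min_{f \in \mathcal{F}} \ \frac{1}{n} \sum_{i=1}^{n} \ell\big(y_i, f(x_i)\big) + \lambda \, \Omega(f)

For kernel methods, F is a reproducing kernel Hilbert space and the regularizer Ω(f) is the squared RKHS norm; for neural networks, Ω is typically a penalty on the weights.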
Taught by
Institut des Hautes Etudes Scientifiques (IHES)
Related Courses
Machine Learning: K-Nearest Neighbors (Codecademy)
Machine Learning with Python (IBM via Cognitive Class)
Machine Learning with R (Cognitive Class)
Python Fundamentals and Data Science Essentials (Packt via Coursera)
Supervised Learning in R: Classification (DataCamp)