Robust Learning of a Single Neuron - Bridging Computational Gaps Using Optimization
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore a comprehensive lecture on robust learning of a single neuron, focusing on bridging computational gaps using insights from optimization. Delve into recent results on learning a neuron with ReLU and other activation functions in the agnostic setting, aiming for near-optimal mean squared loss. Examine the key role of a surrogate stochastic convex optimization problem in achieving low sample and computational complexity while maintaining target error guarantees. Investigate local error bounds from optimization theory, established under mild distributional assumptions covering sub-exponential, heavy-tailed, and some discrete distributions. Discover the surprising independence of the error-bound constant from the problem dimension and its crucial impact on the results. Analyze generalizations to other activation functions, including the challenging case of an unknown activation function. Gain valuable insights into computational-versus-statistical gaps in learning and optimization through this in-depth presentation by Jelena Diakonikolas of the University of Wisconsin-Madison at IPAM's EnCORE Workshop.
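To make the surrogate-convex-optimization idea concrete, here is a minimal illustrative sketch (not the lecture's exact algorithm) of learning a ReLU neuron with GLM-tron-style updates. For a monotone activation, the update direction is the gradient of a convex "matching" surrogate loss, so plain gradient descent applies even though the squared loss of the ReLU itself is non-convex in the weights. The function name `learn_relu_neuron` and all hyperparameters are assumptions chosen for the example.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def learn_relu_neuron(X, y, lr=0.5, n_iters=2000):
    """Illustrative GLM-tron-style learner for a single ReLU neuron.

    Iterates w <- w - lr * mean((relu(X w) - y) * x). For a monotone
    activation this is gradient descent on a convex surrogate loss,
    sidestepping the non-convexity of the squared ReLU loss in w.
    This is a sketch for intuition, not the method from the lecture.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        residual = relu(X @ w) - y      # prediction error per sample
        w -= lr * (X.T @ residual) / n  # surrogate-gradient step
    return w
```

On noiseless (realizable) Gaussian data the surrogate is minimized exactly where the neuron fits the labels, so the learned weights drive the mean squared loss of the ReLU predictions close to zero.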
Syllabus
Jelena Diakonikolas - Robust Learning of a Neuron: Bridging Computational Gaps Using Optimization
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Statistical Machine Learning (Eberhard Karls University of Tübingen via YouTube)
The Information Bottleneck Theory of Deep Neural Networks (Simons Institute via YouTube)
Interpolation and Learning With Scale Dependent Kernels (MITCBMM via YouTube)
Statistical Learning Theory and Applications - Class 16 (MITCBMM via YouTube)
Statistical Learning Theory and Applications - Class 6 (MITCBMM via YouTube)