Robust Learning of a Single Neuron - Bridging Computational Gaps Using Optimization
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore a comprehensive lecture on robust learning of a single neuron, focusing on bridging computational gaps using insights from optimization. Delve into recent findings on learning a neuron with ReLU and other activation functions in the agnostic setting, aiming for near-optimal mean squared loss. Examine the key role of a surrogate stochastic convex optimization problem in achieving low sample and computational complexity while maintaining target error guarantees. Investigate local error bounds from optimization theory, established under mild distributional assumptions covering sub-exponential, heavy-tailed, and some discrete distributions. Discover the surprising independence of the error-bound constant from the problem dimension, and its crucial impact on the results. Analyze generalizations to other activation functions, including the challenging case of an unknown activation function. Gain valuable insights into computational vs. statistical gaps in learning and optimization through this in-depth presentation by Jelena Diakonikolas of the University of Wisconsin-Madison at IPAM's EnCORE Workshop.
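The talk's exact surrogate construction is not spelled out in this description, but one classical convex surrogate for a monotone activation (used in GLMtron-style analyses) is the "matching loss" L(w) = E[Phi(w.x) - y(w.x)], where Phi is the antiderivative of the activation; its gradient is E[(sigma(w.x) - y)x], which vanishes at the target neuron when label noise is independent. The sketch below illustrates that idea for a ReLU neuron under an assumed Gaussian input distribution; the data setup, step size, and variable names are illustrative, not the speaker's.

```python
import numpy as np

# Illustrative setup (assumption, not from the talk): Gaussian inputs,
# labels from a ground-truth ReLU neuron plus independent noise.
rng = np.random.default_rng(0)
d, n = 5, 20000
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)

X = rng.normal(size=(n, d))             # i.i.d. standard Gaussian features
y = np.maximum(X @ w_star, 0.0)         # clean ReLU labels
y += 0.1 * rng.normal(size=n)           # independent label noise

# Gradient descent on the convex "matching loss" surrogate:
#   L(w) = (1/n) * sum_i [ Phi(w.x_i) - y_i * (w.x_i) ],  Phi(z) = relu(z)^2 / 2,
# whose gradient is (1/n) * sum_i (relu(w.x_i) - y_i) * x_i.
# The surrogate is convex even though the squared loss of a ReLU is not.
w = np.zeros(d)
lr = 0.1
for _ in range(200):
    grad = (np.maximum(X @ w, 0.0) - y) @ X / n
    w -= lr * grad

print(np.linalg.norm(w - w_star))       # small: w recovers w_star up to noise
```

Note the design point this illustrates: the surrogate's stationary point coincides with the target neuron, so a plain first-order method on a convex problem recovers it, which is the kind of computational shortcut the lecture's surrogate-optimization viewpoint is about.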
Syllabus
Jelena Diakonikolas - Robust Learning of a Neuron: Bridging Computational Gaps Using Optimization
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Automata Theory
Stanford University via edX
Introduction to Computational Thinking and Data Science
Massachusetts Institute of Technology via edX
Design and Analysis of Algorithms
Peking University via Coursera
How to Win Coding Competitions: Secrets of Champions
ITMO University via edX
Introduction to Computer Science with Python, Part 2
Universidade de São Paulo via Coursera