The Elusive Generalization and Easy Optimization in Machine Learning - Part 1
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore the fundamental concepts of generalization and optimization in machine learning through this comprehensive lecture by Misha Belkin from the University of California, San Diego. Delve into the recent developments and challenges in understanding generalization, particularly in the context of neural networks. Examine how empirical findings have necessitated a reevaluation of theoretical foundations. Gain insights into the optimization process using gradient descent and discover why large non-convex systems are surprisingly easy to optimize with local methods. Presented as part of IPAM's Mathematics of Intelligences Tutorials at UCLA, this 1-hour 21-minute talk offers a deep dive into the central topics of machine learning and data science, providing valuable knowledge for researchers and practitioners in the field.
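The overview notes that large non-convex systems are surprisingly easy to optimize with local methods. As a rough illustration (not taken from the lecture itself), the sketch below runs plain gradient descent on a small over-parameterized two-layer ReLU network; the dataset, network width, learning rate, and step count are arbitrary assumptions chosen only to show that, with enough width, the non-convex training loss is typically driven to near zero by a purely local method.

```python
# Minimal sketch (illustrative assumptions only, not the lecture's code):
# gradient descent on an over-parameterized two-layer ReLU network.
import numpy as np

rng = np.random.default_rng(0)

# Tiny regression dataset: n samples, d features (assumed values).
n, d, width = 20, 5, 512
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Over-parameterized network: hidden width far exceeds the sample count.
W = rng.standard_normal((width, d)) / np.sqrt(d)   # hidden-layer weights
a = rng.standard_normal(width) / np.sqrt(width)    # output weights

def forward(X, W, a):
    h = np.maximum(X @ W.T, 0.0)   # ReLU activations, shape (n, width)
    return h @ a, h                # predictions and activations

lr = 0.05
for step in range(2001):
    pred, h = forward(X, W, a)
    resid = pred - y
    loss = 0.5 * np.mean(resid ** 2)

    # Gradients of the mean squared error.
    grad_a = h.T @ resid / n                     # (width,)
    grad_h = np.outer(resid, a) * (h > 0)        # backprop through ReLU
    grad_W = grad_h.T @ X / n                    # (width, d)

    a -= lr * grad_a
    W -= lr * grad_W

    if step % 500 == 0:
        print(f"step {step:4d}  train loss {loss:.6f}")

# With sufficient width and a small enough step size, the printed training
# loss typically approaches zero: local gradient descent reaches a global,
# interpolating minimum of this non-convex objective.
```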
Syllabus
Misha Belkin - The elusive generalization and easy optimization, Pt. 1 of 2 - IPAM at UCLA
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Practical Machine Learning - Johns Hopkins University via Coursera
Practical Deep Learning For Coders - fast.ai via Independent
機器學習基石下 (Machine Learning Foundations) --- Algorithmic Foundations - National Taiwan University via Coursera
Data Analytics Foundations for Accountancy II - University of Illinois at Urbana-Champaign via Coursera
Entraînez un modèle prédictif linéaire (Train a Linear Predictive Model) - CentraleSupélec via OpenClassrooms