The Elusive Generalization and Easy Optimization in Machine Learning - Part 1

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Machine Learning, Data Science, Neural Networks, Gradient Descent, Interpolation, Overfitting, Generalization, Statistical Learning Theory

Course Description

Overview

Explore the fundamental concepts of generalization and optimization in machine learning in this lecture by Misha Belkin of the University of California, San Diego. Delve into recent developments and open challenges in understanding generalization, particularly in the context of neural networks, and examine how empirical findings have forced a reevaluation of the field's theoretical foundations. Gain insight into optimization by gradient descent and discover why large non-convex systems are surprisingly easy to optimize with local methods. Presented as part of IPAM's Mathematics of Intelligences Tutorials at UCLA, this 1-hour 21-minute talk offers a deep dive into two central topics of machine learning and data science, providing valuable background for researchers and practitioners in the field.
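The description's central claim, that large non-convex systems are often easy to optimize with local methods, can be seen on a toy problem. The sketch below (a hypothetical illustration, not material from the lecture itself) fits an overparameterized two-layer network to random targets with plain gradient descent; despite the non-convex loss surface, the training loss is typically driven close to zero:

```python
import numpy as np

# Hypothetical toy example: gradient descent on a small overparameterized
# model pred = tanh(X @ W1.T) @ w2. The loss in (W1, w2) is non-convex,
# yet a purely local method fits (interpolates) random targets.

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))        # 20 samples, 5 features
y = rng.standard_normal(20)             # random targets to interpolate

hidden = 100                            # width >> #samples: overparameterized
W1 = rng.standard_normal((hidden, 5)) / np.sqrt(5)
w2 = rng.standard_normal(hidden) / np.sqrt(hidden)

lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1.T)               # (20, hidden) hidden activations
    err = H @ w2 - y                    # residuals on the training set
    # Gradients of the (half) mean-squared error w.r.t. both layers
    grad_w2 = H.T @ err / len(y)
    grad_W1 = ((err[:, None] * w2) * (1 - H**2)).T @ X / len(y)
    w2 -= lr * grad_w2
    W1 -= lr * grad_W1

loss = np.mean((np.tanh(X @ W1.T) @ w2 - y) ** 2)
print(f"final training loss: {loss:.2e}")   # typically close to zero
```

Note that with `hidden` much smaller than the number of samples, the same loop usually stalls at a nonzero loss; the ease of optimization is tied to overparameterization, which is exactly the regime the lecture examines.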

Syllabus

Misha Belkin - The elusive generalization and easy optimization, Pt. 1 of 2 - IPAM at UCLA


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

Data Analysis
Johns Hopkins University via Coursera
Computing for Data Analysis
Johns Hopkins University via Coursera
Scientific Computing
University of Washington via Coursera
Introduction to Data Science
University of Washington via Coursera
Web Intelligence and Big Data
Indian Institute of Technology Delhi via Coursera