Infinite Limits and Scaling Laws of Neural Networks

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Neural Networks, Deep Learning, Computer Vision, Kernel Methods, Scaling Laws, Representation Learning, Generalization, Dynamical Mean Field Theory

Course Description

Overview

Explore cutting-edge research on neural network scaling laws and infinite-parameter limits in this one-hour lecture by Blake Bordelon of Harvard University, delivered at IPAM's Theory and Practice of Deep Learning Workshop. The talk covers the breakthroughs in computer vision and natural language processing enabled by scaling up deep learning models.

Examine infinite-parameter limits of deep neural networks that preserve representation learning, and understand the rate at which finite models converge to these limits. Discover how dynamical mean field theory provides an asymptotic description of learning dynamics in infinite-width and infinite-depth networks, and investigate, through empirical analysis, how close the training dynamics of finite networks stay to these idealized limits.

Gain insight into a theoretical model of neural scaling laws that describes how generalization depends on training time, model size, and data quantity. Learn about compute-optimal scaling strategies, the spectral properties of limiting kernels, and how representation learning can improve neural scaling laws, including the potential to double the training-time exponent on very hard tasks compared to the static kernel limit.
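To make the scaling-law discussion concrete, here is a minimal sketch of the kind of model the lecture describes: test loss written as a sum of power laws in training time, model size, and data, plus a compute-optimal allocation of a fixed budget between model size and training steps. The functional form, the exponents `a`, `b`, `c`, and the budget model `C = N * t` are illustrative assumptions, not the talk's exact results.

```python
import numpy as np

# Hypothetical power-law ansatz in the spirit of neural scaling-law models;
# the exponents and the additive form are illustrative assumptions.
def toy_loss(t, N, D, a=0.3, b=0.5, c=0.5, L0=0.1):
    """Toy generalization loss vs. training time t, model size N, data D."""
    return L0 + t ** -a + N ** -b + D ** -c

# Compute-optimal scaling sketch: with a compute budget C ~ N * t, sweep how
# the budget is split between model size and training steps and keep the
# allocation that minimizes the compute-limited part of the loss.
def compute_optimal_split(C, a=0.3, b=0.5):
    Ns = np.logspace(1, np.log10(C) - 1, 400)  # candidate model sizes
    ts = C / Ns                                # steps allowed by the budget
    losses = ts ** -a + Ns ** -b
    i = int(np.argmin(losses))
    return Ns[i], ts[i]
```

At the optimal split the two power-law terms are roughly balanced, which is why a larger training-time exponent (as representation learning can provide on hard tasks) shifts the compute-optimal allocation toward longer training.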

Syllabus

Blake Bordelon - Infinite limits and scaling laws of neural networks - IPAM at UCLA


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

Deep Learning Essentials
University of Pennsylvania via Coursera
Statistical Learning
Illinois Institute of Technology via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Utilisez des modèles supervisés non linéaires (Use Nonlinear Supervised Models)
CentraleSupélec via OpenClassrooms
Generalization Theory in Machine Learning
Institute for Pure & Applied Mathematics (IPAM) via YouTube