Computation in Very Wide Neural Networks

Offered By: Simons Institute via YouTube

Tags

Neural Networks, Deep Learning, Bayesian Inference, Gaussian Processes

Course Description

Overview

Explore the computational aspects of extremely wide neural networks in this 49-minute lecture by Yasaman Bahri of Google Brain. Delve into topics such as single-hidden-layer neural networks of infinite width, the correspondence with Gaussian processes (GPs), and Bayesian inference with GP priors. Examine experimental results on neural network Gaussian process (NNGP) performance across hyperparameters, along with large-depth behavior and fixed points. Analyze phase diagrams, performance trends with width and dataset size, and empirical comparisons of various NN-GPs. Investigate the dynamics that occur in parameter space, and compare the best-performing GPs with SGD-trained neural networks. Part of the Frontiers of Deep Learning series at the Simons Institute.
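To make the two central ideas of the description concrete, the GP correspondence and Bayesian inference with the resulting prior, here is a minimal NumPy sketch. It is not code from the lecture: the ReLU activation, the 1/sqrt(fan-in) scaling convention, and the hyperparameter values (sigma_w, sigma_b, sigma_v, the noise level) are illustrative assumptions. For ReLU, the infinite-width single-hidden-layer kernel has a known closed form (the arc-cosine kernel of Cho & Saul), which the sketch uses as the prior covariance in textbook GP regression.

```python
import numpy as np

def nngp_relu_kernel(X1, X2, sigma_w=1.5, sigma_b=0.1, sigma_v=1.0):
    """Infinite-width kernel of a 1-hidden-layer ReLU network
    (arc-cosine kernel; hyperparameter values here are illustrative)."""
    d = X1.shape[1]
    # Pre-activation covariances under w ~ N(0, sigma_w^2/d), b ~ N(0, sigma_b^2).
    k12 = sigma_w**2 * (X1 @ X2.T) / d + sigma_b**2
    k11 = sigma_w**2 * np.sum(X1**2, axis=1) / d + sigma_b**2
    k22 = sigma_w**2 * np.sum(X2**2, axis=1) / d + sigma_b**2
    norm = np.sqrt(np.outer(k11, k22))
    theta = np.arccos(np.clip(k12 / norm, -1.0, 1.0))
    # E[relu(u) relu(u')] for jointly Gaussian (u, u'), times readout variance.
    return sigma_v**2 * norm * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

# Toy 1-D regression: Bayesian inference with the NNGP prior (synthetic data).
rng = np.random.default_rng(0)
X_train = np.linspace(-2, 2, 8).reshape(-1, 1)
y_train = np.sin(X_train).ravel() + 0.05 * rng.normal(size=8)
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)

noise = 1e-2                                # assumed observation-noise variance
K = nngp_relu_kernel(X_train, X_train) + noise * np.eye(len(X_train))
K_star = nngp_relu_kernel(X_test, X_train)  # test/train cross-covariance

# Standard GP posterior via Cholesky: mean = K* (K + noise I)^-1 y.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
mean = K_star @ alpha
v = np.linalg.solve(L, K_star.T)
var = np.diag(nngp_relu_kernel(X_test, X_test)) - np.sum(v**2, axis=0)

print("posterior mean:", np.round(mean, 3))
print("posterior var :", np.round(var, 3))
```

At large but finite width, the output covariance of randomly initialized networks approaches this kernel, which is the kind of finite-width-versus-infinite-width comparison the NNGP experiments discussed in the lecture are built on.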

Syllabus

Intro
Outline
Starting point
Single-hidden layer (shallow) neural networks of infinite width
What GP does it correspond to?
Properties of the NNGP
Bayesian inference with a GP prior (Review)
Experiments from original work
Performance comparison
NNGP performance across hyperparameters
Large depth behavior & fixed points
Phase diagrams: experiments vs. theory
Performance trends with width and dataset size
Empirical comparison of various NN-GPs
Empirical trends
Best performing networks: comparison between GPs and SGD-NNs
Partway summary
What dynamics occurs in parameter space?
Closing Remarks


Taught by

Simons Institute

Related Courses

Statistical Shape Modelling: Computing the Human Anatomy
University of Basel via FutureLearn
Stochastic processes
Higher School of Economics via Coursera
Introduction to Scientific Machine Learning
Purdue University via edX
Machine Learning 1 - 2020
YouTube
Signals and Systems II
METUopencouseware via YouTube