Weaving Together Machine Learning, Theoretical Physics, and Neuroscience

Offered By: Fields Institute via YouTube

Tags

Machine Learning Courses
Neuroscience Courses
Theoretical Physics Courses
Deep Networks Courses
High-dimensional Statistics Courses

Course Description

Overview

Explore the intersection of machine learning, theoretical physics, and neuroscience in this seminar by Stanford University's Surya Ganguli. Delve into high-dimensional statistics, deep network generalization, and the application of complex systems analysis to neural systems. Discover how optimal convolutional auto-encoders can reveal retinal structure and how recurrent neural networks explain hexagonal firing patterns. Examine the geometry and dynamics of high-dimensional optimization in quantum optimizers, and gain insights into the potential unification of these fields for developing advanced machine learning algorithms.
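The high-dimensional regression and generalization themes from the overview (echoed in the "High-dimensional regression" and "Rank 1 teacher" syllabus items) can be previewed with a toy teacher-student sketch. The dimensions, noise level, and linear-teacher setup below are illustrative assumptions for orientation, not the exact models analyzed in the seminar:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative teacher-student setup (an assumption, not the talk's model):
# a linear "teacher" w_true generates noisy labels, and the "student" is the
# minimum-norm least-squares fit on n samples in d dimensions.
d = 50                                     # input dimension
w_true = rng.standard_normal(d) / np.sqrt(d)

def generalization_error(n, noise=0.1):
    """Test error of the minimum-norm least-squares fit on n samples."""
    X = rng.standard_normal((n, d))
    y = X @ w_true + noise * rng.standard_normal(n)
    w_hat = np.linalg.pinv(X) @ y          # minimum-norm solution
    X_test = rng.standard_normal((1000, d))
    return np.mean((X_test @ (w_hat - w_true)) ** 2)

# Test error tends to peak near the interpolation threshold n ≈ d and
# then improve again as n grows past d.
errors = {n: generalization_error(n) for n in (25, 50, 200)}
```

Sweeping `n` through and beyond `d` traces out the kind of sample-size/dimension trade-off that high-dimensional statistics makes precise.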

Syllabus

Introduction
Theoretical neuroscience and machine learning
Outline
High-dimensional statistics
High-dimensional regression
Algorithm examples
The answer
Running example
Generalization
Nonlinear deep networks
Linear deep networks
Nonlinear networks
Deeper networks
Random matrix theory
Rank 1 teacher
Qualitative conclusions
Revisiting generalization
Summary
Basic idea
Neuroscience
Cell types
Quantum neuromorphic computing
Quantum optimizers
Energy landscape
Papers
Questions
Deep Neural Network


Taught by

Fields Institute

Related Courses

K-Means and K-Medians Under Dimension Reduction
Simons Institute via YouTube
Can Non-Convex Optimization Be Robust?
Simons Institute via YouTube
Robust Estimation and Generative Adversarial Nets
Simons Institute via YouTube
Invariance, Causality and Novel Robustness
Simons Institute via YouTube
The Importance of Better Models in Stochastic Optimization
Simons Institute via YouTube