The Mathematical Universe Behind Deep Neural Networks - Rothschild Lecture
Offered By: Isaac Newton Institute for Mathematical Sciences via YouTube
Course Description
Overview
Embark on a captivating journey through the mathematical universe behind deep neural networks in this Rothschild Lecture delivered by Professor Helmut Bölcskei from ETH Zürich. Explore the theoretical underpinnings of deep neural networks, drawing on functional analysis, harmonic analysis, complex analysis, approximation theory, dynamical systems, Kolmogorov complexity, optimal transport, and fractal geometry. Gain insights into the mathematical foundations behind breakthrough results in practical machine learning tasks such as image classification, image captioning, control-policy learning for the board game Go, and protein structure prediction. Discover how these mathematical concepts contribute to the remarkable successes of deep neural networks across applications.
Syllabus
Date: 10 December 2021, 16:00
Taught by
Isaac Newton Institute for Mathematical Sciences
Related Courses
Sparse Representations in Signal and Image Processing: Fundamentals – Technion - Israel Institute of Technology via edX
Filters and Other Potions for Early Vision and Recognition – MIT CBMM via YouTube
ADSI Summer Workshop: Algorithmic Foundations of Learning and Control, Pablo Parrilo – Paul G. Allen School via YouTube
A Function Space View of Overparameterized Neural Networks - Rebecca Willet, University of Chicago – Alan Turing Institute via YouTube
Approximation with Deep Networks - Remi Gribonval, Inria – Alan Turing Institute via YouTube