The Mathematical Universe Behind Deep Neural Networks - Rothschild Lecture
Offered By: Isaac Newton Institute for Mathematical Sciences via YouTube
Course Description
Overview
Embark on a captivating journey through the mathematical universe behind deep neural networks in this Rothschild Lecture delivered by Professor Helmut Bölcskei from ETH Zürich. Explore the theoretical underpinnings of deep neural networks, drawing on functional analysis, harmonic analysis, complex analysis, approximation theory, dynamical systems, Kolmogorov complexity, optimal transport, and fractal geometry. Gain insight into the mathematical foundations behind breakthrough results in practical machine learning tasks such as image classification, image captioning, control-policy learning for the board game Go, and protein structure prediction. Discover how these mathematical concepts contribute to the remarkable successes of deep neural networks across a wide range of applications.
Syllabus
Date: 10 December 2021 – 16:00
Taught by
Isaac Newton Institute for Mathematical Sciences
Related Courses
Kolmogorov Complexity for DNA Sequences Analysis in Python (Yacine Mahdid via YouTube)
Unexpected Hardness Results for Kolmogorov Complexity Under Uniform Reductions (Association for Computing Machinery (ACM) via YouTube)
Kolmogorov Music (Strange Loop Conference via YouTube)
Cryptography and Kolmogorov Complexity - A Quick Tutorial (Simons Institute via YouTube)
On One-way Functions and Kolmogorov Complexity (IEEE via YouTube)