
Mehler's Formula, Branching Processes, and Compositional Kernels of Deep Neural Networks

Offered By: VinAI via YouTube

Tags

Neural Networks, Machine Learning, Eigenvalues, Activation Functions, Generalization

Course Description

Overview

Explore the connections between compositional kernels, branching processes, and deep neural networks in this 53-minute seminar presented by VinAI. Delve into Hai Tran-Bach's research, which uses Mehler's formula to provide novel insights into the mathematical role of activation functions in neural networks. Examine the unscaled and rescaled limits of compositional kernels, investigating their behavior as compositional depth increases. Analyze the memorization capacity of compositional kernels and neural networks, focusing on the interplay between compositional depth, sample size, dimensionality, and activation non-linearity. Discover explicit formulas for the eigenvalues of compositional kernels, quantifying the complexity of the corresponding Reproducing Kernel Hilbert Spaces (RKHS). Learn about a new random features algorithm that compresses compositional layers through an innovative activation function. Gain insight into the mathematical foundations of deep learning and their implications for developing more principled machine learning algorithms.
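For context on the identity in the title, the classical Mehler formula, stated here for the probabilist's Hermite polynomials He_n (a standard fact, independent of the talk), reads

\[
\sum_{n=0}^{\infty} \frac{\rho^{n}}{n!}\,\mathrm{He}_n(x)\,\mathrm{He}_n(y)
  = \frac{1}{\sqrt{1-\rho^{2}}}
    \exp\!\left(\frac{2\rho x y - \rho^{2}\,(x^{2}+y^{2})}{2\,(1-\rho^{2})}\right),
  \qquad |\rho| < 1.
\]

To make "behavior as compositional depth increases" concrete, the following is a minimal sketch of the generic compositional-kernel construction: the correlation of two inputs is pushed through one "dual activation" map per layer. It assumes the standard construction with a normalized ReLU (whose dual map is the degree-1 arc-cosine kernel), chosen purely as a familiar stand-in; it illustrates the kind of depth behavior the talk studies, not the speaker's specific algorithm.

import numpy as np

def relu_dual(rho):
    # Dual activation of normalized ReLU: the degree-1 arc-cosine kernel map.
    rho = np.clip(rho, -1.0, 1.0)  # guard against round-off outside [-1, 1]
    return (np.sqrt(1.0 - rho ** 2) + rho * (np.pi - np.arccos(rho))) / np.pi

def compositional_kernel(x, y, depth):
    # Correlation of unit-normalized inputs after the given number of compositional layers.
    rho = float(x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    for _ in range(depth):  # one dual-activation step per layer
        rho = relu_dual(rho)
    return rho

x, y = np.array([1.0, 0.0]), np.array([0.6, 0.8])
print([round(compositional_kernel(x, y, d), 4) for d in range(6)])
# The printed correlations drift toward the map's fixed point as depth grows.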

Syllabus

Seminar Series: Mehler's Formula, Branching Processes, and Compositional Kernels of Deep Neural Networks


Taught by

VinAI

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX