Mad Max: Affine Spline Insights into Deep Learning
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the connections between deep neural networks and spline theory in this 48-minute lecture by Richard Baraniuk from Rice University. Delve into the fundamentals of deep nets and splines, focusing on max-affine splines (MAS) and max-affine spline operators (MASO). Examine spline approximation techniques and various types of splines. Investigate the MASO spline partition and its role in learning, as well as the geometry of MASO partitions. Discover how convolutional neural networks (CNNs) relate to local affine mappings and how deep nets function as matched filterbanks. Analyze concepts such as data memorization, deep net complexity, and the impact of data augmentation. Gain insights into piecewise affine nets and explore potential future directions in deep learning research.
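The central object of the lecture, the max-affine spline, is a maximum over a set of affine functions, which yields a convex piecewise-affine map. As a minimal illustrative sketch (not taken from the lecture materials), the snippet below evaluates a 1-D MAS and shows that common activations such as ReLU and absolute value are themselves max-affine splines with two affine pieces:

```python
# Illustrative sketch: a 1-D max-affine spline (MAS) is
#   f(x) = max_r (a_r * x + b_r),
# a convex piecewise-affine function built from affine pieces (a_r, b_r).

def mas(x, pieces):
    """Evaluate a 1-D max-affine spline at x; pieces is a list of (slope, offset)."""
    return max(a * x + b for a, b in pieces)

# ReLU(x) = max(0, x) is a MAS with pieces (0, 0) and (1, 0).
relu = lambda x: mas(x, [(0.0, 0.0), (1.0, 0.0)])

# |x| = max(x, -x) is a MAS with pieces (1, 0) and (-1, 0).
absval = lambda x: mas(x, [(1.0, 0.0), (-1.0, 0.0)])

print(relu(-2.0), relu(3.0))   # 0.0 3.0
print(absval(-2.0))            # 2.0
```

Stacking such operators coordinate-wise gives the max-affine spline operators (MASOs) that the lecture uses to model entire deep-net layers.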
Syllabus
Intro
Deep nets and splines
Spline approximation
Kinds of splines
Max-affine spline (MAS)
Max-affine spline operator (MASO)
Theorem
MASO spline partition
Learning
Geometry of the MASO partition
A conclusion
Local affine mapping - CNN
Deep nets are matched filterbanks
Data memorization
Deep net complexity
Understanding data augmentation
Beyond piecewise affine nets
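Two of the syllabus topics, the local affine mapping and deep nets as matched filterbanks, rest on the fact that a ReLU network is piecewise affine: around any input x it acts as an affine map f(x) = A_x x + b_x, whose rows behave like matched filters for that region. The sketch below (an assumed toy example, not the speaker's code) recovers A_x and b_x for a tiny hand-weighted ReLU net by finite differences and checks that the affine map reproduces the net for a nearby input in the same region:

```python
# Illustrative sketch: a ReLU net is piecewise affine, so on each region of its
# input-space partition it coincides exactly with one affine map A_x x + b_x.

def relu_net(x, W1, b1, w2, b2):
    """Tiny 2-layer scalar-output ReLU network (hand-chosen toy weights)."""
    h = [max(0.0, sum(wij * xj for wij, xj in zip(wi, x)) + bi)
         for wi, bi in zip(W1, b1)]
    return sum(w * hj for w, hj in zip(w2, h)) + b2

W1 = [[1.0, -1.0], [0.5, 2.0]]   # first-layer weights (assumed values)
b1 = [0.1, -0.2]
w2 = [1.0, -3.0]
b2 = 0.5

x = [0.7, 0.3]
# Recover the local affine map by finite differences; this is exact (up to
# rounding) inside a region, since the net is affine there.
eps = 1e-6
A = [(relu_net([xi + eps if i == j else xi for j, xi in enumerate(x)],
               W1, b1, w2, b2) - relu_net(x, W1, b1, w2, b2)) / eps
     for i in range(2)]
bias = relu_net(x, W1, b1, w2, b2) - sum(ai * xi for ai, xi in zip(A, x))

# The recovered affine map reproduces the net for nearby inputs that stay in
# the same spline-partition region.
x2 = [0.71, 0.31]
approx = sum(ai * xi for ai, xi in zip(A, x2)) + bias
print(abs(approx - relu_net(x2, W1, b1, w2, b2)) < 1e-4)  # True
```

The partition of input space into such regions is the MASO spline partition discussed in the lecture; how the regions move during training is the "learning" and "geometry" part of the syllabus.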
Taught by
Simons Institute
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
Machine Learning Techniques (機器學習技法) - National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning - University of Washington via Coursera
Applied Problems of Data Analysis (Прикладные задачи анализа данных) - Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning - Microsoft via edX