Analyzing Optimization and Generalization in Deep Learning via Trajectories of Gradient Descent
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore optimization and generalization in deep learning through an analysis of gradient descent trajectories in this 46-minute lecture by Nadav Cohen of the Institute for Advanced Study. Presented as part of the Simons Institute's Frontiers of Deep Learning series, the talk examines the interplay between optimization dynamics and the ability of deep learning models to generalize effectively, covering both the mathematical foundations and the practical implications of trajectory-based analysis for the performance and reliability of deep learning systems.
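As an illustrative aside (not taken from the lecture itself), the notion of a gradient descent trajectory can be made concrete with a minimal sketch: run gradient descent on a toy objective and record the sequence of iterates, which is the object that trajectory-based analyses study. The least-squares objective, step size, and iteration count below are assumptions chosen only for illustration.

```python
import numpy as np

# Toy least-squares problem (illustrative assumption, not from the lecture)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))   # design matrix
b = rng.standard_normal(20)        # targets

def loss(w):
    r = A @ w - b
    return 0.5 * r @ r

def grad(w):
    return A.T @ (A @ w - b)

w = np.zeros(5)          # initialization w_0
eta = 0.01               # step size (assumed small enough for convergence)
trajectory = [w.copy()]  # the gradient descent trajectory: w_0, w_1, w_2, ...

for _ in range(200):
    w = w - eta * grad(w)
    trajectory.append(w.copy())

# Trajectory-based analyses track quantities along this sequence,
# e.g. how the loss evolves from the first iterate to the last.
print(loss(trajectory[0]), loss(trajectory[-1]))
```

The point of recording every iterate, rather than only the final solution, is that properties of the path itself (how fast the loss decreases, which regions of parameter space the iterates pass through) are what this line of work uses to reason about optimization and generalization.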
Syllabus
Analyzing Optimization and Generalization in Deep Learning via Trajectories of Gradient Descent
Taught by
Simons Institute
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques) - National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning - University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems in Data Analysis) - Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning - Microsoft via edX