Better Understanding of Non-convex Methods in Machine Learning
Offered By: Paul G. Allen School via YouTube
Course Description
Overview
Explore the frontiers of non-convex optimization in machine learning in this lecture by Princeton University's Tengyu Ma. Delve into recent breakthroughs in deep learning and the challenges of analyzing complex, high-dimensional models trained on massive datasets. Discover new algorithmic approaches and analysis tools for non-convex methods, including why matrix completion can be solved by stochastic gradient descent. Examine the landscape of the objective functions of linearized recurrent neural networks and residual networks, and learn how over-parameterization and re-parameterization can simplify optimization. Gain insight into the formal study of non-convex methods and their role in advancing machine learning.
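To make the matrix completion setting concrete, the following is a minimal sketch (not taken from the lecture) of the standard non-convex formulation it refers to: recover a low-rank matrix from a subset of its entries by running stochastic gradient descent on a factorized objective f(U, V) = sum over observed (i, j) of (U_i · V_j − M_ij)^2. All names, dimensions, and constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, rank = 50, 3

# Ground-truth low-rank matrix and a random mask of observed entries.
M = rng.standard_normal((n, rank)) @ rng.standard_normal((rank, n))
mask = rng.random((n, n)) < 0.4          # observe roughly 40% of the entries
obs = np.argwhere(mask)                  # indices (i, j) of observed entries

# Small random initialization of the factors, so that U @ V.T approximates M.
U = 0.1 * rng.standard_normal((n, rank))
V = 0.1 * rng.standard_normal((n, rank))

lr = 0.1
for epoch in range(100):
    rng.shuffle(obs)                     # visit observed entries in random order
    for i, j in obs:
        err = U[i] @ V[j] - M[i, j]      # residual on a single observed entry
        grad_u, grad_v = err * V[j], err * U[i]
        U[i] -= lr * grad_u              # stochastic gradient step on the
        V[j] -= lr * grad_v              # non-convex factorized objective

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(f"relative error on the full matrix: {rel_err:.3f}")
```

Although the objective is non-convex in (U, V), this kind of plain local search typically recovers the underlying matrix; the line of work surveyed in the talk studies when and why such guarantees hold.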
Syllabus
Allen School Colloquia: Tengyu Ma (Princeton University)
Taught by
Paul G. Allen School
Related Courses
Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems in Data Analysis)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX