Introductory Lectures on First-Order Convex Optimization - Lecture 1
Offered By: International Centre for Theoretical Sciences via YouTube
Course Description
Overview
Dive into the fundamentals of first-order convex optimization in this comprehensive lecture by Praneeth Netrapalli. Explore gradient-based optimization techniques, including gradient descent and Nesterov's accelerated gradient algorithm. Examine the complexity of implementing an oracle and of optimizing given access to one, and delve into key theorems, proofs, and lower bounds. Gain insights into smoothness and estimate sequences, and work through the rearrangements, telescoping sums, and key observations that underpin the convergence analysis. Perfect for advanced graduate students, postdocs, and researchers in theoretical physics and computer science seeking to deepen their knowledge of machine learning and statistical physics applications.
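For readers who want a concrete anchor before watching, here is a minimal sketch of the plain gradient-descent iteration the lecture analyzes. Everything in it (the quadratic objective f, its gradient oracle grad_f, the step size eta, and the iteration count) is an illustrative assumption, not material taken from the lecture itself.

```python
# Minimal gradient-descent sketch (illustrative; not the lecturer's code).
# Assumptions: a smooth convex objective f (here a simple quadratic),
# a first-order oracle grad_f, and a constant step size eta.
import numpy as np

MINIMIZER = np.array([1.0, 2.0])  # hypothetical optimum, chosen for illustration

def f(x):
    """Hypothetical smooth convex objective: 0.5 * ||x - x*||^2."""
    return 0.5 * np.sum((x - MINIMIZER) ** 2)

def grad_f(x):
    """First-order oracle: returns the gradient of f at x."""
    return x - MINIMIZER

def gradient_descent(x0, eta=0.5, num_steps=50):
    """Iterate x_{t+1} = x_t - eta * grad f(x_t)."""
    x = x0.copy()
    for _ in range(num_steps):
        x = x - eta * grad_f(x)
    return x

x_final = gradient_descent(np.zeros(2))
print(x_final, f(x_final))  # converges toward the assumed minimizer (1, 2)
```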
Syllabus
Introductory lectures on first-order convex optimization - Lecture 1
Gradient-based optimization
Complexity of implementing an oracle and complexity of optimization given access to an oracle
Gradient Descent
Theorem
Remark
Proof
Rearranging and telescoping the sum gives
Lower bounds: Theorem
Smoothness
Theorem
Proof
Nesterov's accelerated gradient algorithm (see the sketch after this syllabus)
Estimate Sequences
Lemma
Proof
Observation
Compute
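The syllabus item on Nesterov's accelerated gradient algorithm refers to the momentum-based method for L-smooth convex functions, whose analysis via estimate sequences the lecture develops. Below is a minimal sketch in the same hypothetical setup as the gradient-descent example above; the objective, the gradient oracle, and the smoothness constant L are all illustrative assumptions rather than the lecturer's own code.

```python
# Nesterov's accelerated gradient sketch (illustrative; not the lecturer's code).
# Assumptions: f is convex and L-smooth; grad_f is its first-order oracle;
# we reuse the hypothetical quadratic above, for which L = 1.
import numpy as np

MINIMIZER = np.array([1.0, 2.0])  # hypothetical optimum, chosen for illustration

def grad_f(x):
    """First-order oracle for the hypothetical quadratic objective."""
    return x - MINIMIZER

def nesterov_agd(x0, L=1.0, num_steps=50):
    """Accelerated gradient method: a gradient step with step size 1/L taken
    at an extrapolated point y, followed by a momentum update. For smooth
    convex f this achieves the O(1/k^2) rate, versus O(1/k) for plain GD."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(num_steps):
        x_next = y - grad_f(y) / L                        # gradient step at y
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum schedule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation
        x, t = x_next, t_next
    return x

print(nesterov_agd(np.zeros(2)))  # approaches the assumed minimizer (1, 2)
```

The momentum coefficient (t - 1) / t_next grows toward 1 as the iteration proceeds, which is what distinguishes this schedule from a fixed-momentum heavy-ball update.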
Taught by
International Centre for Theoretical Sciences
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent