Introductory Lectures on First-Order Convex Optimization - Lecture 1

Offered By: International Centre for Theoretical Sciences via YouTube

Tags

Convex Optimization Courses Machine Learning Courses Gradient Descent Courses Complexity Theory Courses Statistical Physics Courses

Course Description

Overview

Dive into the fundamentals of first-order convex optimization in this comprehensive lecture by Praneeth Netrapalli. Explore gradient-based optimization techniques, including gradient descent and Nesterov's accelerated gradient algorithm. Examine the complexity of implementing an oracle and of optimization given access to an oracle, and delve into key theorems, proofs, and lower bounds. Gain insights into smoothness and estimate sequences. Analyze the rearrangements, telescoping sums, and key observations used in the proofs to deepen your understanding of convex optimization principles. Ideal for advanced graduate students, postdocs, and researchers in theoretical physics and computer science seeking to enhance their knowledge of machine learning and statistical physics applications.
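As a companion to the overview, here is a minimal illustrative sketch (not taken from the lecture) of the two methods it covers, gradient descent and Nesterov's accelerated gradient, applied to a simple smooth convex quadratic. The quadratic objective, the 1/L step size, and the helper names (f_and_grad, nesterov_agd) are assumptions chosen for demonstration.

```python
# Illustrative sketch: gradient descent and Nesterov's accelerated gradient
# on f(x) = 0.5 * x^T A x - b^T x, an L-smooth convex quadratic.
import numpy as np

def f_and_grad(A, b, x):
    """Objective value and gradient of f(x) = 0.5 x^T A x - b^T x."""
    return 0.5 * x @ A @ x - b @ x, A @ x - b

def gradient_descent(A, b, x0, steps):
    L = np.linalg.eigvalsh(A).max()           # smoothness constant of the quadratic
    x = x0.copy()
    for _ in range(steps):
        _, g = f_and_grad(A, b, x)
        x = x - g / L                          # x_{k+1} = x_k - (1/L) grad f(x_k)
    return x

def nesterov_agd(A, b, x0, steps):
    L = np.linalg.eigvalsh(A).max()
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        _, g = f_and_grad(A, b, y)
        x_next = y - g / L                     # gradient step from the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # momentum / extrapolation
        x, t = x_next, t_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 20))
    A = M @ M.T + np.eye(20)                   # positive definite: smooth and convex
    b = rng.standard_normal(20)
    x_star = np.linalg.solve(A, b)
    for name, method in [("GD", gradient_descent), ("Nesterov", nesterov_agd)]:
        x = method(A, b, np.zeros(20), steps=200)
        print(name, "distance to optimum:", np.linalg.norm(x - x_star))
```

Running the script shows the accelerated method closing the gap to the optimum noticeably faster than plain gradient descent for the same number of gradient evaluations, which is the phenomenon the lecture's theorems quantify.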

Syllabus

Introductory lectures on first-order convex optimization Lecture 1
Gradient-based optimization
Complexity of implementing an oracle and complexity of optimization given access to an oracle
Gradient Descent
Theorem
Remark
Proof
Rearranging and telescoping the sum gives
Lower bounds: Theorem
Smoothness
Theorem
Proof
Nesterov's accelerated gradient algorithm
Estimate Sequences
Lemma
Proof
Observation
Compute
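The theorems, lower bound, and estimate-sequence analysis listed in the syllabus likely correspond to the standard first-order results for minimizing an L-smooth convex function f with minimizer x*; a rough summary of those standard rates is sketched below. The exact statements and constants in the lecture may differ.

```latex
% Standard rates for an L-smooth convex f with minimizer x^* (constants may
% differ from those stated in the lecture).
\begin{align}
  % Gradient descent with step size 1/L:
  f(x_k) - f(x^*) &\le \frac{L\,\|x_0 - x^*\|^2}{2k}, \\
  % Nesterov's accelerated gradient method (analyzed via estimate sequences):
  f(x_k) - f(x^*) &\le \frac{2L\,\|x_0 - x^*\|^2}{(k+1)^2}, \\
  % First-order (gradient-oracle) lower bound, valid for k \le (n-1)/2:
  \min_{1 \le i \le k} f(x_i) - f(x^*) &\ge \frac{3L\,\|x_0 - x^*\|^2}{32\,(k+1)^2}.
\end{align}
```

The matching $1/k^2$ rates in the second and third lines are what make Nesterov's method optimal among first-order methods in the black-box oracle model discussed at the start of the lecture.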


Taught by

International Centre for Theoretical Sciences

Related Courses

Advanced statistical physics
École Polytechnique Fédérale de Lausanne via edX
Physics as a Global Project (Физика как глобальный проект)
National Research Nuclear University MEPhI via Coursera
Statistical Physics of Non-Interacting and Interacting Systems
Indian Institute of Technology Guwahati via Swayam
A Statistical Physicist Looks at Some Complex Systems
Santa Fe Institute via YouTube
Statistical Mechanics in Graduate Physics Education - 2018 Jackson Award Lecture
physicsteachers via YouTube