
Memory as a Lens to Understand Efficient Learning and Optimization

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Machine Learning Courses
Gradient Descent Courses
Algorithm Design Courses
Optimization Algorithms Courses

Course Description

Overview

Explore the role of memory in learning and optimization through this 48-minute lecture presented by Vatsal Sharan from the University of Southern California at IPAM's EnCORE Workshop. Delve into the optimal convergence rates for various optimization problems and examine whether simpler, faster, and memory-limited algorithms like gradient descent can achieve these rates. Discover a potential dichotomy between memory-efficient techniques and those requiring substantially more memory. Investigate how exploring memory-limited optimization reveals new problem structures and suggests novel variants of gradient descent. Gain insights into the relationship between computational efficiency and memory usage in optimization algorithms.
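As a point of reference for the contrast the lecture draws between memory-light and memory-hungry methods, the sketch below (not taken from the talk, and using an illustrative least-squares objective) shows vanilla gradient descent, whose working state is only O(d): the current iterate plus one gradient, independent of the number of samples.

```python
# Illustrative sketch: plain gradient descent on a least-squares objective,
# highlighting that its working memory is O(d) -- it stores only the current
# iterate and one gradient, no matter how large the dataset is.
import numpy as np

def gradient_descent(A, b, step_size=0.001, num_iters=5000):
    """Minimize 0.5 * ||A x - b||^2 with constant-step gradient descent."""
    d = A.shape[1]
    x = np.zeros(d)                  # O(d) state: the current iterate
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)     # one O(d) gradient per step
        x -= step_size * grad
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 10))
    x_true = rng.standard_normal(10)
    b = A @ x_true
    x_hat = gradient_descent(A, b)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

Memory-intensive alternatives (e.g., second-order or cutting-plane methods) typically maintain on the order of d-by-d state, which is the kind of gap the lecture's dichotomy concerns.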

Syllabus

Vatsal Sharan - Memory as a lens to understand efficient learning and optimization - IPAM at UCLA


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent