YoVDO

The Complexity of Dynamic Least-Squares Regression

Offered By: Simons Institute via YouTube

Tags

Algorithm Design Courses

Course Description

Overview

Explore the complexity of dynamic least-squares regression in this comprehensive lecture from the Simons Institute. Delve into the sharp separations between amortized update times for fully vs. partially dynamic 0.01-LSR and high vs. low-accuracy LSR in the insertion-only setting. Examine the lower bounds derived from a gap-amplification reduction, connecting the exact version of the Online Matrix Vector Conjecture to its constant approximate version over the reals. Gain insights into the independent significance of this result in fine-grained complexity and its implications for the ongoing investigation of the OMv Conjecture. Learn about the challenges and advancements in maintaining ε-approximate solutions to min_{x^(t)} ||A^(t) x^(t) − b^(t)||_2 in dynamic environments where rows and labels can be adaptively inserted and/or deleted.
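The problem maintained in this setting, min_{x^(t)} ||A^(t) x^(t) − b^(t)||_2 under row insertions, can be illustrated with a minimal textbook sketch: maintain the normal-equation quantities A^T A and A^T b and apply a rank-one update per inserted row. This is only an exact, insertion-only baseline for intuition (the class name and ridge parameter are illustrative assumptions), not the algorithm or the lower-bound construction discussed in the lecture.

```python
import numpy as np

class IncrementalLeastSquares:
    """Exact insertion-only least squares via the normal equations
    A^T A x = A^T b. A baseline sketch, not the lecture's algorithm."""

    def __init__(self, d, reg=1e-10):
        # Tiny ridge term keeps G invertible before d rows arrive.
        self.G = reg * np.eye(d)   # running A^T A
        self.h = np.zeros(d)       # running A^T b

    def insert(self, a, y):
        """Insert row a with label y: a rank-one O(d^2) update."""
        a = np.asarray(a, dtype=float)
        self.G += np.outer(a, a)
        self.h += y * a

    def solve(self):
        """Recompute the exact minimizer; O(d^3) per query.
        Beating such costs with amortized approximate updates is
        precisely what the dynamic-LSR complexity results address."""
        return np.linalg.solve(self.G, self.h)


ils = IncrementalLeastSquares(2)
for a, y in [([1.0, 0.0], 2.0), ([0.0, 1.0], 3.0), ([1.0, 1.0], 5.0)]:
    ils.insert(a, y)
print(ils.solve())
```

Each insertion costs O(d^2) and each exact solve O(d^3); the lecture's separations concern how much cheaper amortized updates can be when only a 0.01- or ε-approximate solution must be maintained, and how deletions change the picture.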

Syllabus

The Complexity of Dynamic Least-Squares Regression


Taught by

Simons Institute

Related Courses

Natural Language Processing
Columbia University via Coursera
Intro to Algorithms
Udacity
Algorithm Design and Implementation
École Polytechnique via Coursera
Paradigms of Computer Programming
Université catholique de Louvain via edX
Data Structures and Algorithm Design Part I
Tsinghua University via edX