Dimensionality Reduction via Distributed Persistence - DIPOLE

Offered By: Applied Algebraic Topology Network via YouTube

Tags

Dimensionality Reduction Courses, Computational Topology Courses

Course Description

Overview

Explore dimensionality reduction through a novel gradient-descent-based approach called DIPOLE in this 59-minute conference talk. Delve into the two-term loss function, which combines local metric preservation with global topological preservation. Discover how distributed persistence, which computes persistence over random small subsets of the data, overcomes the computational cost of topological calculations. Examine the theoretical guarantees, including almost sure convergence, and compare DIPOLE's performance against t-SNE and UMAP on standard datasets. Learn about computational concerns, distributed persistence metrics, and their properties, including invertibility and the existence of a Lipschitz inverse. Analyze qualitative and quantitative results, gaining insight into this technique for dimensionality reduction in data analysis and visualization.
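To make the two-term idea concrete, here is a minimal illustrative sketch, not the authors' implementation: it restricts the topological term to degree-0 persistence (computed from minimum-spanning-tree edge lengths of random small subsets) and uses plain mean-squared-error losses. The names `dipole_style_loss`, `h0_death_times`, the subset sizes, and the weight `lam` are all assumptions made for this example, not parameters from the talk.

```python
# Sketch of a DIPOLE-style two-term objective: local metric preservation
# plus a distributed-persistence proxy on random small subsets.
# Assumes degree-0 persistence only; the actual method is more general.
import torch
from scipy.sparse.csgraph import minimum_spanning_tree


def pairwise_dist(x: torch.Tensor) -> torch.Tensor:
    """Euclidean distance matrix for a point cloud of shape (n, d)."""
    return torch.cdist(x, x)


def h0_death_times(dist: torch.Tensor) -> torch.Tensor:
    """Degree-0 'death times' of a Vietoris-Rips filtration, i.e. the edge
    lengths of a minimum spanning tree. The MST edges are found with SciPy
    on a detached copy; their lengths are then gathered from the
    differentiable torch distance matrix so gradients flow to the points."""
    mst = minimum_spanning_tree(dist.detach().cpu().numpy())
    rows, cols = mst.nonzero()
    deaths = dist[torch.as_tensor(rows), torch.as_tensor(cols)]
    return torch.sort(deaths).values


def dipole_style_loss(x_high, y_low, n_subsets=8, subset_size=32, lam=0.5):
    """Two-term loss averaged over random subsets:
    (distance distortion) + lam * (degree-0 persistence distortion)."""
    n = x_high.shape[0]
    metric_term, topo_term = 0.0, 0.0
    for _ in range(n_subsets):
        idx = torch.randperm(n)[:subset_size]
        d_high = pairwise_dist(x_high[idx])
        d_low = pairwise_dist(y_low[idx])
        metric_term = metric_term + torch.mean((d_high - d_low) ** 2)
        topo_term = topo_term + torch.mean(
            (h0_death_times(d_high) - h0_death_times(d_low)) ** 2
        )
    return (metric_term + lam * topo_term) / n_subsets


if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(200, 10)                      # high-dimensional data
    y = torch.randn(200, 2, requires_grad=True)   # low-dimensional embedding
    opt = torch.optim.Adam([y], lr=0.05)
    for step in range(100):                       # gradient-descent embedding
        opt.zero_grad()
        loss = dipole_style_loss(x, y)
        loss.backward()
        opt.step()
```

Because each persistence computation only sees a small subset, the per-step cost stays low even for large point clouds, which is the computational motivation for distributed persistence discussed in the talk.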

Syllabus

Intro
Dimensionality Reduction
The DIPOLE Philosophy
Computational Concerns
Distributed Persistence Metrics
Properties of Distributed Persistence
Some Simple Cases
Invertibility
Lipschitz Inverse
Summary of Distributed Persistence
A Return to DIPOLE: Distributed Persistence Optimized Local Embeddings
Qualitative Results
Quantitative Results


Taught by

Applied Algebraic Topology Network

Related Courses

Advanced Machine Learning Methods (Продвинутые методы машинного обучения)
Higher School of Economics via Coursera
Natural Language Processing with Classification and Vector Spaces
DeepLearning.AI via Coursera
Machine Learning - Dimensionality Reduction
IBM via Cognitive Class
Machine Learning with Python
IBM via Cognitive Class
Predicting Extreme Climate Behavior with Machine Learning
University of Colorado Boulder via Coursera