K-Means and K-Medians Under Dimension Reduction
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore k-means and k-medians clustering under dimension reduction in this 42-minute lecture by Yury Makarychev of the Toyota Technological Institute at Chicago. The talk introduces the Euclidean k-means and k-medians objectives and asks how well their costs are preserved when the data is projected into a lower-dimensional space. It covers the main challenges, a warm-up case, and the problem notation; analyzes the distortion graph, the cost of a cluster, and everywhere-sparse edges; and builds up to the non-distorted core, the case where all clusters are large, the main combinatorial lemma, and the handling of edges incident on outliers. The lecture, presented at the Simons Institute, offers insight into techniques relevant to robust and high-dimensional statistics.
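For readers new to the setup, a brief illustration may help (this is not material from the lecture itself). In Euclidean k-means, the cost of a point set X with respect to a set C of k centers is the sum over points x of the minimum over centers c of ||x − c||²; k-medians uses the distance ||x − c|| instead of its square. The dimension-reduction question is how well this cost is preserved when the points are mapped into a much lower dimension, for example by a Johnson–Lindenstrauss-style random projection. The sketch below, which assumes numpy and scikit-learn (neither is mentioned in the talk) and uses purely hypothetical dimensions and synthetic data, compares the k-means cost of a point set before and after such a projection.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Hypothetical illustration (not from the lecture): compare the k-means
# cost of a high-dimensional point set before and after a random
# Gaussian (Johnson-Lindenstrauss-style) projection.

rng = np.random.default_rng(0)
n, d, k = 2000, 500, 10          # points, original dimension, clusters
m = 40                           # target dimension (illustrative choice)

# Synthetic clustered data in R^d.
X, _ = make_blobs(n_samples=n, n_features=d, centers=k, random_state=0)

# Random Gaussian projection R^d -> R^m, scaled so that squared
# distances are preserved in expectation.
G = rng.normal(size=(d, m)) / np.sqrt(m)
X_proj = X @ G

def kmeans_cost(points, k):
    """Sum of squared distances to the nearest of k centers (Lloyd's heuristic)."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(points)
    return km.inertia_

cost_original = kmeans_cost(X, k)
cost_projected = kmeans_cost(X_proj, k)

print(f"k-means cost in R^{d}: {cost_original:.3e}")
print(f"k-means cost in R^{m}: {cost_projected:.3e}")
print(f"ratio (projected / original): {cost_projected / cost_original:.3f}")
```

Note that KMeans returns a locally optimal clustering found by Lloyd's algorithm, so the printed ratio is only indicative; the kind of guarantee discussed in the talk concerns how the projection distorts clustering costs themselves, not the output of a particular heuristic.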
Syllabus
Intro
Euclidean k-means and k-medians
k-means under dimension reduction
k-medians under dimension reduction
Plan
Our result for k-means
Challenges
Warm-up
Problem & Notation
Distortion graph
Cost of a cluster
Everywhere-sparse edges
(1 − ε) non-distorted core
All clusters are large
Main Combinatorial lemma
Edges Incident on Outliers
Summary
Taught by
Simons Institute
Related Courses
Cluster Analysis in Data Mining (University of Illinois at Urbana-Champaign via Coursera)
Explorez vos données avec des algorithmes non supervisés / Explore Your Data with Unsupervised Algorithms (CentraleSupélec via OpenClassrooms)
Clustering Geolocation Data Intelligently in Python (Coursera Project Network via Coursera)
Cluster Analysis using RCmdr (Coursera Project Network via Coursera)
Applied Machine Learning: Algorithms (LinkedIn Learning)