Random Tessellation Forests: Overcoming the Curse of Dimensionality
Offered By: Hausdorff Center for Mathematics via YouTube
Course Description
Overview
Explore the advanced topic of random tessellation forests in this 58-minute lecture by Eliza O'Reilly from the Hausdorff Center for Mathematics. Delve into the limitations of traditional random forests built from axis-aligned partitions and discover how oblique splits can improve performance by capturing dependencies between features. Examine the class of random tessellation forests generated by the stable under iteration (STIT) process from stochastic geometry, and learn how they achieve minimax optimal convergence rates for Lipschitz and C2 functions. Investigate the connection between stationary random tessellations and statistical learning theory, focusing on strategies to overcome the curse of dimensionality in high-dimensional feature spaces through optimal choices of the directional distribution for random tessellation forest estimators.
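To make the contrast in the overview concrete, here is a minimal illustrative sketch (not code from the lecture) of the two split rules it compares: a classic axis-aligned split, and an oblique split drawn from an isotropic directional distribution, as in a STIT-style tessellation. All function names and the uniform-on-the-sphere choice are illustrative assumptions.

```python
import numpy as np

def axis_aligned_split(X, rng):
    """Classic random-forest cut: a random coordinate axis and a threshold."""
    d = X.shape[1]
    axis = rng.integers(d)                      # pick one feature axis
    threshold = rng.uniform(X[:, axis].min(), X[:, axis].max())
    return X[:, axis] <= threshold              # boolean mask for one cell

def oblique_split(X, rng):
    """Oblique cut: a random hyperplane whose normal is drawn uniformly
    on the unit sphere (an isotropic directional distribution)."""
    d = X.shape[1]
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)      # normalize -> uniform on sphere
    proj = X @ direction                        # project data onto the normal
    threshold = rng.uniform(proj.min(), proj.max())
    return proj <= threshold

# Tiny demonstration on correlated 2-D data, where an oblique cut can
# align with the dependence between features while an axis cut cannot.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[1.0, 0.9], [0.0, 0.5]])
mask_axis = axis_aligned_split(X, rng)
mask_oblique = oblique_split(X, rng)
```

The lecture's point about the curse of dimensionality corresponds to replacing the isotropic (uniform-on-the-sphere) directional distribution above with one adapted to the function being learned, concentrating cuts in informative directions.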
Syllabus
Eliza O’Reilly: Random tessellation forests: overcoming the curse of dimensionality
Taught by
Hausdorff Center for Mathematics
Related Courses
Dimensionality Reduction in Python (DataCamp)
Sparse Nonlinear Dynamics Models with SINDy - The Library of Candidate Nonlinearities (Steve Brunton via YouTube)
Overcoming the Curse of Dimensionality and Mode Collapse - Ke Li (Institute for Advanced Study via YouTube)
Emergent Linguistic Structure in Deep Contextual Neural Word Representations - Chris Manning (Institute for Advanced Study via YouTube)
Multilevel Weighted Least Squares Polynomial Approximation - Sören Wolfers, KAUST (Alan Turing Institute via YouTube)