YoVDO

Understanding and Overcoming the Statistical Limitations of Decision Trees

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Machine Learning Courses Deep Learning Courses Statistical Analysis Courses Decision Trees Courses Random Forests Courses

Course Description

Overview

Explore a comprehensive lecture on the statistical limitations of decision trees and innovative approaches to overcome them. Delve into the performance gap between decision trees and more complex machine learning methods such as random forests and deep learning. Examine sharp squared-error generalization lower bounds for decision trees fitted to sparse additive generative models, and discover how these bounds connect to rate-distortion theory. Learn about the proposed Fast Interpretable Greedy-Tree Sums (FIGS) algorithm, which extends CART to grow multiple trees simultaneously. Investigate FIGS' ability to disentangle additive model components, reduce redundant splits, and improve prediction performance. Review experimental results across a variety of datasets showcasing FIGS' superiority over other rule-based methods when the number of splits is limited. Gain insights into the application of FIGS in high-stakes domains, particularly its effectiveness in developing clinical decision instruments that outperform traditional tree-based methods by over 20%.
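To make the "sum of trees fitted greedily" idea concrete, here is a minimal, self-contained sketch of a greedy tree-sum fit on a single feature. It is a simplification, not the authors' FIGS implementation: each "tree" is a depth-1 stump added to the residuals in a forward-stagewise fashion (closer in spirit to stagewise additive modeling), whereas true FIGS also considers deepening every existing tree at each step and picks the single best split across all of them. All function names below are illustrative.

```python
def fit_stump(x, r):
    """Find the threshold split of a 1-D feature x that minimizes
    squared error against the residuals r."""
    best = None
    for t in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= t]
        right = [ri for xi, ri in zip(x, r) if xi > t]
        if not left or not right:
            continue  # a split must leave both sides non-empty
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((ri - ml) ** 2 for ri in left)
               + sum((ri - mr) ** 2 for ri in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    return best  # (sse, threshold, left_mean, right_mean)

def fit_tree_sum(x, y, n_trees=2):
    """Greedily fit a sum of stumps: each new stump is fitted to the
    residuals left by the stumps already in the sum."""
    stumps, r = [], list(y)
    for _ in range(n_trees):
        _, t, ml, mr = fit_stump(x, r)
        stumps.append((t, ml, mr))
        r = [ri - (ml if xi <= t else mr) for xi, ri in zip(x, r)]
    return stumps

def predict(stumps, xi):
    """The prediction is the sum of all stumps' contributions."""
    return sum(ml if xi <= t else mr for t, ml, mr in stumps)

# Example: a step function is recovered exactly by the first stump,
# so later stumps fitted to the (zero) residuals contribute nothing.
x = [0, 1, 2, 3]
y = [0.0, 0.0, 1.0, 1.0]
stumps = fit_tree_sum(x, y, n_trees=2)
```

Because each component of the sum is fitted to residuals, redundant splits across trees are discouraged, which is the intuition behind FIGS' ability to disentangle the components of a sparse additive model.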

Syllabus

Abhineet Agarwal - Understanding and overcoming the statistical limitations of decision trees


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

Statistical Learning with R
Stanford University via edX
The Analytics Edge
Massachusetts Institute of Technology via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
The Caltech-JPL Summer School on Big Data Analytics
California Institute of Technology via Coursera
Machine Learning Techniques (機器學習技法)
National Taiwan University via Coursera