10 Decision Trees are Better Than 1 - Random Forest and AdaBoost
Offered By: Shaw Talebi via YouTube
Course Description
Overview
Explore the power of combining multiple decision trees into tree ensembles in this informative video. Delve into the two main types of tree ensembles: bagging (Random Forest) and boosting (AdaBoost, Gradient Boosting, XGBoost). Discover the three key benefits of using tree ensembles in machine learning. Follow along with a practical example of breast cancer prediction using ensemble methods. Access additional resources, including a blog post and example code, to further enhance your understanding of decision tree ensembles. Part of a comprehensive series on decision trees, this 17-minute tutorial provides valuable insights for both beginners and experienced data scientists looking to improve their predictive modeling skills.
Syllabus
Intro
Tree Ensembles
2 Types of Tree Ensembles
1) Bagging (Random Forest)
2) Boosting (AdaBoost, Gradient Boosting, XGBoost)
3 Benefits of Tree Ensembles
Example Code: Breast Cancer Prediction (see the sketch below)
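The video's actual example code is linked from its additional resources rather than reproduced here, but the bagging-versus-boosting contrast it covers is easy to sketch. The following is a minimal illustration, not the instructor's code: it compares a Random Forest (bagging) with AdaBoost (boosting) on scikit-learn's built-in breast cancer dataset, which is assumed here as a stand-in for the dataset used in the video; the hyperparameters are likewise illustrative.

# Minimal sketch (assumed setup, not the video's code): bagging vs. boosting
# on scikit-learn's built-in breast cancer dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Binary classification task: malignant vs. benign tumors.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Bagging: deep trees trained independently on bootstrap samples,
# with predictions averaged across the forest.
rf = RandomForestClassifier(n_estimators=100, random_state=42)
rf.fit(X_train, y_train)

# Boosting: shallow trees (decision stumps by default) trained sequentially,
# each round upweighting the samples its predecessors misclassified.
ada = AdaBoostClassifier(n_estimators=100, random_state=42)
ada.fit(X_train, y_train)

print("Random Forest accuracy:", accuracy_score(y_test, rf.predict(X_test)))
print("AdaBoost accuracy:", accuracy_score(y_test, ada.predict(X_test)))

The design difference to notice is the training strategy rather than the scores: the Random Forest builds its trees in parallel and reduces variance by averaging, while AdaBoost builds them one after another, each correcting the errors of the ensemble so far.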
Taught by
Shaw Talebi
Related Courses
Compare time series predictions of COVID-19 deaths (Coursera Project Network via Coursera)
Credit Risk Modeling in Python (DataCamp)
Extreme Gradient Boosting with XGBoost (DataCamp)
Supervised Learning in R: Regression (DataCamp)
Intermediate Machine Learning (Kaggle)