AdaBoost, Clearly Explained

Offered By: StatQuest with Josh Starmer via YouTube

Tags

Machine Learning Courses
Classification Courses
Decision Trees Courses
Random Forests Courses
AdaBoost Courses

Course Description

Overview

Dive into a comprehensive video tutorial that demystifies AdaBoost, a powerful machine learning algorithm. Learn how this method builds on ideas from decision trees and random forests to create a robust ensemble of weak learners called stumps. Explore the three main ideas behind AdaBoost: building stumps with the GINI index, determining the "Amount of Say" for each stump, and updating sample weights. Follow along as the tutorial guides you through normalizing the weights, creating subsequent stumps, and using the ensemble to make classifications. Gain a clear understanding of AdaBoost's inner workings through step-by-step explanations, visual aids, and a thorough review of key concepts.
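
The video works through these steps on a small worked example; the snippet below is only a minimal from-scratch sketch of the same three ideas. It is not the tutorial's code, and the function names, toy API, and the use of scikit-learn's DecisionTreeClassifier as the stump are assumptions made for illustration.

```python
# Minimal sketch of the three main ideas covered in the video:
# (1) build a stump (a depth-1 tree whose split is chosen by the Gini index),
# (2) give each stump an "Amount of Say" based on its total weighted error,
# (3) update the sample weights and normalize them before building the next stump.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_adaboost_stumps(X, y, n_stumps=5):
    """y must use -1/+1 labels; returns the stumps and their Amounts of Say."""
    n_samples = len(y)
    weights = np.full(n_samples, 1.0 / n_samples)   # all samples start equally weighted
    stumps, says = [], []
    for _ in range(n_stumps):
        stump = DecisionTreeClassifier(max_depth=1, criterion="gini")
        stump.fit(X, y, sample_weight=weights)       # a "stump" is a one-split tree
        pred = stump.predict(X)
        total_error = np.clip(weights[pred != y].sum(), 1e-10, 1 - 1e-10)
        say = 0.5 * np.log((1 - total_error) / total_error)   # Amount of Say
        weights *= np.exp(-say * y * pred)           # raise weights of misclassified samples
        weights /= weights.sum()                     # normalize so the weights sum to 1
        stumps.append(stump)
        says.append(say)
    return stumps, says

def adaboost_predict(stumps, says, X):
    """Each stump votes with its Amount of Say; the sign of the weighted sum wins."""
    votes = sum(say * stump.predict(X) for stump, say in zip(stumps, says))
    return np.sign(votes)
```

In practice, scikit-learn's AdaBoostClassifier, whose default base learner is a depth-1 decision tree, packages the same procedure, though its exact weight-update details differ slightly from this sketch.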

Syllabus

Awesome song and introduction
The three main ideas behind AdaBoost
Review of the three main ideas
Building a stump with the GINI index
Determining the Amount of Say for a stump
Updating sample weights
Normalizing the sample weights
Using the normalized weights to make the second stump
Using stumps to make classifications
Review of the three main ideas behind AdaBoost
Note: the Amount of Say for Chest Pain = 1/2 * log((1 - 3/8) / (3/8)) = 1/2 * log((5/8) / (3/8)) = 1/2 * log(5/3) ≈ 0.25, not 0.42.
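
For reference, plugging the corrected Total Error of 3/8 into the Amount of Say formula from the video, and assuming the natural logarithm (which is what reproduces the 0.25 figure), gives:

```python
# Check of the corrected Amount of Say for the Chest Pain stump (Total Error = 3/8).
import math

total_error = 3 / 8
amount_of_say = 0.5 * math.log((1 - total_error) / total_error)   # 1/2 * ln(5/3)
print(round(amount_of_say, 3))   # 0.255 -- roughly 0.25, not 0.42
```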


Taught by

StatQuest with Josh Starmer

Related Courses

Statistical Learning with R
Stanford University via edX
The Analytics Edge
Massachusetts Institute of Technology via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
The Caltech-JPL Summer School on Big Data Analytics
California Institute of Technology via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera