XGBoost Part 2 - Classification
Offered By: StatQuest with Josh Starmer via YouTube
Course Description
Overview
Dive into the second part of a four-part video series on XGBoost, this time focused on classification. Learn how XGBoost trees are constructed for classification problems, building on the regression concepts covered in Part 1. Key topics include the initial prediction, similarity scores, tree building, the gain calculation, cover for classification, pruning, and the role of logistic regression in converting the model's output into probabilities. The video assumes prior knowledge of XGBoost trees for regression, gradient boost for classification, odds and log-odds, and the logistic function.
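The similarity-score, gain, and cover calculations mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not XGBoost's actual implementation: the toy residuals and the choice of λ = 0 are assumptions for the example, not values from the video.

```python
# Sketch of XGBoost's classification similarity score, gain, and cover
# for a single candidate split on a toy dataset (values are illustrative).

def similarity(residuals, prev_probs, lam=0.0):
    """Similarity = (sum of residuals)^2 / (sum of p*(1-p) + lambda),
    where p is each observation's previously predicted probability."""
    numerator = sum(residuals) ** 2
    denominator = sum(p * (1 - p) for p in prev_probs) + lam
    return numerator / denominator

def cover(prev_probs):
    """Cover for classification = sum of p*(1-p); XGBoost compares this
    to min_child_weight when deciding whether a leaf is allowed."""
    return sum(p * (1 - p) for p in prev_probs)

# Toy residuals: observed label (0 or 1) minus the initial prediction 0.5.
residuals = [-0.5, 0.5, 0.5, -0.5]
probs = [0.5] * 4  # every observation starts at the default prediction 0.5

# Candidate split: first observation goes left, the rest go right.
root_sim = similarity(residuals, probs)
left_sim = similarity(residuals[:1], probs[:1])
right_sim = similarity(residuals[1:], probs[1:])

# Gain = Left similarity + Right similarity - Root similarity.
gain = left_sim + right_sim - root_sim
```

With these numbers the root similarity is 0 (the residuals cancel), the left leaf scores 1.0, the right leaf 1/3, so the split's gain is 4/3; pruning would then compare this gain against the complexity parameter γ.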
Syllabus
Intro
Overview
Initial Prediction
Similarity Scores
Building a Tree
Similarity Score
Gain
Cover
Cover for Classification
Pruning
Classification
Logistic Regression
Summary
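The last syllabus items, classification output and logistic regression, come together when a leaf's output value updates the prediction on the log-odds scale and the logistic function converts it back to a probability. A minimal sketch, assuming the default initial prediction of 0.5, λ = 0, and the default learning rate of 0.3:

```python
import math

def leaf_output(residuals, prev_probs, lam=0.0):
    """Leaf output value = sum(residuals) / (sum of p*(1-p) + lambda)."""
    return sum(residuals) / (sum(p * (1 - p) for p in prev_probs) + lam)

eta = 0.3                  # learning rate (assumed XGBoost default)
initial_prob = 0.5         # default initial prediction for classification
initial_log_odds = math.log(initial_prob / (1 - initial_prob))  # = 0.0

# One observation with label 1 landing alone in a leaf:
residuals = [1 - initial_prob]                      # residual = 0.5
output = leaf_output(residuals, [initial_prob])     # 0.5 / 0.25 = 2.0

# New prediction: add the scaled output on the log-odds scale, then
# squeeze back into a probability with the logistic function.
new_log_odds = initial_log_odds + eta * output      # 0 + 0.3 * 2 = 0.6
new_prob = 1 / (1 + math.exp(-new_log_odds))
```

The new predicted probability rises above 0.5 toward the observed label of 1, and later trees repeat the process on the updated residuals.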
Taught by
StatQuest with Josh Starmer
Related Courses
Statistical Learning with R
Stanford University via edX
The Analytics Edge
Massachusetts Institute of Technology via edX
Regression Models
Johns Hopkins University via Coursera
Introduction à la statistique avec R
Université Paris SUD via France Université Numérique
Statistical Reasoning for Public Health 2: Regression Methods
Johns Hopkins University via Coursera