XGBoost Part 2 - Classification
Offered By: StatQuest with Josh Starmer via YouTube
Course Description
Overview
Dive into the second part of a four-part video series on XGBoost, focusing on classification. Learn how XGBoost trees are constructed for classification problems, building on the regression concepts covered in part one. Explore key topics such as the initial prediction, similarity scores, tree building, gain calculation, cover for classification, pruning, and the role of logistic regression. Gain a deeper understanding of how XGBoost adapts its algorithm for classification tasks. The video assumes prior familiarity with XGBoost trees for regression, gradient boost for classification, odds and log-odds, and the logistic function.
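The similarity score, gain, and cover for classification mentioned above can be sketched as follows. This is a minimal illustration of the formulas StatQuest presents (residuals squared over previous probabilities times one minus those probabilities, plus the regularization term lambda); the function names are illustrative and are not the xgboost library's API.

```python
# Sketch of XGBoost's classification similarity score, cover, and gain.
# Names and structure are illustrative, not the xgboost library API.

def similarity_score(residuals, prev_probs, lam=0.0):
    """(sum of residuals)^2 / (sum of p*(1-p) + lambda)."""
    num = sum(residuals) ** 2
    den = sum(p * (1 - p) for p in prev_probs) + lam
    return num / den

def cover(prev_probs):
    """For classification, cover is the sum of p*(1-p) in the leaf."""
    return sum(p * (1 - p) for p in prev_probs)

def gain(left_res, left_p, right_res, right_p, lam=0.0):
    """Gain of a split = left similarity + right similarity - root similarity."""
    root = similarity_score(left_res + right_res, left_p + right_p, lam)
    return (similarity_score(left_res, left_p, lam)
            + similarity_score(right_res, right_p, lam)
            - root)

# Initial prediction: probability 0.5 (log-odds 0) for every sample,
# so the first residuals are simply label - 0.5.
probs = [0.5, 0.5, 0.5, 0.5]
labels = [0, 1, 1, 0]
residuals = [y - p for y, p in zip(labels, probs)]
print(similarity_score(residuals, probs))   # root similarity with lambda = 0
```

A split that groups the negative and positive residuals together scores a large gain, which is how the tree-building step chooses thresholds.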
Syllabus
Intro
Overview
Initial Prediction
Similarity Scores
Building a Tree
Similarity Score
Gain
Cover
Cover for Classification
Pruning
Classification
Logistic Regression
Summary
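The Classification and Logistic Regression steps in the syllabus revolve around converting between probabilities and log-odds: trees add their leaf outputs on the log-odds scale, and the logistic function maps the total back to a probability. A minimal sketch (the leaf value and the 0.3 learning rate here are illustrative assumptions):

```python
import math

def log_odds(p):
    """Convert a probability to log-odds: log(p / (1 - p))."""
    return math.log(p / (1 - p))

def logistic(x):
    """Logistic function: convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-x))

# An initial prediction of probability 0.5 corresponds to log-odds 0.
initial = log_odds(0.5)

# A tree's leaf output (hypothetical value) is scaled by the learning
# rate and added on the log-odds scale, then mapped back to a probability.
leaf_output = 1.2
learning_rate = 0.3
new_prob = logistic(initial + learning_rate * leaf_output)
```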
Taught by
StatQuest with Josh Starmer
Related Courses
Statistical Data Visualization with Seaborn (Coursera Project Network via Coursera)
Compare time series predictions of COVID-19 deaths (Coursera Project Network via Coursera)
Machine Learning con Python. Nivel Avanzado (Coursera Project Network via Coursera)
Complete Machine Learning with R Studio - ML for 2024 (Udemy)
Modern Artificial Intelligence Masterclass: Build 6 Projects (Udemy)