XGBoost Part 4 - Crazy Cool Optimizations

Offered By: StatQuest with Josh Starmer via YouTube

Tags

XGBoost Courses, Classification Courses

Course Description

Overview

Explore advanced optimizations for XGBoost in this fourth and final video of the series. Dive into techniques for handling large training datasets, including the Approximate Greedy Algorithm, Parallel Learning, Weighted Quantile Sketch, Sparsity-Aware Split Finding, Cache-Aware Access, and Blocks for Out-of-Core Computation. Learn step-by-step how XGBoost efficiently manages missing data, utilizes default paths, and optimizes performance for massive datasets. Gain insights into the practical applications of these advanced concepts in machine learning and data analysis.
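The Approximate Greedy Algorithm mentioned above can be sketched briefly: instead of testing every possible threshold for a split, candidate thresholds are limited to quantile boundaries of the feature, and the one with the best XGBoost-style gain (similarity scores with regularization lambda) is chosen. The following is a minimal NumPy sketch of that idea; the function names, `n_bins`, and the toy data are illustrative assumptions, not code from the video.

```python
import numpy as np

def candidate_thresholds(feature, n_bins=4):
    """Approximate greedy idea: test only quantile boundaries of the
    feature instead of every observed value (illustrative sketch)."""
    qs = np.linspace(0, 1, n_bins + 1)[1:-1]  # interior quantiles only
    return np.unique(np.quantile(feature, qs))

def best_split(feature, residuals, lam=1.0, n_bins=4):
    """Pick the candidate threshold with the highest gain, where gain is
    computed from XGBoost-style similarity scores (squared residual sum
    over count plus the regularization term lambda)."""
    def sim(r):
        return (r.sum() ** 2) / (len(r) + lam) if len(r) else 0.0
    root = sim(residuals)
    best_t, best_gain = None, -np.inf
    for t in candidate_thresholds(feature, n_bins):
        left, right = residuals[feature < t], residuals[feature >= t]
        gain = sim(left) + sim(right) - root
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

# Toy usage: two clearly separated groups of residuals.
t, gain = best_split(np.array([1.0, 2, 3, 10, 11, 12]),
                     np.array([-1.0, -1, -1, 1, 1, 1]))
```

With four bins, only three interior quantile boundaries are scored rather than every gap between sorted feature values, which is the whole point of the approximation on large datasets.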

Syllabus

Intro
Overview
Greedy Algorithm Limitations
Approximate Greedy Algorithm
Weighted Quantile Sketch
What is a Weighted Quantile
Weighted Quantiles in Classification
Sparsity-Aware Split Finding
Cache-Aware Access
Blocks for Out-of-Core Computation
Random Subsets
Summary
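The Sparsity-Aware Split Finding topic in the syllabus concerns missing data: only non-missing rows are scanned when evaluating a split, and missing rows are sent down whichever branch (the "default path") gives the higher gain. A minimal NumPy sketch of that idea follows; the function name, threshold, and toy data are illustrative assumptions, not code from the video.

```python
import numpy as np

def split_with_default(feature, residuals, threshold, lam=1.0):
    """Sparsity-aware idea: score the split on non-missing rows, then try
    routing all missing rows left vs. right and keep the better default."""
    def sim(r):
        return (r.sum() ** 2) / (len(r) + lam) if len(r) else 0.0
    miss = np.isnan(feature)
    left = np.zeros_like(miss)
    left[~miss] = feature[~miss] < threshold   # compare only real values
    right = ~miss & ~left
    # Option 1: missing rows follow the left branch.
    gain_left = sim(residuals[left | miss]) + sim(residuals[right])
    # Option 2: missing rows follow the right branch.
    gain_right = sim(residuals[left]) + sim(residuals[right | miss])
    if gain_left >= gain_right:
        return "left", gain_left
    return "right", gain_right

# Toy usage: the missing row's residual matches the left group, so the
# learned default path should be "left".
direction, gain = split_with_default(
    np.array([1.0, 2, np.nan, 10, 11]),
    np.array([-1.0, -1, -1, 1, 1]),
    threshold=5.0)
```

Because the default direction is learned per split, XGBoost never needs to impute missing values before training.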


Taught by

StatQuest with Josh Starmer

Related Courses

Microbial Sociology (علم اجتماع المايكروبات)
King Saud University via Rwaq (رواق)
Statistical Learning with R
Stanford University via edX
More Data Mining with Weka
University of Waikato via Independent
The Caltech-JPL Summer School on Big Data Analytics
California Institute of Technology via Coursera
Machine Learning for Musicians and Artists
Goldsmiths University of London via Kadenze