Walking the Random Forest and Boosting the Trees
Offered By: EuroPython Conference via YouTube
Course Description
Overview
Explore tree-based ensemble models in this EuroPython 2018 conference talk by Kevin Lemagnen. Dive into the world of Random Forest and Gradient Boosting, two powerful machine learning techniques that leverage bagging and boosting, respectively. Learn how these ensemble models compare to Deep Learning and why they remain essential tools for data scientists. Discover their implementation in Python using popular libraries such as LightGBM, XGBoost, and scikit-learn. Gain insight into the theory behind these models and their practical application to a wide range of problems. Understand why ensemble models are often easier to tune and interpret than more complex alternatives. Follow along with the provided notebook to bridge the gap between theoretical concepts and hands-on implementation.
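
As a rough illustration of the workflow the talk covers, here is a minimal scikit-learn sketch that fits both a bagged ensemble (Random Forest) and a boosted ensemble (Gradient Boosting). It is not the talk's notebook: the dataset, hyperparameter values, and accuracy metric below are illustrative assumptions.

# Minimal sketch: bagging vs. boosting with scikit-learn.
# Dataset and hyperparameters are illustrative, not from the talk's notebook.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: each tree is grown on a bootstrap sample and the forest averages their votes.
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)

# Boosting: trees are added sequentially, each one correcting the errors of the ensemble so far.
boosted = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1, random_state=0)
boosted.fit(X_train, y_train)

print("Random Forest accuracy:", accuracy_score(y_test, forest.predict(X_test)))
print("Gradient Boosting accuracy:", accuracy_score(y_test, boosted.predict(X_test)))

The same two-model comparison could be run with XGBoost or LightGBM estimators in place of the scikit-learn ones, which is the substitution the talk's library discussion suggests.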
Syllabus
Introduction
Data
Random Forest
Boosting
Recommendations
Taught by
EuroPython Conference
Related Courses
Practical Machine Learning - Johns Hopkins University via Coursera
Detección de objetos - Universitat Autònoma de Barcelona (Autonomous University of Barcelona) via Coursera
Practical Machine Learning on H2O - H2O.ai via Coursera
Modélisez vos données avec les méthodes ensemblistes - CentraleSupélec via OpenClassrooms
Introduction to Machine Learning for Coders! - fast.ai via Independent