Ensemble Methods in Python

Offered By: DataCamp

Tags

Python Courses, Machine Learning Courses, Ensemble Learning Courses, AdaBoost Courses, Bagging Courses

Course Description

Overview

Learn how to build advanced and effective machine learning models in Python using ensemble techniques such as bagging, boosting, and stacking.

Continue your machine learning journey by diving into the wonderful world of ensemble learning methods! Ensemble methods are an exciting class of machine learning techniques that combine multiple individual algorithms to boost performance and solve complex problems at scale across different industries. Ensemble techniques also regularly win online machine learning competitions!
In this course, you’ll learn all about these advanced ensemble techniques, such as bagging, boosting, and stacking. You’ll apply them to real-world datasets using cutting-edge Python machine learning libraries such as scikit-learn, XGBoost, CatBoost, and mlxtend.

Syllabus

  • Combining Multiple Models
    • Do you struggle to determine which of the models you built is the best for your problem? Give up on picking just one and use them all instead! In this chapter, you'll learn how to combine multiple models into one using "Voting" and "Averaging" (a minimal voting sketch follows this syllabus). You'll use these techniques to predict the ratings of apps on the Google Play Store, whether or not a Pokémon is legendary, and which characters are going to die in Game of Thrones!
  • Bagging
    • Bagging is the ensemble method behind powerful machine learning algorithms such as random forests. In this chapter, you'll learn the theory behind this technique and build your own bagging models using scikit-learn (see the bagging sketch after this syllabus).
  • Boosting
    • Boosting is a class of ensemble learning algorithms that includes award-winning models such as AdaBoost. In this chapter, you'll learn about this award-winning model and use it to predict the revenue of award-winning movies! You'll also learn about gradient boosting algorithms such as CatBoost and XGBoost (see the AdaBoost sketch after this syllabus).
  • Stacking
    • Get ready to see how things stack up! In this final chapter, you'll learn about the stacking ensemble method and how to implement it using scikit-learn as well as the mlxtend library (see the stacking sketch below). You'll apply stacking to predict the edibility of North American mushrooms, and revisit the ratings of Google apps with this more advanced approach.
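
A minimal "Voting" sketch, assuming scikit-learn's VotingClassifier; the breast cancer dataset and the three base models are illustrative assumptions, not the course's Google Play Store, Pokémon, or Game of Thrones exercises:

# Hard voting: each model casts one vote and the majority class wins.
# Passing voting="soft" would average predicted probabilities instead,
# which corresponds to the "Averaging" idea mentioned above.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

voter = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=5000)),
    ("dt", DecisionTreeClassifier(random_state=0)),
    ("knn", KNeighborsClassifier()),
])
voter.fit(X_train, y_train)
print("Voting accuracy:", voter.score(X_test, y_test))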
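
A bagging sketch under similar assumptions, using scikit-learn's BaggingClassifier with decision trees (the base-model argument is named estimator in recent scikit-learn releases and base_estimator in older ones):

from sklearn.datasets import load_wine
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 100 trees is fit on a bootstrap sample of the training set;
# the ensemble predicts by majority vote, the idea behind random forests.
bagger = BaggingClassifier(
    estimator=DecisionTreeClassifier(random_state=0),
    n_estimators=100,
    random_state=0,
)
bagger.fit(X_train, y_train)
print("Bagging accuracy:", bagger.score(X_test, y_test))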
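
An AdaBoost sketch with scikit-learn's AdaBoostClassifier; XGBoost and CatBoost are separate libraries with a comparable fit/score interface. The digits dataset is an assumption, not the course's movie-revenue data:

from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost fits shallow trees one after another, reweighting the training
# samples each round so later trees focus on previously misclassified points.
booster = AdaBoostClassifier(n_estimators=200, random_state=0)
booster.fit(X_train, y_train)
print("AdaBoost accuracy:", booster.score(X_test, y_test))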
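
A stacking sketch with scikit-learn's StackingClassifier (mlxtend's StackingClassifier follows a similar base-models-plus-meta-learner pattern); the dataset is again an assumption rather than the course's mushroom data:

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Cross-validated predictions from the first-level models become features
# for the final logistic regression, which learns how to combine them.
stacker = StackingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
    ],
    final_estimator=LogisticRegression(max_iter=5000),
)
stacker.fit(X_train, y_train)
print("Stacking accuracy:", stacker.score(X_test, y_test))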

Taught by

Román de las Heras

Related Courses

Detección de objetos (Object Detection)
Universitat Autònoma de Barcelona (Autonomous University of Barcelona) via Coursera
Ensemble Machine Learning in Python: Random Forest, AdaBoost
Udemy
Decision Trees, Random Forests, Bagging & XGBoost: R Studio
Udemy
Introduction to Machine Learning: Supervised Learning
University of Colorado Boulder via Coursera
Decision Trees, Random Forests, AdaBoost & XGBoost in Python
Udemy