Machine Learning with Tree-Based Models in R

Offered By: DataCamp

Tags

R Programming Courses, Machine Learning Courses, Classification Courses, Decision Trees Courses, Random Forests Courses, ROC Curve Courses, Hyperparameter Tuning Courses

Course Description

Overview

Learn how to use tree-based models and ensembles to make classification and regression predictions with tidymodels.

Tree-based machine learning models can reveal complex non-linear relationships in data and often dominate machine learning competitions. In this course, you'll use the tidymodels package to explore and build different tree-based models—from simple decision trees to complex random forests. You’ll also learn to use boosted trees, a powerful machine learning technique that uses ensemble learning to build high-performing predictive models. Along the way, you'll work with health and credit risk data to predict the incidence of diabetes and customer churn.
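To give a flavor of the workflow the course teaches, here is a minimal tidymodels sketch of fitting and evaluating a single classification tree. The `diabetes` data frame and its `outcome` column are illustrative placeholders, not the course's actual dataset.

# Minimal sketch, assuming a data frame `diabetes` with a factor outcome
# column `outcome` and numeric predictors (names are illustrative only).
library(tidymodels)

set.seed(42)
split <- initial_split(diabetes, prop = 0.8, strata = outcome)
train <- training(split)
test  <- testing(split)

# Specify a classification tree with the rpart engine
tree_spec <- decision_tree() %>%
  set_engine("rpart") %>%
  set_mode("classification")

# Fit on the training set, then predict on the test set
tree_fit <- fit(tree_spec, outcome ~ ., data = train)
preds <- bind_cols(test, predict(tree_fit, new_data = test))

# Assess predictions with accuracy and a confusion matrix
accuracy(preds, truth = outcome, estimate = .pred_class)
conf_mat(preds, truth = outcome, estimate = .pred_class)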

Syllabus

  • Classification Trees
    • Ready to build a real machine learning pipeline? Complete step-by-step exercises to learn how to create decision trees, split your data, and predict which patients are most likely to suffer from diabetes. Last but not least, you’ll build performance measures to assess your models and judge your predictions.
  • Regression Trees and Cross-Validation
    • Ready for some candy? Use a chocolate rating dataset to build regression trees and assess their performance using suitable error measures. You’ll overcome the statistical uncertainty of a single train/test split by applying sweet techniques like cross-validation (sketched after this list) and then dive even deeper by mastering the bias-variance tradeoff.
  • Hyperparameters and Ensemble Models
    • Time to get serious with tuning your hyperparameters and interpreting receiver operating characteristic (ROC) curves. In this chapter, you’ll leverage the wisdom of the crowd with ensemble models such as bagging and random forests and build ensembles that forecast which credit card customers are most likely to churn (a cross-validation and tuning sketch follows this list).
  • Boosted Trees
    • Ready for the high society of tree-based models? Apply gradient boosting to create powerful ensembles that typically outperform the single trees and bagged models you’ve built so far. Learn how to fine-tune them and compare different models to pick a winner for production (a gradient boosting sketch also follows this list).
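
The cross-validation and tuning steps from the middle chapters can be sketched as follows with tidymodels. The `churn` data frame and its factor outcome `churned` are hypothetical stand-ins for the credit risk data, and the tuning grid size is arbitrary.

# Sketch of 5-fold cross-validation and grid tuning for a random forest;
# `churn` and `churned` are hypothetical placeholders.
library(tidymodels)

set.seed(42)
folds <- vfold_cv(churn, v = 5, strata = churned)

# Random forest with tunable mtry and min_n
rf_spec <- rand_forest(mtry = tune(), min_n = tune(), trees = 500) %>%
  set_engine("ranger") %>%
  set_mode("classification")

rf_wf <- workflow() %>%
  add_formula(churned ~ .) %>%
  add_model(rf_spec)

# Evaluate a tuning grid across the resamples, scoring by ROC AUC
rf_res <- tune_grid(rf_wf, resamples = folds, grid = 10,
                    metrics = metric_set(roc_auc))

show_best(rf_res, metric = "roc_auc")
autoplot(rf_res)  # visualize the hyperparameter search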
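
For the final chapter, a hedged sketch of a boosted tree model with the xgboost engine, fit on a train/test split and scored once on the held-out data; again, `churn` and `churned` are illustrative names only.

# Sketch of gradient boosting with the xgboost engine; `churn` and
# `churned` are hypothetical placeholders.
library(tidymodels)

set.seed(42)
split <- initial_split(churn, prop = 0.8, strata = churned)

boost_spec <- boost_tree(trees = 500, learn_rate = 0.05, tree_depth = 4) %>%
  set_engine("xgboost") %>%
  set_mode("classification")

boost_wf <- workflow() %>%
  add_formula(churned ~ .) %>%
  add_model(boost_spec)

# Fit on the training portion and score once on the held-out test set
boost_res <- last_fit(boost_wf, split = split,
                      metrics = metric_set(roc_auc, accuracy))
collect_metrics(boost_res)

Comparing the collect_metrics() output of the boosted tree with the tuned random forest above is one simple way to pick a model for production.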

Taught by

Sandro Raabe

Related Courses

Practical Machine Learning
Johns Hopkins University via Coursera
Detección de objetos (Object Detection)
Universitat Autònoma de Barcelona (Autonomous University of Barcelona) via Coursera
Practical Machine Learning on H2O
H2O.ai via Coursera
Modélisez vos données avec les méthodes ensemblistes (Model Your Data with Ensemble Methods)
CentraleSupélec via OpenClassrooms
Introduction to Machine Learning for Coders!
fast.ai via Independent