Intermediate Regression in R
Offered By: DataCamp
Course Description
Overview
Learn to perform linear and logistic regression with multiple explanatory variables.
Linear regression and logistic regression are the two most widely used statistical models and act like master keys, unlocking the secrets hidden in datasets. This course builds on the skills you gained in "Introduction to Regression in R", covering linear and logistic regression with multiple explanatory variables. Through hands-on exercises, you’ll explore the relationships between variables in real-world datasets such as Taiwan house prices and customer churn, and more. By the end of this course, you’ll know how to include multiple explanatory variables in a model, understand how interactions between variables affect predictions, and understand how linear and logistic regression work.
Syllabus
- Parallel Slopes
- Extend your linear regression skills to "parallel slopes" regression, with one numeric and one categorical explanatory variable. This is the first step towards conquering multiple linear regression (a minimal R sketch follows the syllabus).
- Interactions
- Explore the effect of interactions between explanatory variables (sketched in R after the syllabus). Considering interactions allows for more realistic models that can have better predictive power. You'll also deal with Simpson's Paradox: a non-intuitive result that arises when you have multiple explanatory variables.
- Multiple Linear Regression
- See how modeling, and linear regression in particular, makes it easy to work with more than two explanatory variables (see the third sketch below). Once you've mastered fitting linear regression models, you'll get to implement your own linear regression algorithm.
- Multiple Logistic Regression
- Extend your logistic regression skills to multiple explanatory variables (see the final sketch below). Understand the logistic distribution, which underpins this form of regression. Finally, implement your own logistic regression algorithm.
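The sketches below illustrate the techniques named in the syllabus. They are minimal examples rather than course material: they use R's built-in mtcars dataset as a stand-in, since the course's Taiwan house price and customer churn datasets aren't included here.

A parallel slopes model combines one numeric and one categorical explanatory variable without an interaction, so every category shares the same slope but gets its own intercept:

```r
# Parallel slopes: one numeric (wt) and one categorical (cyl, as a factor)
# explanatory variable. "+ 0" removes the global intercept so that each
# category receives its own intercept, while the slope for wt is shared.
mdl_parallel <- lm(mpg ~ wt + factor(cyl) + 0, data = mtcars)
coefficients(mdl_parallel)
```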
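Adding an interaction, as in the "Interactions" chapter, lets the slope differ between categories. In an R formula, `*` expands to the main effects plus their interaction:

```r
# Interaction model: wt * factor(cyl) expands to wt + factor(cyl) +
# wt:factor(cyl), so the effect of wt is allowed to differ across
# cylinder categories.
mdl_interaction <- lm(mpg ~ wt * factor(cyl) + 0, data = mtcars)
coefficients(mdl_interaction)
```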
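Multiple linear regression with more than two explanatory variables uses the same formula interface, just with more terms on the right-hand side:

```r
# Multiple linear regression with three numeric explanatory variables.
mdl_multi <- lm(mpg ~ wt + hp + disp, data = mtcars)
summary(mdl_multi)

# Prediction works exactly as in the single-variable case.
predict(mdl_multi, data.frame(wt = 3, hp = 150, disp = 200))
```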
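Multiple logistic regression swaps lm() for glm() with a binomial family. Here mtcars' binary am column (0 = automatic, 1 = manual) stands in for a churn response:

```r
# Multiple logistic regression on a binary response.
mdl_logit <- glm(am ~ wt + hp, data = mtcars, family = binomial)
summary(mdl_logit)

# Predicted probability of the "1" outcome for a new observation.
predict(mdl_logit, data.frame(wt = 2.5, hp = 120), type = "response")
```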
Taught by
Richie Cotton
Related Courses
- Big Data (University of Adelaide via edX)
- Advanced Reproducibility in Cancer Informatics (Johns Hopkins University via Coursera)
- Advanced R Programming (Johns Hopkins University via Coursera)
- Advanced Statistics for Data Science (Johns Hopkins University via Coursera)
- Fundamentos de Ciencia de Datos con R (Universidad Anáhuac via edX)