Multiple and Logistic Regression in R
Offered By: DataCamp
Course Description
Overview
In this course you'll learn to add multiple variables to linear models and to use logistic regression for classification.
In this course, you'll take your skills with simple linear regression to the next level. By learning multiple and logistic regression techniques, you will gain the skills to model and predict both numeric and categorical outcomes using multiple input variables. You'll also learn how to fit, visualize, and interpret these models. Then you'll apply your skills to learn about Italian restaurants in New York City!
Syllabus
Parallel Slopes
-In this chapter you'll learn about the class of linear models called "parallel slopes models." These include one numeric and one categorical explanatory variable.
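A minimal sketch of the kind of parallel slopes model this chapter describes, fit with R's lm(); the mpg data frame from the ggplot2 package stands in here as an illustrative example, not the course's own data:

  library(ggplot2)
  # one numeric (displ) and one categorical (year, coerced to a factor) explanatory variable
  mod <- lm(hwy ~ displ + factor(year), data = mpg)
  summary(mod)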
Evaluating and extending parallel slopes models
-This chapter covers model evaluation. By looking at different properties of the model, including the adjusted R-squared, you'll learn to compare models so that you can select the best one. You'll also learn about interaction terms in linear models.
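A brief sketch of how such a comparison might look in R, again using the illustrative ggplot2 mpg data rather than the course's: adjusted R-squared is read from summary(), and the * operator adds an interaction term on top of the main effects.

  library(ggplot2)
  mod_parallel    <- lm(hwy ~ displ + factor(year), data = mpg)  # parallel slopes
  mod_interaction <- lm(hwy ~ displ * factor(year), data = mpg)  # adds a displ:year interaction
  # compare adjusted R-squared to choose between the two models
  summary(mod_parallel)$adj.r.squared
  summary(mod_interaction)$adj.r.squared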
Multiple Regression
-This chapter will show you how to add two, three, and even more numeric explanatory variables to a linear model.
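As an illustrative sketch (using R's built-in mtcars data, not a course dataset), adding more numeric explanatory variables simply means extending the right-hand side of the model formula:

  # three numeric explanatory variables for one numeric response
  mod <- lm(mpg ~ disp + hp + wt, data = mtcars)
  coef(mod)
  summary(mod)$adj.r.squared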
Logistic Regression
-In this chapter you'll learn about using logistic regression, a generalized linear model (GLM), to predict a binary outcome and classify observations.
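A minimal sketch of a logistic regression fit with R's glm(), again on the built-in mtcars data as a stand-in example; am (automatic vs. manual transmission) serves as the binary outcome:

  # logistic regression: binomial family with a logit link
  mod <- glm(am ~ hp + wt, data = mtcars, family = binomial)
  # predicted probabilities, then classification at a 0.5 threshold
  prob <- predict(mod, type = "response")
  pred_class <- ifelse(prob > 0.5, 1, 0)
  table(pred_class, mtcars$am)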
Case Study: Italian restaurants in NYC
-Explore the relationship between price and the quality of food, service, and decor for Italian restaurants in NYC.
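A hedged sketch of the kind of model the case study builds; the data frame name nyc and its columns (Price, Food, Service, Decor) are assumptions for illustration, not necessarily the exact names used in the course:

  # regress price on ratings of food, service, and decor (nyc is a hypothetical data frame)
  mod <- lm(Price ~ Food + Service + Decor, data = nyc)
  summary(mod)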
Taught by
Ben Baumer
Related Courses
Macroeconometric Forecasting - International Monetary Fund via edX
Machine Learning With Big Data - University of California, San Diego via Coursera
Data Science at Scale - Capstone Project - University of Washington via Coursera
Structural Equation Model and its Applications (Cantonese) - The Chinese University of Hong Kong via Coursera
Data Science in Action - Building a Predictive Churn Model - SAP Learning