Building Regression Models with scikit-learn
Offered By: Pluralsight
Course Description
Overview
This course covers essential techniques, starting with ordinary least squares regression, moving on to Lasso, Ridge, and Elastic Net, and finishing with advanced techniques such as Support Vector Regression and Stochastic Gradient Descent Regression.
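As a rough illustration of the ground the course covers, the sketch below (not course material) shows that the regression families named above all share scikit-learn's fit/predict interface; the synthetic data and hyperparameter values are illustrative assumptions only.

```python
# Minimal sketch: the regression estimators covered in the course all
# expose the same fit/predict interface in scikit-learn.
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso, Ridge, ElasticNet, SGDRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic regression data: 200 samples, 5 features, Gaussian noise
rng = np.random.RandomState(0)
X = rng.randn(200, 5)
y = X @ np.array([1.5, -2.0, 0.0, 3.0, 0.5]) + rng.randn(200) * 0.5

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Illustrative hyperparameter values, not recommendations
models = {
    "Ordinary Least Squares": LinearRegression(),
    "Lasso (L1)": Lasso(alpha=0.1),
    "Ridge (L2)": Ridge(alpha=1.0),
    "Elastic Net (L1 + L2)": ElasticNet(alpha=0.1, l1_ratio=0.5),
    "Support Vector Regression": SVR(kernel="rbf", C=1.0),
    "Stochastic Gradient Descent": SGDRegressor(max_iter=1000, tol=1e-3),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: R^2 = {r2_score(y_test, model.predict(X_test)):.3f}")
```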
Regression is one of the most widely used modeling techniques and is much beloved by everyone from business professionals to data scientists. Using scikit-learn, you can implement virtually every important type of regression with ease. In this course, Building Regression Models with scikit-learn, you will gain the ability to enumerate the different types of regression algorithms and correctly implement them in scikit-learn. First, you will learn what regression seeks to achieve, and how the ubiquitous Ordinary Least Squares algorithm works under the hood. Next, you will discover how to implement techniques that mitigate overfitting, such as Lasso, Ridge, and Elastic Net regression. You will then understand other, more advanced forms of regression, including those using Support Vector Machines, Decision Trees, and Stochastic Gradient Descent. Finally, you will round out the course by understanding the hyperparameters that these various regression models possess, and how they can be optimized. When you are finished with this course, you will have the skills and knowledge to select the correct regression algorithm for the problem you are trying to solve, and to implement it correctly using scikit-learn.
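The hyperparameter optimization mentioned above is typically done with scikit-learn's cross-validated search utilities. The following is a minimal sketch, assuming an Elastic Net model and an illustrative parameter grid; the values are not recommendations from the course.

```python
# Minimal sketch of hyperparameter optimization with GridSearchCV.
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import GridSearchCV

# Synthetic data for illustration only
rng = np.random.RandomState(1)
X = rng.randn(300, 8)
y = X[:, 0] * 2.0 - X[:, 3] * 1.5 + rng.randn(300) * 0.3

# Grid over the two key Elastic Net hyperparameters:
# alpha (overall regularization strength) and l1_ratio (L1 vs. L2 mix)
param_grid = {
    "alpha": [0.01, 0.1, 1.0],
    "l1_ratio": [0.2, 0.5, 0.8],
}

search = GridSearchCV(ElasticNet(max_iter=10000), param_grid,
                      cv=5, scoring="neg_mean_squared_error")
search.fit(X, y)

print("Best hyperparameters:", search.best_params_)
print("Best CV score (neg MSE):", search.best_score_)
```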
Taught by
Janani Ravi
Related Courses
Artificial Intelligence for Robotics (Stanford University via Udacity)
Intro to Computer Science (University of Virginia via Udacity)
Design of Computer Programs (Stanford University via Udacity)
Web Development (Udacity)
Programming Languages (University of Virginia via Udacity)