YoVDO

General Linear Models - Regression

Offered By: statisticsmatt via YouTube

Tags

Linear Regression Courses, Data Analysis Courses, Regression Analysis Courses, Model Evaluation Courses, Multiple Linear Regression Courses, Simple Linear Regression Courses, Ridge Regression Courses

Course Description

Overview

Dive deep into the world of General Linear Models with a comprehensive 17-hour course focusing on regression analysis. Explore simple and multiple linear regression techniques, covering topics such as least squares estimation, matrix notation, hypothesis testing, and model diagnostics. Learn to interpret ANOVA tables, calculate confidence intervals, and assess model fit using various criteria. Delve into advanced concepts like multicollinearity, weighted least squares, ridge regression, and transformations. Master the use of residuals, influence measures, and partial regression plots for model evaluation. Gain practical skills in implementing regression techniques using R programming. Equip yourself with a thorough understanding of linear models and their applications in statistical analysis.
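
To give a flavor of the "from scratch" R work described above, the short sketch below is a minimal illustration, not taken from the course materials: the data are simulated and the ridge penalty value is arbitrary. It computes the least squares estimates in matrix notation, estimates the residual variance, builds confidence intervals for the beta parameters, checks the result against R's built-in lm(), and ends with a naive ridge estimate.

    ## Simulated data; true model is y = 2 + 0.5 x + error
    set.seed(42)
    n <- 30
    x <- runif(n, 0, 10)
    y <- 2 + 0.5 * x + rnorm(n, sd = 1)

    ## Design matrix and least squares estimates (B0, B1) in matrix notation
    X <- cbind(1, x)
    b <- solve(t(X) %*% X, t(X) %*% y)

    ## Residuals, residual variance estimate, and standard errors of the LSEs
    e    <- y - X %*% b
    df   <- n - ncol(X)
    s2   <- sum(e^2) / df
    covb <- s2 * solve(t(X) %*% X)
    se   <- sqrt(diag(covb))

    ## 95% confidence intervals for the beta parameters
    ci <- cbind(estimate = as.vector(b),
                lower    = as.vector(b) - qt(0.975, df) * se,
                upper    = as.vector(b) + qt(0.975, df) * se)

    ## Check the hand-computed estimates against R's built-in fit
    fit <- lm(y ~ x)
    cbind(scratch = as.vector(b), lm = coef(fit))

    ## Naive ridge estimate: add a penalty k to the diagonal of X'X
    ## (k = 1 chosen arbitrarily for illustration)
    k <- 1
    b_ridge <- solve(t(X) %*% X + k * diag(ncol(X)), t(X) %*% y)

In practice the design matrix is centered and scaled before the ridge penalty is applied, so that the intercept is not shrunk; that is the motivation for the centering/scaling and canonical-form lectures listed in the syllabus below.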

Syllabus

Introduction to Linear Models.
Simple Linear Regression.
Simple Linear Regression: Properties of Least Squares Estimators.
Simple Linear Regression: Estimating the Residual Variance.
Simple Linear Regression: Matrix Notation.
Simple Linear Regression: Maximum Likelihood Estimation.
Simple Linear Regression: Partitioning Total Variability.
Simple Linear Regression: Matrix Notation for Sum of Squares.
Simple Linear Regression: ANOVA Table.
Simple Linear Regression: Testing the Model is Useful.
Simple Linear Regression: LSEs are Normally Distributed.
Simple Linear Regression: Confidence Intervals for Beta Parameters.
Simple Linear Regression: Coefficient of Determination.
Simple Linear Regression: Confidence and Prediction Intervals on the Mean and Individual Response.
Simple Linear Regression: Simultaneous Inference on B0 and B1.
Simple Linear Regression: Bonferroni and Working-Hotelling Adjustments.
Simple Linear Regression: Residuals and their Properties.
Simple Linear Regression: X and Y Random.
Simple Linear Regression: Test for the Correlation Coefficient.
Simple Linear Regression: Fixed Zero Intercept Model.
Multiple Linear Regression: Introduction.
Multiple Linear Regression: Least Squares Estimates.
Multiple Linear Regression: The Hat Matrix.
Multiple Linear Regression: Estimating the Error Variance.
Multiple Linear Regression: Projection and Idempotent Matrices.
Multiple Linear Regression: Gauss Markov Theorem.
Multiple Linear Regression: Partitioning Total Variability.
Multiple Linear Regression: Type I Sum of Squares.
Multiple Linear Regression: Type II Sum of Squares.
Multiple Linear Regression: Global F Test.
Multiple Linear Regression: Partial F Tests.
Multiple Linear Regression: t Tests for a Single Beta Parameter.
Multiple Linear Regression: General Linear Hypotheses.
Using R: Simple Linear Regression from Scratch.
Multiple Linear Regression: CI/PI on the Mean and Individual Response.
Multiple Linear Regression: Simultaneous Inference of B' = (B0, B1, ..., Bk).
Multiple Linear Regression: Partitioning the Residual Sum of Squares.
Multiple Linear Regression: Repeated Observations and Lack of Fit Test.
Multiple Linear Regression: Centering and Scaling the Design Matrix.
Multiple Linear Regression: Condition Number / Multicollinearity.
Multiple Linear Regression: Variance Inflation Factor (VIF) / Multicollinearity.
Multiple Linear Regression: Variance Proportions / Multicollinearity.
Multiple Linear Regression: Indicator / Dummy Variables.
Multiple Linear Regression: AIC (Akaike Information Criterion).
Multiple Linear Regression: Choosing a model with R2, Adjusted R2, and MSE.
Multiple Linear Regression: Mallows' Cp.
Multiple Linear Regression: Impact of Under or Over Fitting a Model.
Multiple Linear Regression: The PRESS (Prediction Sum of Squares) Statistic.
Multiple Linear Regression: Residual Properties.
Weighted Least Squares Regression: Mahalanobis Distance.
Weighted Least Squares Regression: Hat Matrix.
Weighted Least Squares Regression: Estimability / BLUE.
Weighted Least Squares Regression: Estimating the Error Variance.
Weighted Least Squares Regression: Testing for Estimable Functions.
Weighted Least Squares Regression: Partial F Tests.
Multiple Linear Regression: Canonical Form.
Multiple Linear Regression: Canonical Form and Multicollinearity.
Multiple Linear Regression: Principal Components Model.
Ridge Regression (part 1 of 4): Variance Reduction.
Ridge Regression (part 2 of 4): Deriving the Bias.
Ridge Regression (part 3 of 4): Deriving from First Principles.
Ridge Regression (part 4 of 4): Canonical Form.
Multiple Linear Regression: Box-Cox Transformation.
Multiple Linear Regression: Box-Tidwell Transformation.
Multiple Linear Regression: Studentized Residuals (Part 1 of 2).
Multiple Linear Regression: Studentized Residuals (Part 2 of 2).
Multiple Linear Regression: Partial Regression Plots (Added Variable Plots).
Multiple Linear Regression: Influence Measures (Part 1 of 2).
Multiple Linear Regression: Influence Measures (Part 2 of 2).
Best quadratic unbiased estimator of variance in an MLR model using Lagrange Multipliers.


Taught by

statisticsmatt

Related Courses

Learn the Basics of Machine Learning
Codecademy
Econometria Básica Aplicada
Universidade de São Paulo via Coursera
Incrementar - Parte 2 y Controlar
Tecnológico de Monterrey via Coursera
Linear Regression and Multiple Linear Regression in Julia
Coursera Project Network via Coursera
University Admission Prediction Using Multiple Linear Regression
Coursera Project Network via Coursera