
Logistic Regression and Ensemble Learning - Bagging and Boosting - AdaBoost

Offered By: Software Engineering Courses - SE Courses via YouTube

Tags

Ensemble Learning Courses, Artificial Intelligence Courses, Machine Learning Courses, Probability Courses, Classification Courses, Logistic Regression Courses, AdaBoost Courses, Bagging Courses

Course Description

Overview

Dive into a comprehensive 44-minute lecture on Logistic Regression and Ensemble Learning techniques, focusing on Bagging, Boosting, and AdaBoost. Explore the fundamentals of probability and classification, and the differences between regression and classification. Gain insights into Ensemble Learning methods, understanding their benefits and applications. Examine independent classifiers, their pros and cons, and the role of randomness in bagging. Discover when bagging is most effective and delve into boosting techniques, comparing strong and weak learners. Learn about the basic algorithm training process, weighted voting, and normalizing constants. Conclude with an in-depth look at AdaBoost and how it combines Decision Stumps into a strong non-linear classifier. This lecture is part of a broader Artificial Intelligence and Machine Learning course, suitable for those with programming knowledge or experience with AI and ML tools.
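
As a rough, from-scratch sketch of the AdaBoost ideas mentioned above (decision stumps as weak learners, weighted voting, and the normalizing constant), the following Python/NumPy example may help; it is not taken from the lecture, and the toy dataset and helper names such as train_stump and adaboost are assumptions made here for illustration.

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustively pick the decision stump (feature, threshold, polarity)
    with the lowest weighted error. Labels y are in {-1, +1}."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = polarity * np.where(X[:, j] < thr, -1, 1)
                err = w[pred != y].sum()
                if err < best_err:
                    best, best_err = (j, thr, polarity), err
    return best, best_err

def stump_predict(stump, X):
    j, thr, polarity = stump
    return polarity * np.where(X[:, j] < thr, -1, 1)

def adaboost(X, y, n_rounds=20):
    """Each round: fit a weak learner on weighted data, give it a vote
    proportional to its accuracy, then re-weight the examples."""
    w = np.full(len(y), 1.0 / len(y))          # start with uniform example weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)                  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # this stump's weight in the final vote
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified examples
        w /= w.sum()                           # divide by the normalizing constant
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    # Weighted majority vote over all stumps
    votes = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(votes)

# Toy 1-D data whose labels (+ + - - + +) no single stump can match,
# but a weighted vote of several stumps can.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([1, 1, -1, -1, 1, 1])
stumps, alphas = adaboost(X, y, n_rounds=10)
print(predict(stumps, alphas, X))  # boosted predictions on the training points
```

For contrast with bagging as covered in the lecture: bagging trains independent classifiers on random bootstrap resamples and combines them with an unweighted vote, whereas AdaBoost trains its weak learners sequentially on reweighted data and combines them with the weighted vote shown above.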

Syllabus

Introduction
Probability
Classification
Regression vs Classification
Ensemble Learning
Benefits of Ensemble Learning
Independent Classifiers
Pros and Cons
Randomness
When Does Bagging Work?
Boosting
Strong vs Weak Learners
Basic Algorithm Training
Weighted Vote
Normalizing Constant
AdaBoost
Strong Non-Linear Classifier
Decision Stumps


Taught by

Software Engineering Courses - SE Courses

Related Courses

Introduction to Machine Learning
A Cloud Guru
Puppet Professional Certification - PPT206
A Cloud Guru
Advanced Machine Learning
The Open University via FutureLearn
AI and Machine Learning Essentials with Python
University of Pennsylvania via Coursera
Data Analysis (Анализ данных)
Novosibirsk State University via Coursera