Continual Learning and Catastrophic Forgetting

Offered By: Paul Hand via YouTube

Tags

Artificial Intelligence, Algorithms, Deep Neural Networks, Catastrophic Forgetting

Course Description

Overview

Explore continual learning and catastrophic forgetting in deep neural networks through this 42-minute lecture. Delve into the context, evaluation methods, and algorithms based on regularization, dynamic architectures, and Complementary Learning Systems. Examine data permutation tasks, incremental task learning, multimodal learning, the Learning without Forgetting algorithm, Elastic Weight Consolidation, Progressive Neural Networks, and generative replay. Gain insights from Northeastern University's CS 7150 Deep Learning course, with references to key research papers in the field. Access the accompanying lecture notes for a comprehensive understanding of this crucial topic in machine learning.
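Of the regularization-based algorithms named above, Elastic Weight Consolidation penalizes changes to weights that were important for earlier tasks. The following is a minimal sketch in PyTorch, not the lecture's own code: it uses the common "empirical Fisher" simplification (squared gradients of the old-task loss), and the names estimate_fisher_diag, ewc_penalty, old_params, and lam are illustrative.

    import torch

    def estimate_fisher_diag(model, data_loader, loss_fn):
        # Diagonal empirical Fisher estimate: average squared gradients
        # of the old-task loss, one entry per parameter tensor.
        fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        for x, y in data_loader:
            model.zero_grad()
            loss_fn(model(x), y).backward()
            for n, p in model.named_parameters():
                if p.grad is not None:
                    fisher[n] += p.grad.detach() ** 2
        return {n: f / len(data_loader) for n, f in fisher.items()}

    def ewc_penalty(model, fisher_diag, old_params, lam=1.0):
        # Quadratic penalty that discourages parameters from drifting
        # away from their old-task values, weighted by importance
        # (the Fisher value) of each parameter.
        loss = torch.zeros(())
        for n, p in model.named_parameters():
            if n in fisher_diag:
                loss = loss + (fisher_diag[n] * (p - old_params[n]) ** 2).sum()
        return (lam / 2.0) * loss

When training on a new task, the objective would then be the new task's loss plus ewc_penalty(model, fisher_diag, old_params), where old_params holds a detached copy of the weights saved after the previous task.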

Syllabus

Introduction
Context for continual learning
Training on new data
Catastrophic forgetting
Training from scratch
Replaying training data
Evaluating continual learning
Incremental class learning
Multimodal class learning
Strategies for continual learning
Regularization approaches
Learning without Forgetting
Regularization
Elastic Weight Consolidation
Bayesian Learning Perspective
Progressive Neural Networks
Generative replay
Complementary Learning Systems
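For the "Evaluating continual learning" portion, one standard benchmark of the kind the overview calls a data permutation task applies a fixed random pixel permutation to the same dataset for each task, so task boundaries are well defined. A minimal sketch under that assumption (make_permuted_tasks is an illustrative name, not from the lecture):

    import numpy as np

    def make_permuted_tasks(images, num_tasks, seed=0):
        # images: array of shape (n, d), flattened pixels.
        # Returns one copy of the data per task, each with its own
        # fixed random permutation of the pixel dimensions.
        rng = np.random.default_rng(seed)
        tasks = []
        for _ in range(num_tasks):
            perm = rng.permutation(images.shape[1])  # fixed per task
            tasks.append(images[:, perm])
        return tasks

A model is then trained on the tasks in sequence, and catastrophic forgetting shows up as a drop in accuracy on earlier permutations after training on later ones.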


Taught by

Paul Hand

Related Courses

Active Dendrites Avoid Catastrophic Forgetting - Interview With the Authors
Yannic Kilcher via YouTube
Avoiding Catastrophe - Active Dendrites Enable Multi-Task Learning in Dynamic Environments
Yannic Kilcher via YouTube
Supermasks in Superposition - Paper Explained
Yannic Kilcher via YouTube
What Kind of AI Can Help Manufacturing Adapt to a Pandemic
Open Data Science via YouTube
Rethinking Architecture Design for Data Heterogeneity in FL - Liangqiong Qu
Stanford University via YouTube