YoVDO

Continual Learning and Catastrophic Forgetting

Offered By: Paul Hand via YouTube

Tags

Artificial Intelligence, Algorithms, Deep Neural Networks, Catastrophic Forgetting

Course Description

Overview

Explore continual learning and catastrophic forgetting in deep neural networks through this 42-minute lecture. Delve into the context and evaluation of continual learning, and into algorithms based on regularization, dynamic architectures, and Complementary Learning Systems. Examine data-permutation tasks, incremental task learning, multimodal learning, the Learning without Forgetting algorithm, Elastic Weight Consolidation, Progressive Neural Networks, and generative replay. The lecture is drawn from Northeastern University's CS 7150 Deep Learning course and references key research papers in the field; accompanying lecture notes are available for a comprehensive treatment of this crucial topic in machine learning.
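As a rough illustration of the regularization-based approaches the lecture covers, here is a minimal NumPy sketch of the Elastic Weight Consolidation penalty. The function name and toy values are illustrative, not taken from the lecture; EWC itself adds a quadratic term to the new-task loss that anchors parameters important to the old task.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC penalty: (lam/2) * sum_i F_i * (theta_i - theta*_i)^2.
    theta_star are the parameters learned on the old task; fisher is a
    per-parameter (diagonal Fisher information) importance estimate, so
    weights that mattered for the old task are pulled back more strongly."""
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Toy usage: both weights drift by 0.5 while training on a new task,
# but the high-Fisher weight contributes far more to the penalty.
theta_star = np.array([1.0, 1.0])
fisher = np.array([10.0, 0.1])   # per-parameter importance (illustrative)
theta = np.array([1.5, 1.5])     # parameters after new-task training
print(ewc_penalty(theta, theta_star, fisher, lam=2.0))  # 2.525
```

In practice this penalty is added to the new task's training loss, which is how EWC trades off plasticity on the new task against stability on the old one.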

Syllabus

Introduction
Context for continual learning
Training on new data
Catastrophic forgetting
Training from scratch
Replaying training data
Evaluating continual learning
Incremental class learning
Multimodal class learning
Strategies for continual learning
Regularization approaches
Learning without forgetting
Regularization
Elastic Weight Consolidation
Bayesian Learning Perspective
Progressive Neural Networks
Generative replay
Complementary Learning Systems
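The replay-based entries in the syllabus above (replaying training data, generative replay) share one mechanism: mixing stored or generated old-task samples into each new-task batch so the network keeps seeing old data. A minimal sketch of that batch-mixing step, with hypothetical names and toy data:

```python
import random

def mixed_batch(new_batch, replay_buffer, k, rng=random):
    """Rehearsal sketch: extend a new-task batch with up to k samples
    drawn from a buffer of old-task examples. With generative replay,
    replay_buffer would instead hold samples drawn from a generative
    model trained on the old tasks."""
    replayed = rng.sample(replay_buffer, min(k, len(replay_buffer)))
    return list(new_batch) + replayed

# Toy usage: 8 new-task samples plus 4 replayed old-task samples.
buffer = [("old", i) for i in range(100)]
batch = [("new", i) for i in range(8)]
combined = mixed_batch(batch, buffer, k=4, rng=random.Random(0))
print(len(combined))  # 12
```

Training on such mixed batches is what lets the strategies above approximate joint training without retraining from scratch on all past data.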


Taught by

Paul Hand

Related Courses

Information Theory
The Chinese University of Hong Kong via Coursera
Intro to Computer Science
University of Virginia via Udacity
Analytic Combinatorics, Part I
Princeton University via Coursera
Algorithms, Part I
Princeton University via Coursera
Divide and Conquer, Sorting and Searching, and Randomized Algorithms
Stanford University via Coursera