Contrastive Methods and Regularised Latent Variable Models
Offered By: Alfredo Canziani via YouTube
Course Description
Overview
Dive into a comprehensive lecture on contrastive methods and regularized latent variable models in deep learning. Explore the advantages of contrastive methods in self-supervised learning, the architecture of denoising autoencoders, and other contrastive techniques such as contrastive divergence. Examine regularized latent variable Energy-Based Models (EBMs) in detail, covering both conditional and unconditional versions. Learn about algorithms such as ISTA, FISTA, and LISTA, and study examples of sparse coding and of the filters learned by convolutional sparse auto-encoders. Conclude with an in-depth discussion of Variational Auto-Encoders and their underlying concepts. This lecture, delivered by Yann LeCun, is part of a broader deep learning course and offers nearly two hours of advanced content for those interested in cutting-edge machine learning techniques.
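The overview names ISTA, FISTA, and LISTA in the context of sparse coding without spelling them out. As a rough illustration only (not code from the lecture, with the dictionary, step size, and sparsity weight chosen purely for demonstration), the NumPy sketch below shows plain ISTA: a gradient step on the reconstruction error followed by soft-thresholding, which drives most latent code entries to zero.

```python
import numpy as np

def ista(x, W, alpha=0.5, lr=0.05, n_steps=300):
    """Sparse coding via ISTA: approximately minimise ||x - W z||^2 + alpha * ||z||_1."""
    z = np.zeros(W.shape[1])
    for _ in range(n_steps):
        grad = W.T @ (W @ z - x)                                  # gradient of the reconstruction term
        z = z - lr * grad                                         # gradient step
        z = np.sign(z) * np.maximum(np.abs(z) - lr * alpha, 0.0)  # soft-thresholding (shrinkage)
    return z

# Toy usage with a random, column-normalised dictionary (illustrative values only).
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 256))
W /= np.linalg.norm(W, axis=0)       # unit-norm dictionary atoms keep the step size stable
x = rng.standard_normal(64)          # stand-in input signal
z = ista(x, W)
print("non-zero code entries:", np.count_nonzero(z))
```

FISTA adds a momentum term to speed up these iterations, and LISTA, also mentioned in the overview, amounts to unrolling a fixed number of such steps into a network with learned weights.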
Syllabus
– Week 8 – Lecture
– Recap on EBM and Characteristics of Different Contrastive Methods
– Contrastive Methods in Self-Supervised Learning
– Denoising Autoencoder and Other Contrastive Methods
– Overview of Regularized Latent Variable Energy-Based Models and Sparse Coding
– Convolutional Sparse Auto-Encoders
– Variational Auto-Encoders
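The syllabus items on denoising autoencoders and Variational Auto-Encoders are only named above. As a hedged sketch of the first idea (assuming a PyTorch setup, with layer sizes and noise level chosen purely for illustration, not taken from the lecture), a denoising autoencoder corrupts its input and is trained to reconstruct the clean version:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenoisingAutoencoder(nn.Module):
    """Minimal denoising autoencoder: reconstruct a clean input from a corrupted one."""
    def __init__(self, dim=784, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, dim)

    def forward(self, x, noise_std=0.3):
        x_noisy = x + noise_std * torch.randn_like(x)   # corrupt the input
        return self.decoder(self.encoder(x_noisy))

model = DenoisingAutoencoder()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.rand(32, 784)            # stand-in batch of flattened images
loss = F.mse_loss(model(x), x)     # target is the *clean* input, not the corrupted one
optimiser.zero_grad()
loss.backward()
optimiser.step()
```

In the energy-based framing used throughout the lecture, the reconstruction error plays the role of the energy, and the corruption process supplies points away from the data manifold.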
Taught by
Alfredo Canziani
Related Courses
A Path Towards Autonomous Machine Intelligence - Paper Explained (Yannic Kilcher via YouTube)
Author Interview - VOS: Learning What You Don't Know by Virtual Outlier Synthesis (Yannic Kilcher via YouTube)
Self-Supervised Learning - The Dark Matter of Intelligence (Yannic Kilcher via YouTube)
Backpropagation and Deep Learning in the Brain (Simons Institute via YouTube)
On the Critic Function of Implicit Generative Models - Arthur Gretton (Institute for Advanced Study via YouTube)