Joint Embedding Method and Latent Variable Energy Based Models
Offered By: Alfredo Canziani via YouTube
Syllabus
– Welcome to class
– Predictive models
– Multi-output system
– Notation factor graph
– The energy function F(x, y)
– Inference
– Implicit function
– Conditional EBM
– Unconditional EBM
– EBM vs. probabilistic models
– Do we need a y at inference?
– When inference is hard
– Joint embeddings
– Latent variables
– Inference with latent variables
– Energies E and F
– Preview on the EBM practicum
– From energy to probabilities
– Examples: K-means and sparse coding
– Limiting the information capacity of the latent variable
– Training EBMs
– Maximum likelihood
– How to pick β?
– Problems with maximum likelihood
– Other types of loss functions
– Generalised margin loss
– General group loss
– Contrastive joint embeddings
– Denoising or masked autoencoder
– Summary and final remarks
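The core ideas in the syllabus can be illustrated with a minimal sketch: an energy E(x, y, z) with a latent variable z, the derived free energy F(x, y) = min_z E(x, y, z), inference as an argmin over y, and the Gibbs distribution exp(−βF) that turns energies into probabilities. The quadratic toy energy and the grid search below are illustrative assumptions, not the lecture's actual model.

```python
import numpy as np

def E(x, y, z):
    """Toy energy E(x, y, z): low when y is close to x shifted by the latent z.
    The 0.1 * z**2 term limits the information capacity of the latent variable."""
    return (y - (x + z)) ** 2 + 0.1 * z ** 2

def F(x, y, z_grid):
    """Free energy F(x, y) = min over z of E(x, y, z) (grid search for simplicity)."""
    return min(E(x, y, z) for z in z_grid)

def infer(x, y_grid, z_grid):
    """Inference: pick the y with the lowest free energy."""
    return min(y_grid, key=lambda y: F(x, y, z_grid))

def gibbs(x, y_grid, z_grid, beta=1.0):
    """From energy to probabilities: softmax of -beta * F over candidate ys.
    beta controls how sharply probability concentrates on low-energy ys."""
    f = np.array([F(x, y, z_grid) for y in y_grid])
    w = np.exp(-beta * (f - f.min()))  # shift by the minimum for numerical stability
    return w / w.sum()

y_grid = np.linspace(-2.0, 2.0, 81)
z_grid = np.linspace(-1.0, 1.0, 41)
y_hat = infer(0.5, y_grid, z_grid)          # lowest-energy prediction for x = 0.5
p = gibbs(0.5, y_grid, z_grid, beta=4.0)    # distribution over candidate ys
```

With this toy energy the inferred y sits at x (reached with z = 0), and raising β makes the Gibbs distribution concentrate more tightly around that minimum, which is the trade-off the "How to pick β?" segment discusses.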
Taught by
Alfredo Canziani