Joint Embedding Method and Latent Variable Energy Based Models
Offered By: Alfredo Canziani via YouTube
Course Description
Overview
Syllabus
– Welcome to class
– Predictive models
– Multi-output system
– Notation: factor graph
– The energy function F(x, y)
– Inference
– Implicit function
– Conditional EBM
– Unconditional EBM
– EBM vs. probabilistic models
– Do we need a y at inference?
– When inference is hard
– Joint embeddings
– Latent variables
– Inference with latent variables
– Energies E and F
– Preview on the EBM practicum
– From energy to probabilities
– Examples: K-means and sparse coding
– Limiting the information capacity of the latent variable
– Training EBMs
– Maximum likelihood
– How to pick β?
– Problems with maximum likelihood
– Other types of loss functions
– Generalised margin loss
– General group loss
– Contrastive joint embeddings
– Denoising or masked autoencoder
– Summary and final remarks
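The syllabus items "From energy to probabilities" and "How to pick β?" refer to the standard Gibbs mapping, under which lower energy means higher probability and β controls how sharply the distribution concentrates on the minimum-energy configuration. A minimal sketch (illustrative, not taken from the course materials; the function name is hypothetical):

```python
import math

def energies_to_probs(energies, beta=1.0):
    """Map energies F(x, y) over a finite set of candidate y's to a
    Gibbs distribution: P(y | x) proportional to exp(-beta * F(x, y))."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function (normalizing constant)
    return [w / z for w in weights]

# Lower energy -> higher probability; larger beta sharpens the distribution.
probs = energies_to_probs([0.0, 1.0, 2.0], beta=1.0)
sharp = energies_to_probs([0.0, 1.0, 2.0], beta=5.0)
```

As β grows, the distribution approaches a point mass on the lowest-energy y, which is why the choice of β matters when training with maximum likelihood.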
Taught by
Alfredo Canziani
Related Courses
– Fundamentals of Quantitative Modeling (University of Pennsylvania via Coursera)
– Probability Theory: The Science of Randomness (Tomsk State University via Stepik)
– Statistics and Data Science (Massachusetts Institute of Technology via edX)
– Natural Language Processing with Probabilistic Models (DeepLearning.AI via Coursera)
– Natural Language Processing (DeepLearning.AI via Coursera)