Joint Embedding Method and Latent Variable Energy Based Models
Offered By: Alfredo Canziani via YouTube
Syllabus
– Welcome to class
– Predictive models
– Multi-output system
– Notation: factor graph
– The energy function F(x, y)
– Inference
– Implicit function
– Conditional EBM
– Unconditional EBM
– EBM vs. probabilistic models
– Do we need a y at inference?
– When inference is hard
– Joint embeddings
– Latent variables
– Inference with latent variables
– Energies E and F
– Preview on the EBM practicum
– From energy to probabilities
– Examples: K-means and sparse coding
– Limiting the information capacity of the latent variable
– Training EBMs
– Maximum likelihood
– How to pick β?
– Problems with maximum likelihood
– Other types of loss functions
– Generalised margin loss
– General group loss
– Contrastive joint embeddings
– Denoising or masked autoencoder
– Summary and final remarks
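The syllabus item "From energy to probabilities" refers to turning energies F(y) into a distribution via the Gibbs (Boltzmann) formula P(y) ∝ exp(−βF(y)), with β the inverse temperature discussed in "How to pick β?". A minimal sketch, assuming discrete candidate outputs (the function name and example energies are illustrative, not from the lecture):

```python
import math

def gibbs_probabilities(energies, beta=1.0):
    """Map energies F(y) to probabilities P(y) = exp(-beta * F(y)) / Z,
    where Z is the partition function (the normalising sum)."""
    # Shift by the minimum energy for numerical stability; this cancels in the ratio.
    f_min = min(energies)
    weights = [math.exp(-beta * (f - f_min)) for f in energies]
    z = sum(weights)  # partition function Z
    return [w / z for w in weights]

# Lower energy -> higher probability; a larger beta sharpens the distribution
# toward the minimum-energy output.
probs = gibbs_probabilities([0.0, 1.0, 2.0], beta=1.0)
```

As β grows, the distribution concentrates on the lowest-energy y, recovering energy-minimisation inference as a limiting case.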
Taught by
Alfredo Canziani
Related Courses
– Neural Networks for Machine Learning (University of Toronto via Coursera)
– 機器學習技法 (Machine Learning Techniques) (National Taiwan University via Coursera)
– Machine Learning Capstone: An Intelligent Application with Deep Learning (University of Washington via Coursera)
– Прикладные задачи анализа данных (Applied Problems of Data Analysis) (Moscow Institute of Physics and Technology via Coursera)
– Leading Ambitious Teaching and Learning (Microsoft via edX)