Neural Nets for NLP 2019 - Unsupervised and Semi-supervised Learning of Structure
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Supervised, Unsupervised, Semi-supervised
Learning Features vs. Learning Discrete Structure
Unsupervised Feature Learning (Review)
How do we Use Learned Features?
What About Discrete Structure?
What is our Objective?
A Simple First Attempt
Hidden Markov Models w/ Gaussian Emissions - Instead of parameterizing each state with a categorical distribution, we can use a Gaussian (or Gaussian mixture)! (see the sketch after this syllabus)
Problem: Embeddings May Not be Indicative of Syntax
Normalizing Flow (Rezende and Mohamed 2015)
Soft vs. Hard Tree Structure
One Other Paradigm: Weak Supervision
Gated Convolution (Cho et al. 2014)
Learning with RL (Yogatama et al. 2016)
Difficulties in Learning Latent Structure (Williams et al. 2018)
Phrase Structure vs. Dependency Structure
Learning Dependency Heads w/ Attention (Kuncoro et al. 2017)
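To make the Gaussian-emission HMM item above concrete, here is a minimal sketch (not code from the lecture): it assumes the observations are pretrained word embeddings, and that hypothetical arrays means and covs hold one Gaussian per hidden state. It only computes the emission log-probabilities that would replace categorical emission scores inside standard forward-backward / Baum-Welch training for unsupervised induction.

    import numpy as np
    from scipy.stats import multivariate_normal

    def gaussian_emission_log_probs(embeddings, means, covs):
        """Return a (T, K) matrix of log p(x_t | z_t = k) for K hidden states.

        embeddings: (T, d) word vectors for one sentence (assumed pretrained)
        means:      (K, d) per-state Gaussian means (hypothetical parameters)
        covs:       (K, d, d) per-state covariance matrices
        """
        T, K = embeddings.shape[0], means.shape[0]
        log_probs = np.empty((T, K))
        for k in range(K):
            # State k scores every embedded token under its own Gaussian,
            # instead of a categorical distribution over the vocabulary.
            log_probs[:, k] = multivariate_normal.logpdf(
                embeddings, mean=means[k], cov=covs[k])
        return log_probs

    # Toy usage: 5 tokens, 50-dim embeddings, 10 hidden states
    T, d, K = 5, 50, 10
    emb = np.random.randn(T, d)
    means = np.random.randn(K, d)
    covs = np.stack([np.eye(d)] * K)  # identity covariances for simplicity
    scores = gaussian_emission_log_probs(emb, means, covs)  # shape (5, 10)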
Taught by
Graham Neubig
Related Courses
機械学習・深層学習 (Machine Learning and Deep Learning) (ga120) - Waseda University via gacco
What are GANs actually - from underlying math to python code - Udemy
Artificial Intelligence Foundations: Machine Learning - LinkedIn Learning
HyperTransformer - Model Generation for Supervised and Semi-Supervised Few-Shot Learning - Yannic Kilcher via YouTube
Big Self-Supervised Models Are Strong Semi-Supervised Learners - Yannic Kilcher via YouTube