Neural Nets for NLP 2019 - Unsupervised and Semi-supervised Learning of Structure
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Supervised, Unsupervised, Semi-supervised
Learning Features vs. Learning Discrete Structure
Unsupervised Feature Learning (Review)
How do we Use Learned Features?
What About Discrete Structure?
What is our Objective?
A Simple First Attempt
Hidden Markov Models w/ Gaussian Emissions: Instead of parameterizing each state with a categorical distribution, we can use a Gaussian (or Gaussian mixture)!
Problem: Embeddings May Not be Indicative of Syntax
Normalizing Flow (Rezende and Mohamed 2015)
Soft vs. Hard Tree Structure
One Other Paradigm: Weak Supervision
Gated Convolution (Cho et al. 2014)
Learning with RL (Yogatama et al. 2016)
Difficulties in Learning Latent Structure (Williams et al. 2018)
Phrase Structure vs. Dependency Structure
Learning Dependency Heads w/ Attention (Kuncoro et al. 2017)
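The Gaussian-emission HMM mentioned in the syllabus can be made concrete with a short sketch. The code below (an illustrative assumption, not material from the lecture; all parameter values and function names are invented for the example) runs the forward algorithm in log space over a sequence of 2-d "word embedding" observations, where each hidden state emits from a diagonal Gaussian rather than a categorical distribution over a vocabulary.

```python
import numpy as np

def log_gaussian(x, mu, var):
    """Log density of a diagonal Gaussian evaluated at each row of x."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var, axis=-1)

def forward_log_likelihood(obs, pi, A, mus, vars_):
    """Forward algorithm in log space; returns log p(obs) under the HMM."""
    K = len(pi)
    # Log emission scores for every (time step, state) pair.
    log_b = np.stack([log_gaussian(obs, mus[k], vars_[k]) for k in range(K)], axis=1)
    log_alpha = np.log(pi) + log_b[0]
    for t in range(1, len(obs)):
        # log-sum-exp over previous states, folded into a matrix product.
        m = log_alpha.max()
        log_alpha = m + np.log(np.exp(log_alpha - m) @ A) + log_b[t]
    m = log_alpha.max()
    return m + np.log(np.sum(np.exp(log_alpha - m)))

# Toy model: two hidden "syntactic" states emitting 2-d embeddings.
pi = np.array([0.5, 0.5])                      # initial state distribution
A = np.array([[0.9, 0.1], [0.2, 0.8]])         # transition matrix
mus = np.array([[0.0, 0.0], [3.0, 3.0]])       # per-state emission means
vars_ = np.ones((2, 2))                        # per-state diagonal variances

obs = np.array([[0.1, -0.2], [0.0, 0.3], [2.9, 3.1], [3.2, 2.8]])
ll = forward_log_likelihood(obs, pi, A, mus, vars_)
print(ll)
```

In an unsupervised setting these emission parameters would be fit with EM or gradient ascent on this marginal likelihood; the sketch only shows the inference direction the slide title refers to.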
Taught by
Graham Neubig
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
Good Brain, Bad Brain: Basics - University of Birmingham via FutureLearn
Statistical Learning with R - Stanford University via edX
Machine Learning 1—Supervised Learning - Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks - Harvard University via edX