Neural Nets for NLP 2019 - Advanced Search Algorithms
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Why search?
Basic Pruning Methods (Steinbiss et al. 1994)
Prediction-based Pruning Methods (e.g. Stern et al. 2017)
Backtracking-based Pruning Methods
What beam size should we use?
Variable length output sequences. In many tasks (e.g. MT), the output sequences will be of variable length
More complicated normalization: Google's Neural Machine Translation System: Bridging the Gap
Predict the output length (Eriguchi et al. 2016)
Why do Bigger Beams Hurt, pt. 2
Dealing with disparity in actions: Effective Inference for Generative Neural Parsing (Mitchell Stern et al., 2017)
Solution
Improving Diversity in top N Choices
Improving Diversity through Sampling
Sampling without Replacement (cont'd)
Monte-Carlo Tree Search: Human-like Natural Language Generation Using Monte Carlo Tree Search
More beam search in training: A Continuous Relaxation of Beam Search for End-to-end Training of Neural Sequence Models (Goyal et al., 2017)
Adoption with neural networks: CCG Parsing
Is the heuristic admissible? (Lee et al. 2016)
Estimating future costs (Li et al., 2017)
Actor Critic (Bahdanau et al., 2017)
Actor Critic (continued)
A* search: benefits and drawbacks
Particle Filters (Buys et al., 2015)
Reranking (Dyer et al. 2016)
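The syllabus above centers on beam search with pruning and length normalization for variable-length outputs. As a minimal illustrative sketch (the function names, the toy model, and the GNMT-style length penalty with alpha=0.6 are assumptions for illustration, not code from the lecture):

```python
import math

def beam_search(step_fn, start, eos, beam_size=3, max_len=10, alpha=0.6):
    """Illustrative beam search with length-normalized final scoring.

    step_fn(prefix) -> {token: log-probability} for the next token.
    Finished hypotheses are ranked by score / length_penalty, using a
    GNMT-style penalty ((5 + |Y|)^alpha / 6^alpha), so that longer
    outputs are not unfairly penalized by summed log-probabilities.
    """
    beams = [(0.0, [start])]  # (cumulative log-prob, token sequence)
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == eos:
                candidates.append((score, seq))  # carry finished hypotheses along
            else:
                for tok, logp in step_fn(tuple(seq)).items():
                    candidates.append((score + logp, seq + [tok]))
        # pruning: keep only the top-k hypotheses at each step
        beams = sorted(candidates, key=lambda c: -c[0])[:beam_size]
        if all(seq[-1] == eos for _, seq in beams):
            break

    def length_penalty(seq):
        return (5 + len(seq)) ** alpha / (5 + 1) ** alpha

    finished = [b for b in beams if b[1][-1] == eos] or beams
    return max(finished, key=lambda c: c[0] / length_penalty(c[1]))

# Toy next-token model: after "<s>" choose "a" (p=0.6) or "b" (p=0.4),
# then always emit "</s>".
toy = {("<s>",): {"a": math.log(0.6), "b": math.log(0.4)}}
def step(prefix):
    return toy.get(prefix, {"</s>": 0.0})  # log(1.0) = 0.0

score, seq = beam_search(step, "<s>", "</s>", beam_size=2, max_len=5)
# seq == ["<s>", "a", "</s>"]
```

Keeping only the top-k candidates at each step is the basic pruning the syllabus refers to; the length penalty illustrates the "more complicated normalization" item.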
Taught by
Graham Neubig
Related Courses
GGP Course Videos - Stanford University via YouTube
AlphaGo - Mastering the Game of Go with Deep Neural Networks and Tree Search - RL Paper Explained - Aleksa Gordić - The AI Epiphany via YouTube
How Slot Machines Are Advancing the State of the Art in Computer Go AI - Churchill CompSci Talks via YouTube
CMU Neural Nets for NLP 2017 - Advanced Search Algorithms - Graham Neubig via YouTube
From Tic Tac Toe to AlphaGo - Playing Games with AI and Machine Learning - Devoxx via YouTube