Neural Nets for NLP 2020 - Generating Trees Incrementally
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Two Common Types of Linguistic Structure
Semantic Parsing: Another Representative Tree Generation Task
Shift Reduce Example
Classification for Shift-reduce
Making Classification Decisions
What Features to Extract?
Why Tree Structure?
Recursive Neural Networks (Socher et al. 2011)
Why Linguistic Structure?
Clarification about Meaning Representations (MRs): machine-executable MRs (our focus today) are executable programs that accomplish a task; MRs for semantic annotation capture the semantics of natural language sentences
Core Research Question for Better Models: how can we add inductive biases to networks to better capture the structure of programs?
Summary: Supervised Learning of Semantic Parsers. Key question: how to design decoders that follow the structure of programs
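The shift-reduce sections of the syllabus cover building a tree incrementally from a sequence of parser actions. As a minimal sketch (a toy arc-standard-style transition system, not code from the lecture): SHIFT moves the next word onto a stack, and the two REDUCE actions pop the top two stack items and attach one to the other.

```python
# Toy shift-reduce parser: builds a tree as nested (head, dependent)
# tuples from a gold action sequence. Illustrative only; the action
# names and tree encoding are assumptions, not the lecture's exact setup.

def shift_reduce(words, actions):
    """Apply SHIFT / REDUCE-L / REDUCE-R actions to build a tree."""
    stack, buffer = [], list(words)
    for act in actions:
        if act == "SHIFT":
            # Move the next word from the buffer onto the stack.
            stack.append(buffer.pop(0))
        elif act == "REDUCE-L":
            # Combine the top two items; the right item becomes the head.
            dep, head = stack.pop(-2), stack.pop()
            stack.append((head, dep))
        elif act == "REDUCE-R":
            # Combine the top two items; the left item becomes the head.
            head, dep = stack.pop(-2), stack.pop()
            stack.append((head, dep))
    assert len(stack) == 1 and not buffer, "actions must consume the sentence"
    return stack[0]

tree = shift_reduce(
    ["I", "saw", "a", "girl"],
    ["SHIFT", "SHIFT", "REDUCE-L", "SHIFT", "SHIFT", "REDUCE-L", "REDUCE-R"],
)
print(tree)  # (('saw', 'I'), ('girl', 'a'))
```

In the lecture's setting, a classifier (over features of the stack and buffer) predicts each action instead of reading it from a gold sequence.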
Taught by
Graham Neubig
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam