Neural Nets for NLP 2020 - Neural Nets + Knowledge Bases
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Knowledge Bases: structured databases of knowledge, usually containing entities and the relations between them
WordNet (Miller 1995)
Decomposable Relation Model (Xie et al. 2017) • Idea: there are many relations, but each can be represented by a limited number of concepts • Method: treat each relation map as a mixture of concepts, with a sparse mixture vector
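The mixture idea above can be sketched numerically: each relation's transformation matrix is built as a sparse weighted sum of a small bank of shared "concept" matrices. This is a minimal NumPy illustration of that decomposition, not the paper's implementation; all names and the bilinear scoring choice are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_concepts, n_relations = 8, 4, 10

# Shared concept matrices and a sparse mixture vector per relation
# (illustrative parameters; in the paper these are learned).
concepts = rng.normal(size=(n_concepts, dim, dim))
mixture = rng.random(size=(n_relations, n_concepts))
mixture[mixture < 0.6] = 0.0                         # encourage sparsity
mixture /= mixture.sum(axis=1, keepdims=True) + 1e-8

def relation_matrix(r):
    """Relation map as a sparse mixture of shared concept matrices."""
    return np.einsum("k,kij->ij", mixture[r], concepts)

def score(head, tail, r):
    """Bilinear plausibility score for a (head, relation, tail) triple."""
    return head @ relation_matrix(r) @ tail

h, t = rng.normal(size=dim), rng.normal(size=dim)
print(score(h, t, 3))
```

Because the concept matrices are shared across all relations, rare relations can borrow statistical strength from frequent ones through the mixture weights.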
Multi-hop Relational Context w/ Graph Neural Networks (Schlichtkrull et al. 2017)
Knowledge Base Incompleteness
Relation Extraction w/ Neural Tensor Networks (Socher et al. 2013)
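The Neural Tensor Network scores a triple with a bilinear tensor term plus a standard feed-forward term: score(e1, R, e2) = u_R^T tanh(e1^T W_R^[1:k] e2 + V_R [e1; e2] + b_R). A minimal sketch of that scoring function, with randomly initialized parameters for a single relation (parameter names follow the formula, not any released code):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 6, 3   # embedding dim, number of tensor slices

# Per-relation parameters (one relation shown; one set is learned per relation)
W = rng.normal(size=(k, d, d))      # bilinear tensor slices
V = rng.normal(size=(k, 2 * d))     # standard feed-forward weights
b = rng.normal(size=k)              # bias
u = rng.normal(size=k)              # output projection

def ntn_score(e1, e2):
    """u^T tanh(e1^T W^[1:k] e2 + V [e1; e2] + b)."""
    bilinear = np.einsum("i,kij,j->k", e1, W, e2)   # one value per slice
    linear = V @ np.concatenate([e1, e2])
    return u @ np.tanh(bilinear + linear + b)

e1, e2 = rng.normal(size=d), rng.normal(size=d)
print(ntn_score(e1, e2))
```

The tensor slices let each relation relate the two entity vectors multiplicatively, which a plain feed-forward layer over the concatenation cannot express.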
Distant Supervision for Relation Extraction (Mintz et al. 2009)
Relation Classification w/ CNNs (Zeng et al. 2014)
Jointly Modeling KB Relations and Text (Toutanova et al. 2015) • To model textual links between words with a neural net, aggregate over multiple instances of links in a dependency tree
Modeling Distant Supervision Noise in Neural Models (Luo et al. 2017) • Idea: there is noise in distant supervision labels, so we want to model it
Retrofitting of Embeddings to Existing Lexicons (Faruqui et al. 2015)
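Retrofitting post-processes pre-trained embeddings so that words linked in a lexicon (e.g. WordNet synonyms) move closer together while each vector stays near its original. A toy sketch of the iterative update, assuming uniform weights alpha and beta (the word list and edges are made up for illustration):

```python
import numpy as np

# Toy pre-trained embeddings and a lexicon graph of synonym edges.
q_hat = {"happy": np.array([1.0, 0.0]),
         "glad":  np.array([0.0, 1.0]),
         "sad":   np.array([-1.0, 0.0])}
edges = {"happy": ["glad"], "glad": ["happy"], "sad": []}

def retrofit(q_hat, edges, alpha=1.0, beta=1.0, iters=10):
    """Pull each vector toward its lexicon neighbors while staying
    close to its original distributional embedding."""
    q = {w: v.copy() for w, v in q_hat.items()}
    for _ in range(iters):
        for w, nbrs in edges.items():
            if not nbrs:
                continue   # no lexicon neighbors: vector stays put
            neighbor_sum = sum(q[n] for n in nbrs)
            q[w] = (alpha * q_hat[w] + beta * neighbor_sum) \
                   / (alpha + beta * len(nbrs))
    return q

q = retrofit(q_hat, edges)
print(q["happy"])   # pulled toward "glad", still near its original
```

Each update is the closed-form minimizer for one word given its neighbors, so a few sweeps over the vocabulary converge quickly.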
Reasoning over Text Corpus as a Knowledge Base (Dhingra et al. 2020)
Taught by
Graham Neubig
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam