Neural Nets for NLP 2021 - Neural Nets + Knowledge Bases

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses
Natural Language Processing (NLP) Courses

Course Description

Overview

Explore neural networks and knowledge bases in this lecture from CMU's Neural Networks for NLP course. Dive into methods for learning knowledge bases from neural embeddings and for incorporating knowledge bases into neural models. Discover techniques for probing language models for the knowledge they contain. Learn about WordNet, knowledge graph embeddings, relation extraction, distant supervision, retrofitting of embeddings to lexicons, and open information extraction. Examine the differences between modeling word embeddings and modeling relations. Investigate P-tuning for direct optimization of embeddings and compare nonparametric and parametric models in this 44-minute session led by Graham Neubig and Zhengbao Jiang.
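
To make one of these topics concrete: the knowledge graph embeddings of Bordes et al. (2013) are the TransE model, which represents a triple (head, relation, tail) so that head + relation lands close to tail in vector space. Below is a minimal sketch of one training step, assuming a toy three-entity graph; the entity names, dimensionality, margin, and learning rate are illustrative choices rather than values from the lecture, and real implementations also re-normalize entity vectors and sample many negatives.

# Minimal TransE sketch (Bordes et al. 2013): one margin-loss SGD step.
import numpy as np

rng = np.random.default_rng(0)
dim, lr, margin = 50, 0.01, 1.0

entities = {"Pittsburgh": 0, "Pennsylvania": 1, "Philadelphia": 2}
relations = {"located_in": 0}
E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings

def score(h, r, t):
    # Lower distance ||h + r - t|| means a more plausible triple.
    return np.linalg.norm(E[h] + R[r] - E[t])

# A true triple and a corrupted (negative) triple with a swapped tail.
h, r, t = entities["Pittsburgh"], relations["located_in"], entities["Pennsylvania"]
t_neg = entities["Philadelphia"]

# Margin ranking loss: the true triple's distance should be smaller
# than the corrupted triple's distance by at least `margin`.
loss = max(0.0, margin + score(h, r, t) - score(h, r, t_neg))
if loss > 0:
    g_pos = (E[h] + R[r] - E[t]) / (score(h, r, t) + 1e-9)
    g_neg = (E[h] + R[r] - E[t_neg]) / (score(h, r, t_neg) + 1e-9)
    E[h] -= lr * (g_pos - g_neg)
    R[r] -= lr * (g_pos - g_neg)
    E[t] += lr * g_pos       # pull the true tail closer
    E[t_neg] -= lr * g_neg   # push the corrupted tail away

Repeated over many sampled triples, this drives true triples to lower distances than corrupted ones, which is what makes link prediction with the learned embeddings possible.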
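
Retrofitting (Faruqui et al. 2015) post-processes pre-trained word vectors so that words linked in a lexicon such as WordNet move closer together while each stays anchored to its original vector. A minimal sketch follows, using the paper's iterative update with all weights set to 1; the three-word lexicon and 3-dimensional vectors are made-up toy data.

# Minimal retrofitting sketch (Faruqui et al. 2015).
import numpy as np

vecs = {  # pre-trained embeddings (toy 3-d values)
    "happy":  np.array([1.0, 0.0, 0.0]),
    "glad":   np.array([0.0, 1.0, 0.0]),
    "joyful": np.array([0.0, 0.0, 1.0]),
}
lexicon = {  # synonym edges, e.g. from WordNet
    "happy": ["glad", "joyful"],
    "glad": ["happy"],
    "joyful": ["happy"],
}

retro = {w: v.copy() for w, v in vecs.items()}
for _ in range(10):  # a few iterations suffice in practice
    for w, neighbors in lexicon.items():
        # New vector = average of the original vector and the current
        # vectors of the word's lexicon neighbors (alpha = beta = 1).
        retro[w] = (vecs[w] + sum(retro[n] for n in neighbors)) / (1 + len(neighbors))

for w, v in retro.items():
    print(w, np.round(v, 3))

After a few iterations "glad" and "joyful" drift toward "happy" while all three remain close to their original positions.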
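
Finally, P-tuning (Liu et al. 2021), the syllabus item on directly optimizing embeddings, keeps the pretrained language model frozen and learns only a handful of continuous prompt vectors by gradient descent. The stand-in "model" below (an embedding table, mean pooling, and a linear head) and all sizes and token ids are assumptions purely for illustration; the point is that only the prompt parameters receive updates.

# Minimal P-tuning sketch (Liu et al. 2021): optimize continuous
# prompt embeddings against a frozen model.
import torch

torch.manual_seed(0)
vocab, dim, n_prompt = 100, 16, 4

# Stand-in for a frozen pretrained LM: embedding table + output head.
emb = torch.nn.Embedding(vocab, dim)
head = torch.nn.Linear(dim, vocab)
for p in list(emb.parameters()) + list(head.parameters()):
    p.requires_grad = False  # the model itself is never updated

# The only trainable parameters: continuous prompt embeddings.
prompt = torch.nn.Parameter(torch.randn(n_prompt, dim) * 0.1)
opt = torch.optim.Adam([prompt], lr=0.1)

tokens = torch.tensor([5, 17, 42])  # toy input token ids
target = torch.tensor(7)            # toy "answer" token id

for step in range(100):
    # Prepend the learned prompt vectors to the input embeddings,
    # mean-pool, and predict the answer token with the frozen head.
    x = torch.cat([prompt, emb(tokens)], dim=0).mean(dim=0)
    loss = torch.nn.functional.cross_entropy(
        head(x).unsqueeze(0), target.unsqueeze(0))
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final loss after P-tuning steps: {loss.item():.4f}")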

Syllabus

Intro
Knowledge Bases
WordNet (Miller 1995)
Learning Knowledge Graph Embeddings (Bordes et al. 2013)
Remember: Consistency in Embeddings
Relation Extraction w/ Neural Tensor Networks (Socher et al. 2013)
Distant Supervision for Relation Extraction (Mintz et al. 2009)
Jointly Modeling KB Relations and Text (Toutanova et al. 2015)
Modeling Distant Supervision Noise in Neural Models (Luo et al. 2017)
Retrofitting of Embeddings to Existing Lexicons (Faruqui et al. 2015)
Open Information Extraction (Banko et al. 2007)
Neural Models for Open IE
Modeling Word Embeddings vs. Modeling Relations
P-tuning: Directly Optimize Embeddings (Liu et al. 2021)
Nonparametric Models Outperform Parametric Models


Taught by

Graham Neubig

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX