Neural Nets for NLP - Class Introduction & Why Neural Nets?
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore the fundamentals of neural networks for natural language processing in this introductory lecture from CMU's Neural Networks for NLP course. Delve into example tasks and their challenges, discover how neural networks can address these issues, and gain insights into the basic concepts of neural network architectures for NLP prediction tasks. Learn about forward propagation, computation graphs, model parameters, and training processes using frameworks like DyNet. Examine the Continuous Bag of Words (CBOW) model and understand what vector representations signify in the context of NLP. Acquire essential knowledge to kickstart your journey into applying neural networks to natural language processing tasks.
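The Continuous Bag of Words (CBOW) model mentioned above can be sketched in a few lines: each word gets a vector, the vectors of the words in a sentence are summed (ignoring order, hence "bag of words"), and a linear layer scores each class. This is a minimal plain-Python sketch of the idea, not the DyNet code used in the course; all names here (`EMB_DIM`, `embed`, `score`) are illustrative.

```python
import random

random.seed(0)
EMB_DIM = 4          # size of each word vector (illustrative)
NUM_CLASSES = 2      # e.g. positive vs. negative sentiment

vocab = ["the", "movie", "was", "great", "terrible"]

# One vector per word, initialized randomly (learned during training in practice).
embed = {w: [random.uniform(-1, 1) for _ in range(EMB_DIM)] for w in vocab}

# One weight row per class plus a bias (also learned in practice).
W = [[random.uniform(-1, 1) for _ in range(EMB_DIM)] for _ in range(NUM_CLASSES)]
b = [0.0] * NUM_CLASSES

def cbow_features(words):
    """Sum the word vectors; word order is ignored, hence 'bag of words'."""
    v = [0.0] * EMB_DIM
    for w in words:
        for i, x in enumerate(embed[w]):
            v[i] += x
    return v

def score(words):
    """Linear scores for each class from the summed sentence vector."""
    v = cbow_features(words)
    return [sum(wi * vi for wi, vi in zip(row, v)) + bi
            for row, bi in zip(W, b)]

scores = score(["the", "movie", "was", "great"])
predicted = max(range(NUM_CLASSES), key=lambda c: scores[c])
```

Because the representation is a sum, any permutation of the same words yields identical features, which is exactly the limitation that motivates the richer architectures covered later in the course.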
Syllabus
Intro
Are These Sentences OK?
Engineering Solutions
Phenomena to Handle
An Example Prediction Problem: Sentence Classification
A First Try: Bag of Words (BOW)
Build It, Break It
Combination Features
Basic Idea of Neural Networks (for NLP Prediction Tasks)
An edge represents a function argument (and also a data dependency); edges are just pointers to nodes

Algorithms (1)
Forward Propagation
Algorithms (2)
Basic Process in Dynamic Neural Network Frameworks
Computation Graph and Expressions
Model and Parameters
Parameter Initialization
Trainers and Backprop
Training with DyNet
Continuous Bag of Words (CBOW) model
What do Our Vectors Represent?
Things to Remember
Class Format
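The syllabus items on computation graphs and forward propagation can be sketched concretely: a graph node stores a function and pointers to its argument nodes (the edges, which double as data dependencies), and forward propagation evaluates arguments before the node itself. This is an illustrative toy, not DyNet's actual implementation; the `Node` class and `forward` function are assumed names.

```python
class Node:
    def __init__(self, fn=None, args=(), value=None):
        self.fn = fn          # function to apply (None for input nodes)
        self.args = args      # edges: pointers to argument nodes
        self.value = value    # filled in during forward propagation

def forward(node):
    """Forward propagation: evaluate argument nodes first, then this node."""
    if node.value is None:
        vals = [forward(a) for a in node.args]
        node.value = node.fn(*vals)
    return node.value

# Build the graph for f = (x + y) * y with x = 2, y = 3
x = Node(value=2.0)
y = Node(value=3.0)
s = Node(fn=lambda a, b: a + b, args=(x, y))
f = Node(fn=lambda a, b: a * b, args=(s, y))

result = forward(f)   # (2 + 3) * 3 = 15.0
```

Dynamic frameworks like DyNet rebuild such a graph for every training example, which is why the basic training loop in the lecture re-creates the computation graph on each iteration.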
Taught by
Graham Neubig
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) - Universidad de Alicante via Míriadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam