Neural Nets for NLP - Class Introduction & Why Neural Nets?
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore the fundamentals of neural networks for natural language processing in this introductory lecture from CMU's Neural Networks for NLP course. Delve into example tasks and their challenges, discover how neural networks can address these issues, and gain insights into the basic concepts of neural network architectures for NLP prediction tasks. Learn about forward propagation, computation graphs, model parameters, and training processes using frameworks like DyNet. Examine the Continuous Bag of Words (CBOW) model and understand what vector representations signify in the context of NLP. Acquire essential knowledge to kickstart your journey into applying neural networks to natural language processing tasks.
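The bag-of-words classifier covered early in the lecture can be sketched in a few lines: each word contributes an independent score, and the sentence score is the sum of those scores plus a bias. The vocabulary and weight values below are illustrative assumptions, not taken from the course materials.

```python
# Minimal bag-of-words (BOW) scoring sketch.
# Each word has a scalar weight; unknown words contribute 0.

def bow_score(sentence, weights, bias=0.0):
    """Sum per-word weights over a tokenized sentence."""
    return bias + sum(weights.get(word, 0.0) for word in sentence)

# Toy sentiment weights (hypothetical values for illustration).
weights = {"love": 1.0, "great": 1.0, "hate": -1.0, "boring": -1.0}

score = bow_score("i love this great movie".split(), weights)
label = "positive" if score > 0 else "negative"  # score = 2.0 here
```

A model this simple cannot capture combination features such as "not great", which is part of the motivation the lecture gives for moving to neural networks.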
Syllabus
Intro
Are These Sentences OK?
Engineering Solutions
Phenomena to Handle
An Example Prediction Problem: Sentence Classification
A First Try: Bag of Words (BOW)
Build It, Break It
Combination Features
Basic Idea of Neural Networks (for NLP Prediction Tasks)
An edge represents a function argument (and also a data dependency); edges are just pointers to nodes
Algorithms (1)
Forward Propagation
Algorithms (2)
Basic Process in Dynamic Neural Network Frameworks
Computation Graph and Expressions
Model and Parameters
Parameter Initialization
Trainers and Backprop
Training with DyNet
Continuous Bag of Words (CBOW) Model
What do Our Vectors Represent?
Things to Remember
Class Format
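The per-example loop used by dynamic frameworks such as DyNet (build a computation graph, run forward propagation, backpropagate, update parameters) can be sketched without any framework, using a tiny CBOW-style model with hand-derived gradients. All dimensions, vocabulary entries, and parameter values below are illustrative assumptions.

```python
# Pure-Python sketch of one training step for a CBOW-style scorer:
# sum word embeddings, take a dot product with an output weight vector,
# then apply one step of SGD on a logistic loss.

import math

DIM = 3
# Model parameters: one embedding per word, plus output weights.
emb = {"good": [0.1, 0.2, 0.0], "bad": [-0.1, 0.0, 0.2], "film": [0.0, 0.1, 0.1]}
w = [0.5, -0.3, 0.2]

def forward(words):
    """Forward propagation: sum embeddings, then a dot-product score."""
    h = [sum(emb[t][i] for t in words) for i in range(DIM)]
    score = sum(w[i] * h[i] for i in range(DIM))
    return h, score

def sgd_step(words, label, lr=0.1):
    """One SGD step on logistic loss; gradients derived by hand."""
    h, score = forward(words)
    prob = 1.0 / (1.0 + math.exp(-score))
    grad = prob - label                      # d(loss)/d(score)
    gw = [grad * h[i] for i in range(DIM)]   # gradient w.r.t. w
    gh = [grad * w[i] for i in range(DIM)]   # gradient w.r.t. h
    for i in range(DIM):
        w[i] -= lr * gw[i]
    for t in words:                          # each word's embedding
        for i in range(DIM):                 # receives the same gh
            emb[t][i] -= lr * gh[i]
```

After calling `sgd_step(["good", "film"], 1.0)`, the score of that sentence increases, mirroring the training process the lecture demonstrates with DyNet.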
Taught by
Graham Neubig
Related Courses
Neural Networks for Machine Learning (University of Toronto via Coursera)
Good Brain, Bad Brain: Basics (University of Birmingham via FutureLearn)
Statistical Learning with R (Stanford University via edX)
Machine Learning 1—Supervised Learning (Brown University via Udacity)
Fundamentals of Neuroscience, Part 2: Neurons and Networks (Harvard University via edX)