YoVDO

Deep Learning for Natural Language Processing

Offered By: Alfredo Canziani via YouTube

Tags

Transformers, Deep Learning, Unsupervised Learning, BERT, Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Sequence to Sequence Models, Word2Vec

Course Description

Overview

Explore the foundations and advanced concepts of deep learning for Natural Language Processing (NLP) in this comprehensive lecture. Delve into various architectures used in NLP applications, including CNNs, RNNs, and the state-of-the-art transformer model. Understand the modules that make transformers advantageous for NLP tasks and learn effective training techniques. Discover beam search as a middle ground between greedy decoding and exhaustive search, and explore "top-k" sampling for text generation. Examine sequence-to-sequence models, back-translation, and unsupervised learning approaches for embedding, including word2vec, GPT, and BERT. Gain insights into pre-training techniques for NLP and future directions in the field.
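The "top-k" sampling mentioned above restricts the model's next-token distribution to its k highest-scoring candidates before sampling, trading some diversity for coherence. A minimal sketch in plain Python, using a toy list of logits as a stand-in for a real language model's output layer:

```python
import math
import random

def top_k_sample(logits, k, temperature=1.0, rng=random):
    """Sample a token index from the k highest-scoring logits.

    `logits` is a plain list of unnormalised scores, one per vocabulary
    token -- a stand-in for a real model's output layer.
    """
    # Keep only the indices of the k largest logits.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over the surviving logits (with temperature).
    scaled = [logits[i] / temperature for i in top]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one surviving index in proportion to its probability.
    return rng.choices(top, weights=probs, k=1)[0]
```

With `k=1` this reduces to greedy decoding; with `k` equal to the vocabulary size it is ordinary sampling from the full softmax.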

Syllabus

– Week 12 – Lecture
– Introduction to deep learning in NLP and language models
– Transformer language model structure and intuition
– Tricks and facts about Transformer language models, and decoding language models
– Beam Search, Sampling and Text Generation
– Back-translation, word2vec, and BERT
– Pre-training for NLP and Next Steps
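The beam search covered in the lecture keeps only a fixed number of highest-scoring partial sequences at each decoding step, sitting between greedy decoding (beam width 1) and exhaustive search. A minimal sketch, with a hypothetical `step_fn` standing in for a real language model's next-token scorer:

```python
import math

def beam_search(step_fn, start, beam_width, max_len):
    """Generic beam search over a scored sequence space.

    `step_fn(prefix)` returns a list of (token, log_prob) continuations;
    an empty list marks a finished sequence. This is a sketch -- a real
    decoder would query a neural language model here.
    """
    beams = [(start, 0.0)]  # (sequence, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            expansions = step_fn(seq)
            if not expansions:          # finished sequence: keep as-is
                candidates.append((seq, score))
                continue
            for tok, logp in expansions:
                candidates.append((seq + [tok], score + logp))
        # Keep only the `beam_width` highest-scoring hypotheses.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams[0]  # best (sequence, score) pair
```

Because scores are summed log-probabilities, the top hypothesis maximises the joint probability over the retained beams without enumerating every possible sequence.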


Taught by

Alfredo Canziani

Related Courses

Natural Language Processing with Attention Models
DeepLearning.AI via Coursera
Large Language Models: Foundation Models from the Ground Up
Databricks via edX
Artificial Intelligence in Social Media Analytics
Johns Hopkins University via Coursera
Chatbots
Johns Hopkins University via Coursera
Embedding Models: From Architecture to Implementation
DeepLearning.AI via Coursera