Introduction to Attention-Based Neural Networks
Offered By: LinkedIn Learning
Course Description
Overview
Learn what attention-based models are, how they work, and what they can do for recurrent neural networks.
Syllabus
Introduction
- Prerequisites
- What are attention-based models?
- Attention in language generation and translation models
- Feed forward networks and their limitations
- Recurrent neural networks for sequential data
- The need for long memory cells
- LSTM and GRU cells
- Types of RNNs
- Language generation models
- Sequence to sequence models for language translation
- The role of attention in sequence to sequence models
- Attention mechanism in sequence to sequence models
- Alignment weights in attention models
- Bahdanau attention
- Attention models for image captioning
- Encoder decoder structure for image captioning
- Setting up Colab and Google Drive
- Loading in the Flickr8k dataset
- Constructing the vocabulary
- Setting up the dataset class
- Implementing utility functions for training data
- Building the encoder CNN
- Building the decoder RNN
- Setting up the sequence to sequence model
- Training the image captioning model
- Loading the dataset and setting up utility functions
- The encoder CNN generating unrolled feature maps (see the first sketch after this syllabus)
- Implementing Bahdanau attention (see the second sketch after this syllabus)
- The decoder RNN using attention
- Generating captions using attention
- Training the attention-based image captioning model
- Visualizing the model's attention
- Summary and next steps
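For orientation on the "encoder CNN generating unrolled feature maps" step, here is a minimal sketch of how a pretrained CNN's final spatial grid can be flattened into a sequence of positions an attention mechanism can attend over. It assumes PyTorch and torchvision; the backbone choice and variable names are illustrative, not the course's actual code.

```python
import torch
import torchvision.models as models

# Hypothetical sketch: strip the classifier head off a pretrained ResNet
# so the network outputs a spatial grid of feature vectors instead of logits.
resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone = torch.nn.Sequential(*list(resnet.children())[:-2])  # drop avgpool + fc

images = torch.randn(4, 3, 224, 224)         # dummy batch of 4 RGB images
fmap = backbone(images)                      # (4, 2048, 7, 7) spatial feature map
# "Unroll" the 7x7 grid into 49 positions the decoder can attend over.
unrolled = fmap.flatten(2).permute(0, 2, 1)  # (4, 49, 2048)
```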
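For the "Implementing Bahdanau attention" lessons, the following is a minimal sketch of additive (Bahdanau) attention and its alignment weights, again in PyTorch with illustrative module and variable names rather than the course's own code.

```python
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    """Additive (Bahdanau) attention: scores each encoder position against
    the current decoder hidden state, then softmax-normalizes the scores
    into alignment weights used to build a context vector."""
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim)  # projects encoder features
        self.W_dec = nn.Linear(dec_dim, attn_dim)  # projects decoder state
        self.v = nn.Linear(attn_dim, 1)            # scores the combined energy

    def forward(self, enc_feats, dec_hidden):
        # enc_feats: (batch, num_positions, enc_dim), e.g. unrolled feature maps
        # dec_hidden: (batch, dec_dim), the decoder RNN's current hidden state
        energy = torch.tanh(self.W_enc(enc_feats) + self.W_dec(dec_hidden).unsqueeze(1))
        scores = self.v(energy).squeeze(-1)        # (batch, num_positions)
        alpha = torch.softmax(scores, dim=1)       # alignment weights, sum to 1
        context = (alpha.unsqueeze(-1) * enc_feats).sum(dim=1)  # weighted sum
        return context, alpha
```

The `alpha` tensor returned here is the kind of per-position weight map that the "Visualizing the model's attention" lesson presumably overlays on the input image.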
Taught by
Janani Ravi
Related Courses
- Reinforcement Learning for Trading Strategies (New York Institute of Finance via Coursera)
- Natural Language Processing with Sequence Models (DeepLearning.AI via Coursera)
- Fake News Detection with Machine Learning (Coursera Project Network via Coursera)
- English/French Translator: Long Short Term Memory Networks (Coursera Project Network via Coursera)
- Text Classification Using Word2Vec and LSTM on Keras (Coursera Project Network via Coursera)