Introduction to Attention-Based Neural Networks
Offered By: LinkedIn Learning
Course Description
Overview
Learn what attention-based models are, how they work, and what they can do for recurrent neural networks.
Syllabus
Introduction
- Prerequisites
- What are attention-based models?
- Attention in language generation and translation models
- Feed forward networks and their limitations
- Recurrent neural networks for sequential data
- The need for long memory cells
- LSTM and GRU cells
- Types of RNNs
- Language generation models
- Sequence to sequence models for language translation
- The role of attention in sequence to sequence models
- Attention mechanism in sequence to sequence models
- Alignment weights in attention models
- Bahdanau attention
- Attention models for image captioning
- Encoder decoder structure for image captioning
- Setting up Colab and Google Drive
- Loading in the Flickr8k dataset
- Constructing the vocabulary
- Setting up the dataset class
- Implementing utility functions for training data
- Building the encoder CNN
- Building the decoder RNN
- Setting up the sequence to sequence model
- Training the image captioning model
- Loading the dataset and setting up utility functions
- The encoder CNN generating unrolled feature maps
- Implementing Bahdanau attention
- The decoder RNN using attention
- Generating captions using attention
- Training the attention-based image captioning model
- Visualizing the model's attention
- Summary and next steps
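The syllabus items on alignment weights and Bahdanau attention can be summarized in a few lines of code. The sketch below is a minimal NumPy illustration (not taken from the course, which builds a full trained model): it scores each encoder output h_i against the decoder state s with the additive form v·tanh(W1 s + W2 h_i), turns the scores into alignment weights with a softmax, and forms the context vector as the weighted sum. All dimensions and the random parameters are hypothetical stand-ins for learned weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical, for illustration only)
hidden = 4   # decoder state size
enc_dim = 4  # encoder output size
steps = 5    # number of encoder time steps

# Random matrices standing in for learned attention parameters
W1 = rng.normal(size=(hidden, hidden))   # projects the decoder state s
W2 = rng.normal(size=(hidden, enc_dim))  # projects each encoder output h_i
v = rng.normal(size=(hidden,))           # scoring vector

s = rng.normal(size=(hidden,))           # current decoder hidden state
H = rng.normal(size=(steps, enc_dim))    # encoder outputs h_1 .. h_T

# Additive (Bahdanau) scores: v . tanh(W1 s + W2 h_i), one per time step
scores = np.tanh(W1 @ s + H @ W2.T) @ v  # shape (steps,)

# Alignment weights: softmax over the scores (they sum to 1)
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()

# Context vector: attention-weighted sum of encoder outputs
context = alpha @ H  # shape (enc_dim,)
```

In the seq2seq models covered above, this context vector is concatenated with the decoder input at every step, so the decoder can focus on different encoder positions (or image regions, in the captioning project) as it generates each output token.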
Taught by
Janani Ravi
Related Courses
- Building Language Models on AWS (Japanese), Amazon Web Services via AWS Skill Builder
- Building Language Models on AWS (Korean), Amazon Web Services via AWS Skill Builder
- Building Language Models on AWS (Simplified Chinese), Amazon Web Services via AWS Skill Builder
- Building Language Models on AWS (Traditional Chinese), Amazon Web Services via AWS Skill Builder
- Introduction to ChatGPT, edX