
Recurrent Neural Networks Explained Easily

Offered By: Valerio Velardo - The Sound of AI via YouTube

Tags

Neural Network Architecture Courses, Time Series Analysis Courses

Course Description

Overview

Learn about the inner workings of Recurrent Neural Networks (RNNs) in this comprehensive 29-minute video tutorial. Explore simple RNN units, time series analysis, and the intricacies of Backpropagation Through Time (BPTT). Dive into topics such as univariate and multivariate time series, RNN architecture, unrolling recurrent layers, and various RNN configurations, including sequence-to-vector and sequence-to-sequence. Understand the role of memory cells in simple RNNs, the significance of the tanh activation function, and the mathematical foundations of BPTT. Gain insights into the challenges associated with simple RNNs and prepare for advanced concepts in neural network design.
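As a rough, self-contained illustration of the simple RNN memory cell and the unrolling idea covered in the video, the sketch below steps a hidden state through a short sequence with a tanh activation. The code, variable names, and shapes are illustrative assumptions, not taken from the tutorial itself.

```python
# Minimal sketch (not the tutorial's code) of a simple RNN memory cell:
# at each time step the hidden state is updated as
#   h_t = tanh(W_x x_t + W_h h_{t-1} + b)
import numpy as np

def simple_rnn_forward(x_seq, W_x, W_h, b):
    """Unroll a simple recurrent layer over a sequence.

    x_seq : (timesteps, input_dim) univariate or multivariate time series
    W_x   : (input_dim, hidden_dim) input-to-hidden weights
    W_h   : (hidden_dim, hidden_dim) hidden-to-hidden (recurrent) weights
    b     : (hidden_dim,) bias

    Returns all hidden states (sequence-to-sequence output) and the
    last hidden state (sequence-to-vector output).
    """
    hidden_dim = W_h.shape[0]
    h = np.zeros(hidden_dim)                   # initial memory cell state
    states = []
    for x_t in x_seq:                          # "unrolling" the layer in time
        h = np.tanh(x_t @ W_x + h @ W_h + b)   # tanh keeps activations in (-1, 1)
        states.append(h)
    return np.stack(states), h                 # (timesteps, hidden_dim), (hidden_dim,)

# Example with assumed shapes: 10 time steps of a 3-feature (multivariate) series
rng = np.random.default_rng(0)
x_seq = rng.normal(size=(10, 3))
W_x = rng.normal(size=(3, 4)) * 0.1
W_h = rng.normal(size=(4, 4)) * 0.1
b = np.zeros(4)
all_states, last_state = simple_rnn_forward(x_seq, W_x, W_h, b)
print(all_states.shape, last_state.shape)      # (10, 4) (4,)
```

In a sequence-to-vector configuration only `last_state` would be passed on; in a sequence-to-sequence configuration the full `all_states` array is used. BPTT, covered later in the video, backpropagates gradients through this same unrolled loop.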

Syllabus

Intro
Univariate time series
Multivariate time series
Intuition
RNN architecture
Unrolling a recurrent layer
Data shape
Sequence to vector RNN
Sequence to sequence RNN
Memory cell for simple RNN
Why do we use tanh?
Backpropagation through time (BPTT)
The math behind BPTT
Issues with simple RNNs
What's up next?


Taught by

Valerio Velardo - The Sound of AI

Related Courses

Policy Analysis Using Interrupted Time Series
The University of British Columbia via edX
Quantitative Finance
Indian Institute of Technology Kanpur via Swayam
Macroeconometric Forecasting
International Monetary Fund via edX
Explaining Your Data Using Tableau
University of California, Davis via Coursera
Time Series Forecasting
Udacity