NYU Deep Learning

Offered By: YouTube

Tags

Deep Learning Courses, Unsupervised Learning Courses, Neural Networks Courses, PyTorch Courses, Self-supervised Learning Courses, Autoencoders Courses

Course Description

Overview

Dive into the world of deep learning with this comprehensive NYU course. Explore the history and foundations of neural networks, including gradient descent and backpropagation. Master essential concepts like convolutional and recurrent neural networks, and gain hands-on experience with PyTorch implementations. Delve into advanced topics such as energy-based models, self-supervised learning, and variational inference. Discover the applications of deep learning in computer vision, speech recognition, and natural language processing. Learn about graph convolutional networks, transformers, and attention mechanisms. Tackle optimization techniques for deep learning and explore planning and control under uncertainty. Conclude with insights into Lagrangian backpropagation and participate in a Q&A session. This course offers a thorough understanding of deep learning principles and their practical applications in various domains.
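As a rough illustration of the "gradient descent and backpropagation" and "hands-on PyTorch" topics mentioned above, the sketch below shows a single training step on a tiny fully connected network. It is not taken from the course materials; the layer sizes, random data, and learning rate are illustrative assumptions only.

    import torch
    import torch.nn as nn

    # Tiny illustrative network: 4 inputs -> 8 hidden units -> 1 output
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
    criterion = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    x = torch.randn(16, 4)          # random mini-batch of inputs (placeholder data)
    y = torch.randn(16, 1)          # matching random targets (placeholder data)

    optimizer.zero_grad()           # clear gradients from any previous step
    loss = criterion(model(x), y)   # forward pass and loss
    loss.backward()                 # backpropagation: compute gradients
    optimizer.step()                # gradient-descent parameter update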

Syllabus

01 – History and resources.
01L – Gradient descent and the backpropagation algorithm.
02 – Neural nets: rotation and squashing.
02L – Modules and architectures.
03 – Tools, classification with neural nets, PyTorch implementation.
03L – Parameter sharing: recurrent and convolutional nets.
04L – ConvNet in practice.
04.1 – Natural signals properties and the convolution.
04.2 – Recurrent neural networks, vanilla and gated (LSTM).
05L – Joint embedding method and latent variable energy based models (LV-EBMs).
05.1 – Latent Variable Energy Based Models (LV-EBMs), inference.
05.2 – But what are these EBMs used for?
06L – Latent variable EBMs for structured prediction.
06 – Latent Variable Energy Based Models (LV-EBMs), training.
07L – PCA, AE, K-means, Gaussian mixture model, sparse coding, and intuitive VAE.
07 – Unsupervised learning: autoencoding the targets.
08L – Self-supervised learning and variational inference.
08 – From LV-EBM to target prop to (vanilla, denoising, contractive, variational) autoencoder.
09L – Differentiable associative memories, attention, and transformers.
09 – AE, DAE, and VAE with PyTorch; generative adversarial networks (GAN) and code.
10L – Self-supervised learning in computer vision.
10 – Self / cross, hard / soft attention and the Transformer.
11L – Speech recognition and Graph Transformer Networks.
11 – Graph Convolutional Networks (GCNs).
12L – Low resource machine translation.
12 – Planning and control.
13L – Optimisation for Deep Learning.
13 – The Truck Backer-Upper.
14L – Lagrangian backpropagation, final project winners, and Q&A session.
14 – Prediction and Planning Under Uncertainty.
AI2S Xmas Seminar - Dr. Alfredo Canziani (NYU) - Energy-Based Self-Supervised Learning.


Taught by

Alfredo Canziani

Related Courses

Продвинутые методы машинного обучения (Advanced Machine Learning Methods)
Higher School of Economics via Coursera
Advanced Machine Learning and Signal Processing
IBM via Coursera
Applied Data Science for Data Analysts
Databricks via Coursera
Aprendizaje Automático con Python (Machine Learning with Python)
IBM via Coursera
Aprendizaje de máquinas (Machine Learning)
Universidad Nacional Autónoma de México via Coursera