Probabilistic Deep Learning with TensorFlow 2

Offered By: Imperial College London via Coursera

Tags

TensorFlow Courses Deep Learning Courses Probability Distributions Courses Uncertainty Quantification Courses Generative Modeling Courses Variational Autoencoders Courses

Course Description

Overview

Welcome to this course on Probabilistic Deep Learning with TensorFlow! This course builds on the foundational concepts and skills for TensorFlow taught in the first two courses in this specialisation, and focuses on the probabilistic approach to deep learning. This is an increasingly important area of deep learning that aims to quantify the noise and uncertainty that is often present in real-world datasets. This is a crucial aspect when using deep learning models in applications such as autonomous vehicles or medical diagnoses: we need the model to know what it doesn't know.

You will learn how to develop probabilistic models with TensorFlow, making particular use of the TensorFlow Probability library, which is designed to make it easy to combine probabilistic models with deep learning. As such, this course can also be viewed as an introduction to the TensorFlow Probability library. You will learn how probability distributions can be represented and incorporated into deep learning models in TensorFlow, including Bayesian neural networks, normalising flows and variational autoencoders. You will learn how to develop models for uncertainty quantification, as well as generative models that can create new samples similar to those in the dataset, such as images of celebrity faces.

You will put the concepts you learn about into practice straight away in practical, hands-on coding tutorials, which you will be guided through by a graduate teaching assistant. In addition, there is a series of automatically graded programming assignments for you to consolidate your skills. At the end of the course, you will bring many of the concepts together in a Capstone Project, where you will develop a variational autoencoder algorithm to produce a generative model of a synthetic image dataset that you will create yourself.

This course follows on from the previous two courses in the specialisation, Getting Started with TensorFlow 2 and Customising Your Models with TensorFlow 2. The additional prerequisite for success in this course is a solid foundation in probability and statistics. In particular, it is assumed that you are familiar with standard probability distributions, probability density functions, and concepts such as maximum likelihood estimation, the change of variables formula for random variables, and the evidence lower bound (ELBO) used in variational inference.

Syllabus

  • TensorFlow Distributions
    • Probabilistic modelling is a powerful and principled approach that provides a framework in which to take account of uncertainty in the data. The TensorFlow Probability (TFP) library provides tools for developing probabilistic models that extend the capability of TensorFlow. In this first week of the course, you will learn how to use Distribution objects in TFP, and the key methods for sampling from and computing probabilities under these distributions (see the first sketch after this syllabus). You will also learn how to make these distributions trainable. The programming assignment for this week will put these techniques into practice by implementing a Naive Bayes classifier on the Iris dataset.
  • Probabilistic layers and Bayesian neural networks
    • Accounting for sources of uncertainty is an important aspect of the modelling process, especially for safety-critical applications such as medical diagnoses. Most standard deep learning models do not quantify the uncertainty in their predictions. In this week you will learn how to use probabilistic layers from TensorFlow Probability to develop deep learning models that are able to provide measures of uncertainty in both the data and the model itself (see the second sketch after this syllabus). In the programming assignment for this week, you will develop a Bayesian CNN for the MNIST and MNIST-C datasets.
  • Bijectors and normalising flows
    • Normalising flows are a powerful class of generative models that aim to model the underlying data distribution by transforming a simple base distribution through a series of bijective transformations. In this week you will learn how to use bijector objects from the TensorFlow Probability library to implement these transformations, and to learn a complex transformed distribution from data (see the third sketch after this syllabus). These models can be used to generate new data samples, as well as to evaluate the likelihood of data examples. In the programming assignment for this week, you will develop a RealNVP normalising flow model for the LSUN bedroom dataset.
  • Variational autoencoders
    • Variational autoencoders are one of the most popular types of likelihood-based generative deep learning models. In the VAE algorithm, two networks are jointly trained: an encoder (or inference) network and a decoder (or generative) network. In this week you will learn how to implement a VAE using the TensorFlow Probability library (see the final sketch after this syllabus). You will then use the trained networks to encode data examples into a compressed latent space, as well as to generate new samples from the prior distribution and the decoder. In the programming assignment for this week, you will develop a variational autoencoder for an image dataset of celebrity faces.
  • Capstone Project
    • In this course you have learned how to develop probabilistic deep learning models using tools and concepts from the TensorFlow Probability library such as Distribution objects, probabilistic layers, bijectors, and KL divergence optimisation. The Capstone Project brings many of these concepts together with a task to create a synthetic image dataset using normalising flows, and train a variational autoencoder on the dataset.
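
The following is a minimal sketch of the week 1 material: creating a TFP Distribution object, sampling from it, evaluating log probabilities, and making its parameters trainable by maximum likelihood. The toy data and hyperparameters here are illustrative assumptions, not taken from the course.

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions

    # A batch of two independent Gaussians.
    normal = tfd.Normal(loc=[0., 1.], scale=[1., 2.])
    samples = normal.sample(5)             # shape (5, 2)
    log_probs = normal.log_prob(samples)   # element-wise log densities

    # Trainable distribution: wrap the parameters in Variables. The scale
    # is kept positive by optimising it through an Exp bijector.
    loc = tf.Variable(0.)
    scale = tfp.util.TransformedVariable(1., bijector=tfp.bijectors.Exp())
    model = tfd.Normal(loc=loc, scale=scale)

    data = tf.random.normal([100], mean=2., stddev=0.5)  # toy dataset
    opt = tf.keras.optimizers.Adam(learning_rate=0.1)
    for _ in range(200):
        with tf.GradientTape() as tape:
            # Negative log likelihood of the data under the model.
            nll = -tf.reduce_mean(model.log_prob(data))
        grads = tape.gradient(nll, model.trainable_variables)
        opt.apply_gradients(zip(grads, model.trainable_variables))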
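
For week 2, here is a minimal sketch of a Keras model whose output is itself a distribution, so that the data (aleatoric) uncertainty is learned alongside the predictions. The architecture and sizes are assumptions for illustration, not the assignment's Bayesian CNN; capturing model (epistemic) uncertainty would additionally use weight-uncertainty layers such as tfp.layers.DenseVariational.

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfpl = tfp.layers

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation='relu', input_shape=(1,)),
        # One Dense layer outputs all parameters of the output distribution.
        tf.keras.layers.Dense(tfpl.IndependentNormal.params_size(1)),
        tfpl.IndependentNormal(1),  # model(x) returns a Normal distribution
    ])

    # Train by minimising the negative log likelihood of the labels
    # under the predicted distribution.
    def nll(y_true, y_dist):
        return -y_dist.log_prob(y_true)

    model.compile(optimizer='adam', loss=nll)
    # model.fit(x_train, y_train, epochs=100)  # training data assumed
    # model(x_test).mean() and model(x_test).stddev() then give a
    # prediction together with its uncertainty.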
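
For week 3, a minimal sketch of the bijector API: a TransformedDistribution pushes a simple base distribution through bijectors, sampling forwards and computing log probabilities via the inverse and its log-det-Jacobian. A RealNVP flow as in the assignment composes many such learnable bijectors; this example is deliberately simpler.

    import tensorflow_probability as tfp

    tfd = tfp.distributions
    tfb = tfp.bijectors

    # Chain applies bijectors right-to-left: first Scale, then Shift.
    bijector = tfb.Chain([tfb.Shift(1.), tfb.Scale(2.)])

    base = tfd.Normal(loc=0., scale=1.)
    transformed = tfd.TransformedDistribution(distribution=base,
                                              bijector=bijector)

    x = transformed.sample(3)     # sample the base, then apply the bijector
    lp = transformed.log_prob(x)  # inverts the bijector and applies the
                                  # change of variables formula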
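
Finally, for week 4 and the Capstone, a minimal VAE sketch using TFP's probabilistic layers, on assumed 28x28 binary images rather than the celebrity faces dataset: the encoder's activity regulariser adds the KL divergence term, so the negative log likelihood loss below completes the (negative) ELBO.

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions
    tfpl = tfp.layers

    latent_dim = 2
    prior = tfd.Independent(tfd.Normal(loc=tf.zeros(latent_dim), scale=1.),
                            reinterpreted_batch_ndims=1)

    encoder = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(tfpl.MultivariateNormalTriL.params_size(latent_dim)),
        tfpl.MultivariateNormalTriL(
            latent_dim,
            # Adds KL(q(z|x) || p(z)) to the training loss.
            activity_regularizer=tfpl.KLDivergenceRegularizer(prior)),
    ])

    decoder = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(latent_dim,)),
        tf.keras.layers.Dense(28 * 28),
        tfpl.IndependentBernoulli((28, 28)),  # pixel-wise Bernoulli likelihood
    ])

    vae = tf.keras.Model(encoder.inputs, decoder(encoder.outputs[0]))
    vae.compile(optimizer='adam',
                loss=lambda x, x_dist: -x_dist.log_prob(x))  # reconstruction term
    # vae.fit(x_train, x_train, epochs=10)        # training images assumed
    # New images: decoder(prior.sample(5)).mean()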

Taught by

Dr Kevin Webster

Related Courses

Exploring and Producing Data for Business Decision Making
University of Illinois at Urbana-Champaign via Coursera
What are the Chances? Probability and Uncertainty in Statistics
Johns Hopkins University via Coursera
Probability
Codecademy
Random Walks
Santa Fe Institute via Complexity Explorer
Introduction to Data Science and Basic Statistics for Business
Tecnológico de Monterrey via edX