Constructing Physics-Consistent Neural Networks Using Probability Theory
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
Explore a novel approach to constructing neural networks that inherently obey physical laws in this comprehensive lecture from the Alan Turing Institute. Delve into the connection between probability, physics, and neural networks as the speaker begins with a simple single-layer neural network and applies the central limit theorem in the infinite-width limit. Learn how, in this limit, the network output becomes Gaussian, so that Gaussian process theory applies. Discover how linear operators, including the differential operators that define physical laws, act on Gaussian processes to yield new Gaussian processes. Examine the concept of physics-consistency for Gaussian processes and its implications for infinite neural networks. Understand how to construct neural networks that obey physical laws by choosing activation functions that match particular kernels in the infinite-width limit. Analyze a simple example, the homogeneous 1D Helmholtz equation, and compare it to naive kernels and activations in this insightful 71-minute presentation.
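As a concrete illustration of what physics-consistency means for a Gaussian process, the following minimal sketch (assumed for this summary, not taken from the lecture; the specific kernel and the wavenumber k are illustrative choices) builds a GP prior whose sample paths solve the homogeneous 1D Helmholtz equation u'' + k^2 u = 0 exactly, and verifies one draw with finite differences:

import numpy as np

# Sketch of a "physics-consistent" GP for the homogeneous 1D Helmholtz
# equation u''(x) + k^2 u(x) = 0 (illustrative, not the lecture's code).
# The assumed kernel K(x, x') = cos(k (x - x'))
#                             = cos(kx)cos(kx') + sin(kx)sin(kx')
# gives sample paths of the form a cos(kx) + b sin(kx), each of which
# solves the ODE exactly.

k = 2.0                                   # assumed Helmholtz wavenumber
x = np.linspace(0.0, 2.0 * np.pi, 200)

# The kernel is rank-2, K = Phi @ Phi.T, so an exact GP draw is Phi @ w
# with w ~ N(0, I); this avoids factorising a singular covariance matrix.
Phi = np.stack([np.cos(k * x), np.sin(k * x)], axis=1)
rng = np.random.default_rng(0)
f = Phi @ rng.standard_normal(2)          # one sample path of the GP

# Finite-difference check that the sample obeys u'' + k^2 u = 0;
# the residual sits at the level of the O(h^2) discretisation error.
h = x[1] - x[0]
f_xx = (f[2:] - 2.0 * f[1:-1] + f[:-2]) / h**2
residual = f_xx + k**2 * f[1:-1]
print("max |u'' + k^2 u| =", np.abs(residual).max())

Because the kernel cos(k (x - x')) is annihilated by the Helmholtz operator in each argument, every sample drawn from this prior satisfies the differential equation by construction; this mirrors the lecture's strategy of matching activation functions to such kernels in the infinite-width limit.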
Syllabus
Sascha Ranftl - A connection between probability, physics and neural networks
Taught by
Alan Turing Institute
Related Courses
TensorFlow on Google Cloud (Google Cloud via Coursera)
Deep Learning Fundamentals with Keras (IBM via edX)
Intro to TensorFlow em Português Brasileiro (Google Cloud via Coursera)
TensorFlow on Google Cloud - Français (Google Cloud via Coursera)
Introduction to Neural Networks and PyTorch (IBM via Coursera)