A Connection between Probability, Physics and Neural Networks
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
Explore a novel approach to constructing neural networks that inherently obey physical laws in this lecture from the Alan Turing Institute. Delve into the connection between probability, physics, and neural networks as the speaker illustrates how to exploit this relationship. Begin with a simple single-layer neural network and apply the central limit theorem in the infinite-width limit to obtain a Gaussian output. Investigate the limit network using Gaussian process theory and observe how linear operators, including the differential operators that define physical laws, act on Gaussian processes. Learn how to manipulate the covariance function, or kernel, so that the model obeys physical laws, establishing a physics-consistency condition for Gaussian processes and neural networks. Discover how to construct activation functions that guarantee physics compliance a priori, with approximation errors diminishing as network width increases. Examine simple examples of the homogeneous 1D Helmholtz equation and compare the results with naive kernels and activations in this comprehensive 1-hour 11-minute presentation.
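The Helmholtz example mentioned above lends itself to a short numerical check. The sketch below is not the speaker's code; it assumes an illustrative wavenumber kappa, evaluation grid, and RBF length scale, and uses the stationary kernel cos(kappa (x - x')) as one possible physics-consistent covariance for the homogeneous 1D Helmholtz equation u'' + kappa^2 u = 0. It applies the Helmholtz operator to each kernel in its first argument by finite differences, as one simplified reading of the physics-consistency condition for Gaussian processes.

"""
Minimal numerical sketch (illustration only, not the speaker's code) of the
physics-consistency idea for the homogeneous 1D Helmholtz equation
u'' + kappa^2 u = 0. A Gaussian process is consistent with the equation when
the Helmholtz operator, applied to the covariance function in one argument,
vanishes. The kernel cos(kappa (x - x')) satisfies this exactly, whereas a
naive squared-exponential (RBF) kernel does not. The wavenumber, grid, and
length scale below are illustrative assumptions.
"""
import numpy as np

kappa = 2.0                              # assumed wavenumber
x = np.linspace(0.0, 2.0 * np.pi, 400)   # grid for the first kernel argument
xp = np.linspace(0.0, 2.0 * np.pi, 50)   # grid for the second kernel argument
dx = x[1] - x[0]

def helmholtz_kernel(xa, xb):
    # cos(kappa (x - x')) = cos(kappa x) cos(kappa x') + sin(kappa x) sin(kappa x'):
    # for fixed x' this lies in the solution space of u'' + kappa^2 u = 0.
    return np.cos(kappa * (xa[:, None] - xb[None, :]))

def rbf_kernel(xa, xb, ell=0.7):
    # naive squared-exponential kernel with no physics built in
    return np.exp(-0.5 * (xa[:, None] - xb[None, :]) ** 2 / ell ** 2)

def helmholtz_operator_on_kernel(K, dx):
    # finite-difference estimate of d^2/dx^2 K(x, x') + kappa^2 K(x, x'),
    # applied along the first (x) argument; interior rows only
    K_xx = (K[:-2, :] - 2.0 * K[1:-1, :] + K[2:, :]) / dx ** 2
    return K_xx + kappa ** 2 * K[1:-1, :]

for name, kernel in [("physics-consistent cos kernel", helmholtz_kernel),
                     ("naive RBF kernel", rbf_kernel)]:
    K = kernel(x, xp)
    residual = helmholtz_operator_on_kernel(K, dx)
    print(f"{name}: max |L_x k(x, x')| = {np.abs(residual).max():.3e}")

Running this should print a residual on the order of the finite-difference truncation error for the cosine kernel and an order-one residual for the RBF kernel, mirroring the talk's comparison of physics-consistent constructions against naive kernels and activations.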
Syllabus
Sascha Ranftl - A Connection between Probability, Physics and Neural Networks
Taught by
Alan Turing Institute
Related Courses
TensorFlow on Google Cloud (Google Cloud via Coursera)
Deep Learning Fundamentals with Keras (IBM via edX)
Intro to TensorFlow em Português Brasileiro (Google Cloud via Coursera)
TensorFlow on Google Cloud - Français (Google Cloud via Coursera)
Introduction to Neural Networks and PyTorch (IBM via Coursera)