Probabilistic Inference Using Contraction of Tensor Networks
Offered By: The Julia Programming Language via YouTube
Course Description
Overview
Explore probabilistic inference using tensor network contraction in this 29-minute conference talk from JuliaCon 2024. Dive into the world of reasoning under uncertainty and learn how TensorInference.jl, a Julia package, combines probabilistic graphical models (PGMs) with tensor networks to improve the performance of demanding probabilistic inference tasks. Discover the challenges of exact and approximate inference methods, and understand how tensor networks offer a powerful way to represent the states of complex systems. Gain insights into optimizing contraction sequences, leveraging differentiable programming, and choosing among contraction-order optimizers such as TreeSA, SABipartite, KaHyParBipartite, and GreedyMethod. Learn about the package's support for generic element types, hyper-optimized contraction-order settings, and its use of BLAS routines and GPU acceleration for improved efficiency. Explore applications in AI, medical diagnosis, computer vision, and natural language processing while understanding the potential of exact methods in probabilistic inference.
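For readers who want a concrete starting point, the sketch below follows the usage pattern shown in the TensorInference.jl documentation: load a UAI benchmark instance, build a tensor network model with a chosen contraction-order optimizer, and query it. The specific benchmark instance and the queries shown are illustrative assumptions, not content taken from the talk, and the API may differ between package versions.

```julia
# Minimal sketch of marginal inference with TensorInference.jl,
# assuming the workflow from the package documentation.
using TensorInference

# Load a UAI 2014 benchmark instance shipped as a package artifact
# (the particular instance here is an illustrative choice).
problem  = problem_from_artifact("uai2014", "MAR", "Promedus", 14)
model    = read_model(problem)      # factors and variable cardinalities
evidence = read_evidence(problem)   # observed variable assignments

# Build the tensor network model. The `optimizer` keyword selects the
# contraction-order search; TreeSA is used here, while GreedyMethod(),
# SABipartite(), and KaHyParBipartite() are the alternatives mentioned
# in the talk.
tn = TensorNetworkModel(model; evidence = evidence, optimizer = TreeSA())

probability(tn)   # probability of the evidence
marginals(tn)     # per-variable marginal distributions given the evidence
```

Swapping the optimizer (for example, GreedyMethod() for speed or TreeSA() for lower contraction complexity) changes only the keyword argument, which is the kind of trade-off the talk discusses.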
Syllabus
Probabilistic inference using contraction of tensor networks | Roa-Villescas | JuliaCon 2024
Taught by
The Julia Programming Language
Related Courses
Probabilistic Graphical Models 1: Representation (Stanford University via Coursera)
Probabilistic Graphical Models 2: Inference (Stanford University via Coursera)
Probabilistic Graphical Models 3: Learning (Stanford University via Coursera)
Artificial Intelligence (Udacity)
Probabilistic Graphical Models (Stanford University via Coursera)