Hyper-Optimized Tensor Network Contraction - Simplifications, Applications and Approximations
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore tensor network contraction optimization techniques in this 33-minute conference talk from the Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021 workshop. Delve into hyper-optimized methods based on hypergraph partitioning for building efficient contraction trees. Discover a set of powerful tensor network simplifications designed to facilitate easier contraction. Examine applications in quantum circuit simulation and weighted model counting. Gain insights into extending these concepts to approximate contraction. Learn from Johnnie Gray of the California Institute of Technology as he presents advanced strategies for tackling complex tensor network geometries and improving computational efficiency.
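To make the idea of a contraction tree concrete, here is a minimal sketch in plain NumPy (not the speaker's actual tooling) showing that two contraction orders for the same small tensor network give identical results at very different arithmetic cost, which is exactly what a hyper-optimized contraction tree minimizes:

```python
import numpy as np

# Tiny tensor network: a matrix chain A(2x100) - B(100x100) - C(100x100).
# Every contraction order yields the same tensor, but the number of
# scalar multiplications depends on the order chosen by the tree.
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 100))
B = rng.normal(size=(100, 100))
C = rng.normal(size=(100, 100))

def matmul_cost(m, k, n):
    """Scalar multiplications for an (m x k) @ (k x n) product."""
    return m * k * n

# Tree ((A B) C): absorb the small matrix first.
cost_left = matmul_cost(2, 100, 100) + matmul_cost(2, 100, 100)      # 40_000
# Tree (A (B C)): contract the two large matrices first.
cost_right = matmul_cost(100, 100, 100) + matmul_cost(2, 100, 100)   # 1_020_000

out_left = (A @ B) @ C
out_right = A @ (B @ C)

assert np.allclose(out_left, out_right)  # same result either way
print(cost_left, cost_right)             # 40000 1020000
```

For networks of thousands of tensors the space of trees is enormous, which is why the talk turns to hypergraph partitioning to search it efficiently.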
Syllabus
Introduction
Tensor networks
Example
Contraction trees
Hyperindices
Partition
Partition functions
Hypergraph partitioning
Tensor network simplifications
Rank simplification
Detailed simplifications
Low-rank decompositions
Diagonal hyperindices
Gauge freedom
Hybrid reduction
QAOA
Weighted model counting
Approximate contraction
Conclusions
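The "Low-rank decompositions" item in the syllabus refers to splitting a tensor across a bond and discarding negligible singular values. A generic sketch of that idea with a plain NumPy SVD (an illustration, not the specific routine from the talk):

```python
import numpy as np

# A 50x50 matrix that is secretly rank 3: a bond in a tensor network
# can often be compressed like this with little or no error, shrinking
# the cost of every later contraction that touches it.
rng = np.random.default_rng(1)
M = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 50))

U, s, Vt = np.linalg.svd(M, full_matrices=False)
keep = s > 1e-10 * s[0]        # drop numerically-zero singular values
rank = int(keep.sum())         # recovers the true rank, 3

# Replace M by two thin factors joined by a bond of size `rank`
# instead of the original bond of size 50.
L = U[:, keep] * s[keep]
R = Vt[keep]

assert rank == 3
assert np.allclose(L @ R, M)   # exact here; lossy if we truncated harder
```

Truncating with a looser threshold turns the same operation into the approximate contraction discussed at the end of the talk.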
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Scientific Computing - University of Washington via Coursera
Inquiry Science Learning: Perspectives and Practices 3 - Science Content Survey - Rice University via Coursera
Philosophy and the Sciences: Introduction to the Philosophy of Physical Sciences - University of Edinburgh via Coursera
Natural Sciences - Modern States via Independent
A mathematical way to think about biology - Udemy