Hyper-Optimized Tensor Network Contraction - Simplifications, Applications and Approximations

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Data Science Courses Physical Sciences Courses

Course Description

Overview

Explore tensor network contraction optimization techniques in this 33-minute conference talk from the Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021 workshop. Delve into hyper-optimized methods based on hypergraph partitioning for building efficient contraction trees. Discover a set of powerful tensor network simplifications designed to facilitate easier contraction. Examine applications in quantum circuit simulation and weighted model counting. Gain insights into extending these concepts to approximate contraction. Learn from Johnnie Gray of the California Institute of Technology as he presents advanced strategies for tackling complex tensor network geometries and improving computational efficiency.
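The central idea above — that the order in which a tensor network is contracted (its contraction tree) determines the computational cost — can be seen even in a tiny example. The sketch below uses NumPy's built-in path optimizer as a simple stand-in for the hyper-optimized, hypergraph-partitioning-based search the talk describes; the network shape and sizes are illustrative assumptions, not taken from the talk.

```python
import numpy as np

# A small tensor network: a chain of matrices A-B-C-D.
# Different contraction trees over the same network give very
# different FLOP counts, which is what contraction-order
# optimization exploits.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 100))
B = rng.standard_normal((100, 100))
C = rng.standard_normal((100, 100))
D = rng.standard_normal((100, 2))

# np.einsum_path searches for a low-cost contraction order
# ("greedy" here, a toy stand-in for hyper-optimized search).
path, info = np.einsum_path('ij,jk,kl,lm->im', A, B, C, D,
                            optimize='greedy')

# Perform the contraction reusing the found path.
result = np.einsum('ij,jk,kl,lm->im', A, B, C, D, optimize=path)
print(result.shape)  # (2, 2)
```

For large, irregular networks of the kind discussed in the talk, dedicated tools (e.g. the speaker's own libraries) replace this greedy search with hypergraph-partitioning-driven tree construction.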

Syllabus

Introduction
Tensor networks
Example
Contraction trees
Hyperindices
Partitioning
Partition functions
Hypergraph partitioning
Tensor network simplification
Rank simplification
Detailed simplifications
Low-rank decompositions
Diagonal hyperindices
Gauge freedom
Hybrid reduction
QAOA
Weighted model counting
Approximate contraction
Conclusions


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

Data Science Basics
A Cloud Guru
Introduction to Machine Learning
A Cloud Guru
Address Business Issues with Data Science
CertNexus via Coursera
Advanced Clinical Data Science
University of Colorado System via Coursera
Advanced Data Science Capstone
IBM via Coursera