How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions

Offered By: ChemicalQDevice via YouTube

Tags

Tensor Networks Courses, Machine Learning Courses, GPT-2 Courses, Generative AI Courses, Explainable AI Courses, Model Compression Courses

Course Description

Overview

Explore the application of tensor network substitutions in large language models (LLMs) in this seminar. Delve into matrix product state tensor networks and their role in making natural language processing models more explainable. Discover how tensor networks contribute to model compression, performance improvement, and increased controllability for high-dimensional datasets. Learn about recent work substituting LLM layers with lightweight tensor networks, including Multiverse Computing's compression of LLaMA-2 7B and Terra Quantum's work on GPT-2 Small. Gain insights into specific strategies for re-coding LLMs layer by layer with tensor networks, as detailed in the current literature, and participate in a live question-and-answer session to deepen your understanding of this approach to AI model optimization.
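The layer-by-layer substitution described above typically factors a layer's weight matrix into a chain of small tensor cores. Below is a minimal, self-contained sketch (not code from the seminar) of that idea: it tensor-train-decomposes a GPT-2-Small-sized MLP weight (768 x 3072) via truncated SVDs and reports the reconstruction error and parameter savings. The dimension factorizations, the rank cap, and the random stand-in weight are all illustrative assumptions.

```python
# Illustrative sketch only: decompose one dense weight into tensor-train
# (matrix product operator) cores with truncated SVDs, then measure how
# well the compressed cores reproduce the original matrix.
from math import prod

import torch


def tt_decompose(W, in_dims, out_dims, max_rank):
    """Factor W (prod(in_dims) x prod(out_dims)) into TT/MPO cores.

    Each core k has shape (rank_{k-1}, in_dims[k], out_dims[k], rank_k).
    """
    n = len(in_dims)
    # View W as a 2n-way tensor and interleave axes to (i1, o1, i2, o2, ...)
    # so that each core owns one (input, output) index pair.
    T = W.reshape(*in_dims, *out_dims)
    interleave = [ax for pair in zip(range(n), range(n, 2 * n)) for ax in pair]
    T = T.permute(*interleave).contiguous()

    cores, rank = [], 1
    for k in range(n - 1):
        # Split off one (in_k, out_k) pair and truncate the SVD at max_rank.
        T = T.reshape(rank * in_dims[k] * out_dims[k], -1)
        U, S, Vh = torch.linalg.svd(T, full_matrices=False)
        r = min(max_rank, S.numel())
        cores.append(U[:, :r].reshape(rank, in_dims[k], out_dims[k], r))
        T = S[:r, None] * Vh[:r]        # carry the remainder to the next core
        rank = r
    cores.append(T.reshape(rank, in_dims[-1], out_dims[-1], 1))
    return cores


def tt_to_matrix(cores, in_dims, out_dims):
    """Contract the cores back into a dense matrix to check the error."""
    M = cores[0]
    for core in cores[1:]:
        M = torch.tensordot(M, core, dims=([M.dim() - 1], [0]))
    M = M.squeeze(0).squeeze(-1)        # drop the boundary rank-1 axes
    n = len(in_dims)
    # Undo the (i1, o1, i2, o2, ...) interleaving from tt_decompose.
    M = M.permute(*range(0, 2 * n, 2), *range(1, 2 * n, 2))
    return M.reshape(prod(in_dims), prod(out_dims))


if __name__ == "__main__":
    torch.manual_seed(0)
    # Stand-in for a trained GPT-2 Small MLP weight; real trained weights
    # carry more structure than this random matrix and compress with far
    # less error at the same ranks.
    W = torch.randn(768, 3072) / 768 ** 0.5
    in_dims, out_dims = (8, 8, 12), (12, 16, 16)   # 8*8*12=768, 12*16*16=3072
    cores = tt_decompose(W, in_dims, out_dims, max_rank=64)
    W_tt = tt_to_matrix(cores, in_dims, out_dims)
    tt_params = sum(c.numel() for c in cores)
    rel_err = torch.linalg.norm(W - W_tt) / torch.linalg.norm(W)
    print(f"relative error: {rel_err:.3f}")
    print(f"parameters: {tt_params} (TT) vs {W.numel()} (dense)")
```

At inference one would not rebuild the dense matrix; instead the layer's input is reshaped and contracted with the cores directly, and the cores can be fine-tuned afterward to recover accuracy. That is the kind of layer substitution the seminar discusses.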

Syllabus

How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions


Taught by

ChemicalQDevice

Related Courses

Classical Simulation of Quantum Many-body Systems with Tensor Networks
Simons Institute via YouTube
Quantum Circuits, Cellular Automata and Tensor Networks - Ignacio Cirac
Institute for Advanced Study via YouTube
Tensor Networks and Neural Network States - From Chiral Topological Order to Image Classification
APS Physics via YouTube
Bridging Deep Learning and Many-Body Quantum Physics via Tensor Networks
APS Physics via YouTube
Tensor Networks (QC-DMRG) in a Complete Active Space Coupled Cluster Method
Institute for Pure & Applied Mathematics (IPAM) via YouTube