
How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions

Offered By: ChemicalQDevice via YouTube

Tags

Tensor Networks Courses, Machine Learning Courses, GPT-2 Courses, Generative AI Courses, Explainable AI Courses, Model Compression Courses

Course Description

Overview

Explore the cutting-edge application of tensor network substitutions in Large Language Models (LLMs) through this seminar. Delve into matrix product state tensor networks and their role in making natural language processing more explainable. Discover how tensor networks contribute to model compression, performance improvement, and increased controllability on high-dimensional datasets. Learn about recent advances in substituting LLM layers with lightweight tensor networks, including Multiverse Computing's compression of LLaMA-2 7B and Terra Quantum's work on GPT-2 small. Gain insight into specific strategies from the current literature for re-coding LLMs layer by layer with tensor networks, and take part in a live question-and-answer session to deepen your understanding of this approach to AI model optimization.
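To make the layer-substitution idea concrete, here is a minimal PyTorch sketch, not the seminar's, Multiverse Computing's, or Terra Quantum's actual method: it factors a single nn.Linear weight into a two-core matrix product operator (MPO) via truncated SVD and reproduces the layer's forward pass from the cores. The class name MPOLinear, the factor shapes, and the rank are illustrative assumptions.

```python
# Minimal MPO substitution for one linear layer (illustrative sketch).
import torch
import torch.nn as nn


class MPOLinear(nn.Module):
    """Drop-in substitute for nn.Linear built from two MPO cores."""

    def __init__(self, linear, in_factors, out_factors, rank):
        super().__init__()
        i1, i2 = in_factors
        o1, o2 = out_factors
        assert i1 * i2 == linear.in_features
        assert o1 * o2 == linear.out_features
        # Reshape W (out, in) -> (o1, o2, i1, i2), pair (o1, i1) against
        # (o2, i2), and truncate the SVD to the chosen bond dimension.
        W = linear.weight.data.reshape(o1, o2, i1, i2).permute(0, 2, 1, 3)
        U, S, Vh = torch.linalg.svd(W.reshape(o1 * i1, o2 * i2),
                                    full_matrices=False)
        r = min(rank, S.numel())
        self.core1 = nn.Parameter((U[:, :r] * S[:r]).reshape(o1, i1, r))
        self.core2 = nn.Parameter(Vh[:r].reshape(r, o2, i2))
        self.bias = (nn.Parameter(linear.bias.data.clone())
                     if linear.bias is not None else None)
        self.shapes = (i1, i2, o1, o2)

    def forward(self, x):
        i1, i2, o1, o2 = self.shapes
        batch = x.shape[:-1]
        # Contract the reshaped input with both cores in one einsum.
        y = torch.einsum("bij,aik,kcj->bac",
                         x.reshape(-1, i1, i2), self.core1, self.core2)
        y = y.reshape(*batch, o1 * o2)
        return y if self.bias is None else y + self.bias


# Shapes matching a GPT-2 small MLP up-projection (768 -> 3072).
dense = nn.Linear(768, 3072)
mpo = MPOLinear(dense, in_factors=(32, 24), out_factors=(64, 48), rank=64)
x = torch.randn(2, 768)
core_params = mpo.core1.numel() + mpo.core2.numel()
rel_err = (mpo(x) - dense(x)).norm() / dense(x).norm()
print(f"weight params: {dense.weight.numel()} -> {core_params}")
print(f"relative output error at rank 64: {rel_err:.3f}")
```

The bond dimension (rank) controls the compression/accuracy trade-off, and compression papers in this line typically follow the substitution with a brief retraining pass so the compressed model recovers its original accuracy.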

Syllabus

How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions


Taught by

ChemicalQDevice

Related Courses

Explainable AI: Scene Classification and GradCam Visualization
Coursera Project Network via Coursera
Artificial Intelligence Privacy and Convenience
LearnQuest via Coursera
Natural Language Processing and Capstone Assignment
University of California, Irvine via Coursera
Modern Artificial Intelligence Masterclass: Build 6 Projects
Udemy
Data Science for Business
DataCamp