How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions

Offered By: ChemicalQDevice via YouTube

Tags

Tensor Networks Courses, Machine Learning Courses, GPT-2 Courses, Generative AI Courses, Explainable AI Courses, Model Compression Courses

Course Description

Overview

Explore the application of tensor network substitutions in large language models (LLMs) in this seminar. Delve into matrix product state (MPS) tensor networks and their role in making natural language processing models more explainable. Discover how tensor networks contribute to model compression, performance improvement, and increased controllability on high-dimensional datasets. Learn about recent work substituting LLM layers with lightweight tensor networks, including Multiverse Computing's compression of LLaMA-2 7B and Terra Quantum's work on GPT-2 Small. Gain insight into specific strategies for re-coding LLMs layer by layer with tensor networks, as detailed in the current literature, and take part in a live question-and-answer session to deepen your understanding of this approach to AI model optimization. A small code sketch of the substitution idea follows.
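
To make the layer-substitution idea concrete, the sketch below replaces a dense linear layer with a chain of matrix product state (tensor-train) cores in PyTorch. This is a minimal illustration only: the class name MPSLinear, the mode factorizations, and the bond rank are illustrative choices of ours, not code from the seminar, Multiverse Computing, or Terra Quantum.

import math
import torch
import torch.nn as nn


class MPSLinear(nn.Module):
    """Linear layer whose (d_in x d_out) weight is stored as an MPS
    (tensor-train) chain: d_in = prod(in_modes), d_out = prod(out_modes),
    and core k carries one (in_mode, out_mode) pair plus bond indices
    of size `rank`."""

    def __init__(self, in_modes, out_modes, rank, bias=True):
        super().__init__()
        assert len(in_modes) == len(out_modes)
        bonds = [1] + [rank] * (len(in_modes) - 1) + [1]
        self.cores = nn.ParameterList([
            nn.Parameter(0.02 * torch.randn(bonds[k], m, n, bonds[k + 1]))
            for k, (m, n) in enumerate(zip(in_modes, out_modes))
        ])
        self.bias = nn.Parameter(torch.zeros(math.prod(out_modes))) if bias else None

    def forward(self, x):
        # Contract the cores back into the full weight, then apply it.
        # (A tuned implementation would sweep x through the cores instead.)
        w = self.cores[0]                      # shape (1, m0, n0, rank)
        for core in self.cores[1:]:
            w = torch.einsum('aijb,bkls->aikjls', w, core)
            a, i, k, j, l, s = w.shape
            w = w.reshape(a, i * k, j * l, s)  # merge input/output modes
        w = w.reshape(w.shape[1], w.shape[2])  # (d_in, d_out)
        y = x @ w
        return y if self.bias is None else y + self.bias


if __name__ == "__main__":
    # Hypothetical stand-in for GPT-2 Small's 768 -> 3072 MLP up-projection.
    layer = MPSLinear(in_modes=(8, 8, 12), out_modes=(16, 16, 12), rank=8)
    x = torch.randn(4, 768)
    print(layer(x).shape)  # torch.Size([4, 3072])
    # Dense weight: 768 * 3072 = 2,359,296 parameters; the three rank-8
    # cores hold 1,024 + 8,192 + 1,152 = 10,368.

In the literature, the cores are typically initialized from a decomposition of the trained dense weight and then fine-tuned to recover accuracy, rather than trained from random cores as in this sketch.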

Syllabus

How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions


Taught by

ChemicalQDevice

Related Courses

TensorFlow Lite for Edge Devices - Tutorial
freeCodeCamp
Few-Shot Learning in Production
HuggingFace via YouTube
TinyML Talks Germany - Neural Network Framework Using Emerging Technologies for Screening Diabetic Retinopathy
tinyML via YouTube
TinyML for All: Full-stack Optimization for Diverse Edge AI Platforms
tinyML via YouTube
TinyML Talks - Software-Hardware Co-design for Tiny AI Systems
tinyML via YouTube