How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions
Offered By: ChemicalQDevice via YouTube
Course Description
Overview
Explore the cutting-edge application of tensor network substitutions in Large Language Models (LLMs) through this comprehensive seminar. Delve into matrix product state tensor networks and their role in improving the explainability of natural language processing models. Discover how tensor networks contribute to model compression, performance improvement, and increased controllability for high-dimensional datasets. Learn about recent advancements in substituting LLM layers with lightweight tensor networks, including Multiverse Computing's compression of LLaMA-2 7B and Terra Quantum's work on GPT-2 small. Gain insights into specific strategies for re-coding LLMs layer by layer using tensor networks, as detailed in current literature. Participate in a live question-and-answer session to deepen your understanding of this innovative approach to AI model optimization.
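To make the layer-substitution idea concrete, the sketch below (not code from the seminar) shows the simplest possible case in PyTorch: a dense `nn.Linear` weight is factorized by truncated SVD into two small cores contracted through a bond dimension, which is the two-core special case of a matrix product operator. The names `MPOLinear`, `substitute_linear`, and `bond_dim` are illustrative assumptions; the approaches referenced in the seminar (e.g., Multiverse Computing's and Terra Quantum's) reshape weights into higher-order tensors with more cores.

```python
import torch
import torch.nn as nn

class MPOLinear(nn.Module):
    """Linear layer whose weight is a two-core tensor factorization.

    A dense (d_out x d_in) weight W is replaced by two cores contracted
    through a bond dimension r, cutting parameters from d_out * d_in
    to roughly r * (d_out + d_in) when r is small.
    """
    def __init__(self, d_in: int, d_out: int, bond_dim: int):
        super().__init__()
        self.core_a = nn.Parameter(torch.randn(d_in, bond_dim) * 0.02)
        self.core_b = nn.Parameter(torch.randn(bond_dim, d_out) * 0.02)
        self.bias = nn.Parameter(torch.zeros(d_out))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Contract the input with each core in turn; the full dense
        # weight matrix is never materialized.
        return x @ self.core_a @ self.core_b + self.bias

def substitute_linear(layer: nn.Linear, bond_dim: int) -> MPOLinear:
    """Replace a trained nn.Linear with an MPOLinear via truncated SVD."""
    d_out, d_in = layer.weight.shape
    u, s, vh = torch.linalg.svd(layer.weight.data, full_matrices=False)
    r = min(bond_dim, s.numel())
    mpo = MPOLinear(d_in, d_out, r)
    sqrt_s = s[:r].sqrt()
    # Split the singular values between the two cores so that
    # core_a @ core_b approximates W.T (the matrix applied in forward).
    mpo.core_a.data = vh[:r].T * sqrt_s                    # (d_in, r)
    mpo.core_b.data = sqrt_s.unsqueeze(1) * u[:, :r].T     # (r, d_out)
    mpo.bias.data = layer.bias.data.clone()
    return mpo

# Usage: swap one layer and compare parameter counts.
dense = nn.Linear(768, 768)
compressed = substitute_linear(dense, bond_dim=64)
print(sum(p.numel() for p in dense.parameters()),
      sum(p.numel() for p in compressed.parameters()))
```

Choosing the bond dimension trades accuracy against compression: a small r discards the weight's minor singular directions, which is why such substitutions are typically followed by a brief fine-tuning ("healing") pass.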
Syllabus
How to Re-Code LLMs Layer by Layer with Tensor Network Substitutions
Taught by
ChemicalQDevice
Related Courses
Generating New Recipes using GPT-2 (Coursera Project Network via Coursera)
Deep Learning NLP: Training GPT-2 from scratch (Coursera Project Network via Coursera)
Artificial Creativity (Parsons School of Design via Coursera)
Coding Train Late Night - GPT-2, Hue Lights, Discord Bot (Coding Train via YouTube)
Coding Train Late Night - Fetch, GPT-2 and RunwayML (Coding Train via YouTube)