Fine-tuning Optimizations - DoRA, NEFT, LoRA+, and Unsloth

Offered By: Trelis Research via YouTube

Tags

LoRA (Low-Rank Adaptation) Courses, Machine Learning Courses, Neural Networks Courses, Transformers Courses, Fine-Tuning Courses

Course Description

Overview

Explore advanced fine-tuning optimization techniques for large language models in this comprehensive video tutorial. Delve into the intricacies of LoRA (Low-Rank Adaptation) and its improvements, including DoRA (Weight-Decomposed Low-Rank Adaptation), NEFT (Noisy Embeddings for Fine-Tuning), LoRA+, and Unsloth. Learn how these methods work, their advantages, and practical implementations through detailed explanations and notebook walk-throughs. Compare the effectiveness of each technique and gain insight into choosing the best approach for your fine-tuning needs. Access the provided resources, including GitHub repositories, slides, and research papers, to further deepen your understanding and application of these optimization strategies.
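The techniques named above map onto a small number of switches in common open-source tooling. As a rough illustration (not taken from the video's own notebooks), the sketch below toggles LoRA, DoRA, and NEFT-style noisy embeddings with the Hugging Face peft and transformers libraries; the model name and hyperparameters are placeholders. A companion sketch for LoRA+ and Unsloth follows the syllabus below.

```python
# Minimal sketch (not from the video's notebooks): toggling LoRA, DoRA,
# and NEFT-style noisy embeddings with Hugging Face peft + transformers.
# The model name and hyperparameters are illustrative placeholders.
from transformers import AutoModelForCausalLM, TrainingArguments
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")

lora_config = LoraConfig(
    r=16,                       # rank of the low-rank update matrices A and B
    lora_alpha=32,              # scaling applied to the LoRA update
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    use_dora=True,              # False -> plain LoRA; True -> DoRA, which also
                                # learns a separate magnitude for each weight column
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()

# NEFT adds uniform noise to the embedding outputs during training only.
# Recent transformers versions expose this directly as a training argument.
training_args = TrainingArguments(
    output_dir="outputs",
    learning_rate=2e-4,
    per_device_train_batch_size=4,
    neftune_noise_alpha=5.0,    # None disables NEFT; around 5 is a common setting
)
```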

Syllabus

Improving on LoRA
Video Overview
How does LoRA work?
Understanding DoRA
NEFT - Adding Noise to Embeddings
LoRA Plus
Unsloth for fine-tuning speedups
Comparing LoRA+, Unsloth, DoRA, NEFT
Notebook Setup and LoRA
DoRA Notebook Walk-through
NEFT Notebook Example
LoRA Plus
Unsloth
Final Recommendation
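For the two remaining methods in the syllabus, the sketch below illustrates the LoRA+ learning-rate trick and the Unsloth loading path. It is a rough illustration rather than the video's notebook code: LoRA+ is implemented here with plain PyTorch parameter groups, and the model names, ratio, and LoRA settings are placeholders.

```python
# Minimal sketch (not from the video's notebooks): LoRA+ and Unsloth.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# --- LoRA+ ---------------------------------------------------------------
# Keep ordinary LoRA adapters, but train the lora_B matrices with a much
# higher learning rate than the lora_A matrices (the LoRA+ paper suggests a
# ratio around 16), implemented here with PyTorch parameter groups.
base = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")
peft_model = get_peft_model(
    base,
    LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"),
)

base_lr, lr_ratio = 2e-4, 16
a_params = [p for n, p in peft_model.named_parameters() if "lora_A" in n and p.requires_grad]
b_params = [p for n, p in peft_model.named_parameters() if "lora_B" in n and p.requires_grad]
optimizer = torch.optim.AdamW(
    [
        {"params": a_params, "lr": base_lr},
        {"params": b_params, "lr": base_lr * lr_ratio},  # higher LR for B matrices
    ]
)

# --- Unsloth -------------------------------------------------------------
# Unsloth wraps the same LoRA workflow in fused kernels for faster training
# and lower memory use; its loader replaces the transformers calls above.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-2-7b-bnb-4bit",  # placeholder 4-bit checkpoint
    max_seq_length=2048,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
```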


Taught by

Trelis Research

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent