
Fine-tuning Optimizations - DoRA, NEFT, LoRA+, and Unsloth

Offered By: Trelis Research via YouTube

Tags

LoRA (Low-Rank Adaptation) Courses
Machine Learning Courses
Neural Networks Courses
Transformers Courses
Fine-Tuning Courses

Course Description

Overview

Explore advanced fine-tuning optimization techniques for large language models in this video tutorial. Delve into LoRA (Low-Rank Adaptation) and its refinements, including DoRA (Weight-Decomposed Low-Rank Adaptation), NEFT (Noisy Embedding Fine-Tuning), and LoRA+, along with Unsloth for faster training. Learn how these methods work, what advantages they offer, and how to apply them in practice through detailed explanations and notebook walk-throughs. Compare the effectiveness of each technique and gain insight into choosing the best approach for your fine-tuning needs. Access the provided resources, including GitHub repositories, slides, and research papers, to deepen your understanding and application of these optimization strategies.
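
To make the overview concrete, here is a minimal sketch (not taken from the video's own notebooks) of how DoRA and NEFT are typically switched on with the Hugging Face PEFT and TRL libraries; the base model, dataset, and hyperparameters below are illustrative assumptions only.

# Minimal DoRA + NEFT sketch using Hugging Face PEFT and TRL.
# Model name, dataset, and hyperparameters are placeholders, not the video's settings.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Placeholder dataset, chosen only so the example runs end to end.
dataset = load_dataset("imdb", split="train[:1%]")

# Standard LoRA adapter config; use_dora=True (peft >= 0.9) turns on DoRA,
# which decomposes each weight update into magnitude and direction components.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    use_dora=True,
    task_type="CAUSAL_LM",
)

# NEFT adds uniform noise to the embedding outputs during training;
# neftune_noise_alpha controls the noise scale (5 is a commonly used value).
training_args = SFTConfig(
    output_dir="lora-out",
    per_device_train_batch_size=2,
    num_train_epochs=1,
    neftune_noise_alpha=5.0,
)

trainer = SFTTrainer(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # illustrative base model
    args=training_args,
    train_dataset=dataset,
    peft_config=peft_config,
)
trainer.train()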

Syllabus

Improving on LoRA
Video Overview
How does LoRA work?
Understanding DoRA
NEFT - Adding Noise to Embeddings
LoRA Plus
Unsloth for fine-tuning speedups
Comparing LoRA+, Unsloth, DoRA, NEFT
Notebook Setup and LoRA
DoRA Notebook Walk-through
NEFT Notebook Example
LoRA+ Notebook Example
Unsloth Notebook Example (a setup sketch for both follows this syllabus)
Final Recommendation
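
As a companion to the LoRA+ and Unsloth notebook segments listed above, the following rough sketch shows how each is commonly set up. It is not the code from the video: create_loraplus_optimizer requires a recent PEFT release, and the Unsloth checkpoint name, learning rates, and LoRA hyperparameters here are assumptions for illustration.

# Rough LoRA+ and Unsloth setup sketch; names and hyperparameters are illustrative.
import bitsandbytes as bnb
from peft.optimizers import create_loraplus_optimizer  # available in recent peft releases
from unsloth import FastLanguageModel

# Unsloth: load a 4-bit base model with its optimized kernels, then attach LoRA adapters.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-2-7b-bnb-4bit",  # illustrative checkpoint
    max_seq_length=2048,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=32,
    lora_dropout=0.0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# LoRA+: train the LoRA B matrices with a larger learning rate than the A matrices.
# loraplus_lr_ratio is the B-to-A learning-rate ratio (16 is a typical starting point).
optimizer = create_loraplus_optimizer(
    model=model,
    optimizer_cls=bnb.optim.AdamW8bit,
    lr=2e-4,
    loraplus_lr_ratio=16,
)
# The resulting optimizer (plus a scheduler) can then be passed to a Trainer
# through its optimizers=(optimizer, scheduler) argument.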


Taught by

Trelis Research

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX