LaMini-LM - Mini Models Maxi Data
Offered By: Sam Witteveen via YouTube
Course Description
Overview
Explore the creation of LaMini-LM, a collection of distilled language models trained on large-scale instructions, in this informative video. Dive into the key ideas, dataset creation process, and model training methodology outlined in the research paper. Examine the diverse range of trained models, including LaMini-Neo-1.3B, LaMini-GPT-1.5B, and LaMini-Flan-T5-783M. Learn about the Hugging Face dataset used, and watch demonstrations of prompts on ChatGPT. Gain practical insights through code examples, and use the provided Colab notebooks for hands-on experimentation with these mini models trained on maxi data.
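As a taste of the hands-on portion, here is a minimal sketch of querying one of the released LaMini checkpoints with the Hugging Face `transformers` library. The model name `MBZUAI/LaMini-Flan-T5-783M` is one of the checkpoints the authors published; the `build_prompt` and `ask` helpers are illustrative, not part of any official API, and the video's own notebooks may structure this differently.

```python
def build_prompt(instruction: str) -> str:
    # LaMini models are instruction-tuned, so a bare natural-language
    # instruction serves as the prompt.
    return instruction.strip()

def ask(instruction: str, model_name: str = "MBZUAI/LaMini-Flan-T5-783M") -> str:
    # Imported here so the helper above stays usable without transformers installed.
    from transformers import pipeline

    # Flan-T5 is an encoder-decoder model, hence the text2text-generation task.
    generator = pipeline("text2text-generation", model=model_name)
    result = generator(build_prompt(instruction), max_length=128)
    return result[0]["generated_text"]

# Example (downloads the ~3 GB model weights on first run):
# print(ask("Explain knowledge distillation in one sentence."))
```

The decoder-only variants (e.g. LaMini-GPT-1.5B) would instead use the `text-generation` task and typically wrap the instruction in their chat-style prompt template.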
Syllabus
Intro
Key Idea
Diagram
Dataset
Hugging Face Dataset
Trained on a lot of Models
Paper
Prompts on ChatGPT
Code Time
Taught by
Sam Witteveen
Related Courses
How Google does Machine Learning en Español (Google Cloud via Coursera)
Creating Custom Callbacks in Keras (Coursera Project Network via Coursera)
Automatic Machine Learning with H2O AutoML and Python (Coursera Project Network via Coursera)
AI in Healthcare Capstone (Stanford University via Coursera)
AutoML con Pycaret y TPOT (Coursera Project Network via Coursera)