Few-Shot Learning in Production

Offered By: Hugging Face via YouTube

Tags

Few-shot Learning Courses, Quantization Courses, Model Deployment Courses, Language Models Courses, Model Compression Courses, Sentence Transformers Courses

Course Description

Overview

Explore few-shot learning techniques for production environments in this comprehensive workshop presented by researchers from Hugging Face and Intel AI. Learn to train Sentence Transformers using SetFit, a powerful technique for scenarios with limited labeled data. Discover methods for compressing models through knowledge distillation and accelerating inference using quantization with 🤗 Optimum and Intel® Neural Compressor. Gain insights into deploying models efficiently with Inference Endpoints. Dive into topics such as few-shot learning fundamentals, benchmarks, use cases, and various training techniques. Follow along with practical demonstrations using notebooks and explore a real-world application on news data. By the end of this 1 hour and 22 minute session, acquire valuable skills for implementing few-shot learning in production environments.
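The SetFit workflow covered in the workshop can be sketched roughly as follows. This is a minimal, illustrative example assuming the `setfit` and `datasets` libraries; the dataset (sst2), checkpoint, and 8-examples-per-class setting are placeholder choices for demonstration, not necessarily those used in the session.

```python
# Minimal SetFit few-shot training sketch (illustrative choices, not the workshop's exact setup).
from datasets import load_dataset
from setfit import SetFitModel, SetFitTrainer, sample_dataset

# Load a labelled dataset and simulate the few-shot regime by keeping
# only 8 labelled examples per class for training.
dataset = load_dataset("sst2")
train_ds = sample_dataset(dataset["train"], label_column="label", num_samples=8)
eval_ds = dataset["validation"]

# Start from a pretrained Sentence Transformer checkpoint.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

trainer = SetFitTrainer(
    model=model,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    column_mapping={"sentence": "text", "label": "label"},  # SetFit expects "text"/"label" columns
)
trainer.train()          # contrastive fine-tuning of the body, then fits the classification head
print(trainer.evaluate())  # e.g. {"accuracy": ...}
```

Newer `setfit` releases replace `SetFitTrainer` with `Trainer`/`TrainingArguments`, but the overall flow is the same: contrastively fine-tune the Sentence Transformer body on pairs built from the few labelled examples, then train a lightweight classification head on the resulting embeddings.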

Syllabus

Introduction
What is Few-shot Learning
TPU
SetFit
Summary
Benchmarks
Use cases
Techniques
Knowledge distillation
Quantization
Training Sets
Random News
Notebook
Deployment


Taught by

Hugging Face

Related Courses

Semantic Search for AI - Testing Out Qdrant Neural Search
David Shapiro ~ AI via YouTube
How to Use OpenAI Whisper to Fix YouTube Search
James Briggs via YouTube
Spotify's Podcast Search Explained
James Briggs via YouTube
Is GPL the Future of Sentence Transformers - Generative Pseudo-Labeling Deep Dive
James Briggs via YouTube
Train Sentence Transformers by Generating Queries - GenQ
James Briggs via YouTube