Open Assistant Inference Backend Development
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Dive into a hands-on coding session focused on building streaming inference into the Hugging Face text generation server. Explore various technologies including CUDA, Python, Rust, gRPC, WebSockets, and server-sent events. Follow along as the development process unfolds, covering the integration of Open Assistant with the Hugging Face infrastructure. Gain insights into MLOps practices and learn how to implement advanced features in AI-powered text generation systems. Access the original Hugging Face text generation inference repository and the Open Assistant repository for reference during the coding session. Discover free MLOps courses to further enhance your skills in machine learning operations.
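The session centers on streaming generated tokens to clients as they are produced, for example via server-sent events (SSE). As a rough illustration of that pattern, here is a minimal Python sketch of a client consuming an SSE token stream; the endpoint path, port, and JSON payload shape are assumptions for illustration, not the exact API built in the video.

```python
# Minimal sketch: consuming a token stream over server-sent events (SSE).
# The URL and payload fields below are hypothetical, used only to show the pattern.

import json
import requests


def stream_generation(prompt: str, url: str = "http://localhost:8080/generate_stream"):
    """Yield generated tokens as they arrive from a streaming text-generation endpoint."""
    payload = {"inputs": prompt, "parameters": {"max_new_tokens": 64}}
    with requests.post(url, json=payload, stream=True) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue  # SSE events are separated by blank lines
            decoded = line.decode("utf-8")
            if decoded.startswith("data:"):
                event = json.loads(decoded[len("data:"):].strip())
                # Assumed payload shape: {"token": {"text": "..."}}
                yield event["token"]["text"]


if __name__ == "__main__":
    for token in stream_generation("Hello, Open Assistant!"):
        print(token, end="", flush=True)
```

Streaming each token as it is generated, rather than waiting for the full completion, is what gives chat-style frontends their responsive, incremental output.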
Syllabus
Open Assistant Inference Backend Development (Hands-On Coding)
Taught by
Yannic Kilcher
Related Courses
Hugging Face on Azure - Partnership and Solutions Announcement
Microsoft via YouTube
Question Answering in Azure AI - Custom and Prebuilt Solutions - Episode 49
Microsoft via YouTube
Open Source Platforms for MLOps
Duke University via Coursera
Masked Language Modelling - Retraining BERT with Hugging Face Trainer - Coding Tutorial
rupert ai via YouTube
Masked Language Modelling with Hugging Face - Microsoft Sentence Completion - Coding Tutorial
rupert ai via YouTube