Open Assistant Inference Backend Development
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Dive into a hands-on coding session focused on building streaming inference into the Hugging Face text generation server. Explore various technologies including CUDA, Python, Rust, gRPC, WebSockets, and server-sent events. Follow along as the development process unfolds, covering the integration of Open Assistant with the Hugging Face infrastructure. Gain insights into MLOps practices and learn how to implement advanced features in AI-powered text generation systems. Access the original Hugging Face text generation inference repository and the Open Assistant repository for reference during the coding session. Discover free MLOps courses to further enhance your skills in machine learning operations.
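To make the streaming piece concrete, below is a minimal sketch of how a client might consume a token stream over server-sent events from a text generation server, in the spirit of what the session builds. The endpoint path (/generate_stream), request payload, and response shape are assumptions for illustration only, not the repository's confirmed API.

```python
# Hypothetical sketch: consuming a server-sent-events token stream from a
# text generation server. Endpoint path and JSON fields are assumed, not
# taken from the actual Hugging Face text generation inference codebase.
import json
import requests

def stream_tokens(base_url: str, prompt: str, max_new_tokens: int = 64):
    """Yield generated text chunks as they arrive over SSE."""
    resp = requests.post(
        f"{base_url}/generate_stream",  # assumed streaming endpoint
        json={"inputs": prompt,
              "parameters": {"max_new_tokens": max_new_tokens}},
        stream=True,
        timeout=60,
    )
    resp.raise_for_status()
    for raw_line in resp.iter_lines():
        if not raw_line:
            continue  # SSE separates events with blank lines
        line = raw_line.decode("utf-8")
        if line.startswith("data:"):
            payload = json.loads(line[len("data:"):].strip())
            # assumed response shape: {"token": {"text": "..."}}
            yield payload.get("token", {}).get("text", "")

if __name__ == "__main__":
    for chunk in stream_tokens("http://localhost:8080", "Hello, Open Assistant!"):
        print(chunk, end="", flush=True)
```

The same idea carries over to WebSockets or gRPC streaming: the server pushes partial generations as they are produced instead of returning one final completion.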
Syllabus
Open Assistant Inference Backend Development (Hands-On Coding)
Taught by
Yannic Kilcher
Related Courses
Web Services Development in Golang, Part 2 (Разработка веб-сервисов на Golang, часть 2)
Moscow Institute of Physics and Technology via Coursera
Managing Cloud Run gRPC Services with API Gateway
Google Cloud via Coursera
Beginner's Guide to Go Protocol Buffer
LinkedIn Learning
Building Java Microservices with gRPC
LinkedIn Learning
gRPC in Python
LinkedIn Learning