Streaming for LangChain Agents and FastAPI
Offered By: James Briggs via YouTube
Course Description
Overview
Learn how to implement streaming for LangChain Agents and serve it through FastAPI in this comprehensive 28-minute tutorial. Progress from basic LangChain streaming to advanced techniques, including simple terminal streaming with LLMs, parsing stream outputs using Async Iterator streaming, and integrating with OpenAI's GPT-3.5-turbo model via LangChain's ChatOpenAI object. Explore custom callback handlers, FastAPI integration, and essential considerations for deploying streaming in production. Access accompanying code notebooks and FastAPI template code to enhance your learning experience and quickly apply these concepts in real-world scenarios.
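The terminal streaming the course opens with rests on one mechanism: when streaming is enabled, the LLM fires a callback once per generated token, and a handler (such as LangChain's `StreamingStdOutCallbackHandler`) prints each token as it arrives. The sketch below illustrates that pattern with a stand-in handler and a fake LLM; the class and function names are hypothetical, not LangChain's actual API.

```python
# Illustrative sketch (not LangChain code): the token-callback pattern behind
# simple StdOut streaming. A real ChatOpenAI(streaming=True, callbacks=[...])
# call would invoke on_llm_new_token() for every token it generates.

class StdOutTokenHandler:
    """Minimal stand-in for a streaming callback handler."""
    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Fires once per token; print immediately instead of waiting
        # for the full completion.
        self.tokens.append(token)
        print(token, end="", flush=True)

def fake_llm_stream(prompt: str, handler: StdOutTokenHandler) -> str:
    # Stand-in for a streaming LLM call: emit word-level "tokens".
    tokens = [word + " " for word in prompt.split()]
    for token in tokens:
        handler.on_llm_new_token(token)
    return "".join(tokens)

handler = StdOutTokenHandler()
result = fake_llm_stream("streaming tokens one by one", handler)
```

The same shape carries through the rest of the course: more elaborate handlers decide *which* tokens to surface (for example, only the agent's final answer) rather than printing everything.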
Syllabus
Streaming for LLMs and Agents
Simple StdOut Streaming in LangChain
Streaming with LangChain Agents
Final Output Streaming
Custom Callback Handlers in LangChain
FastAPI with LangChain Agent Streaming
Confirming we have Agent Streaming
Custom Callback Handlers for Async
Final Things to Consider
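The FastAPI portion of the syllabus hinges on bridging the callback world to an HTTP streaming response: an async callback handler pushes each token onto a queue, and an async generator drains that queue so the web framework can stream chunks to the client. The sketch below shows that queue-bridge pattern with stdlib asyncio only; the names (`AsyncQueueTokenHandler`, `fake_agent_run`) are illustrative assumptions, and in a real app the generator would be wrapped in FastAPI's `StreamingResponse`.

```python
# Illustrative sketch (stdlib only): an async callback handler feeding an
# asyncio.Queue, drained by an async generator -- the bridge typically used
# to stream agent tokens through a FastAPI StreamingResponse.
import asyncio

class AsyncQueueTokenHandler:
    """Stand-in for an async streaming callback handler."""
    def __init__(self):
        self.queue = asyncio.Queue()
        self.done = object()  # sentinel marking end of generation

    async def on_llm_new_token(self, token: str, **kwargs) -> None:
        await self.queue.put(token)

    async def on_llm_end(self, *args, **kwargs) -> None:
        await self.queue.put(self.done)

async def fake_agent_run(prompt: str, handler: AsyncQueueTokenHandler) -> None:
    # Stand-in for an agent call that streams word-level "tokens".
    for word in prompt.split():
        await handler.on_llm_new_token(word + " ")
    await handler.on_llm_end()

async def token_stream(handler: AsyncQueueTokenHandler):
    # In FastAPI, this generator is what StreamingResponse would consume.
    while True:
        item = await handler.queue.get()
        if item is handler.done:
            break
        yield item

async def main():
    handler = AsyncQueueTokenHandler()
    producer = asyncio.create_task(fake_agent_run("final answer streamed", handler))
    chunks = [chunk async for chunk in token_stream(handler)]
    await producer
    return chunks

chunks = asyncio.run(main())
```

The sentinel object is the key design choice: without it the consumer has no way to know the agent has finished, and the HTTP response would hang open waiting for more tokens.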
Taught by
James Briggs
Related Courses
AWS Certified Machine Learning - Specialty (LA) (A Cloud Guru)
Google Cloud AI Services Deep Dive (A Cloud Guru)
Introduction to Machine Learning (A Cloud Guru)
Deep Learning and Python Programming for AI with Microsoft Azure (Cloudswyft via FutureLearn)
Advanced Artificial Intelligence on Microsoft Azure: Deep Learning, Reinforcement Learning and Applied AI (Cloudswyft via FutureLearn)