Guide to the LLM Ecosystem: Hugging Face, GPUs, OpenAI, LangChain and More - Lecture 2
Offered By: Data Centric via YouTube
Course Description
Overview
Dive into the complex world of Large Language Models (LLMs) with this comprehensive lecture from the AI Engineering Take-off Course. Gain clarity on key concepts in the LLM ecosystem, including Hugging Face, GPU infrastructure, OpenAI, and LangChain. Explore the fundamentals of how LLMs work, understand the role each component plays in the ecosystem, and build the foundational knowledge needed to develop LLM applications. Follow along with detailed chapters covering infrastructure and hardware, proprietary LLMs, inference servers, app development frameworks, and frontend considerations. Complement your learning with additional resources, including a related blog post and links to other helpful content on AI, Data Science, and LLM development.
Syllabus
Intro
The Ecosystem
All about LLMs
Infrastructure & Hardware
Hugging Face
Proprietary LLMs - OpenAI
Inference Server
App Dev Frameworks
Frontend
Taught by
Data Centric
Related Courses
Prompt Templates for GPT-3.5 and Other LLMs - LangChain (James Briggs via YouTube)
Getting Started with GPT-3 vs. Open Source LLMs - LangChain (James Briggs via YouTube)
Chatbot Memory for Chat-GPT, Davinci + Other LLMs - LangChain (James Briggs via YouTube)
Chat in LangChain (James Briggs via YouTube)
LangChain Data Loaders, Tokenizers, Chunking, and Datasets - Data Prep (James Briggs via YouTube)