Guide to the LLM Ecosystem: Hugging Face, GPUs, OpenAI, LangChain and More - Lecture 2
Offered By: Data Centric via YouTube
Course Description
Overview
Dive into the complex world of Large Language Models (LLMs) with this comprehensive lecture from the AI Engineering Take-off Course. Gain clarity on key concepts in the LLM ecosystem, including Hugging Face, GPU infrastructure, OpenAI, and LangChain. Explore the fundamentals of how LLMs work, understand the role each component plays in the ecosystem, and build the essential knowledge needed to develop LLM applications. Follow along with detailed chapters covering topics such as infrastructure and hardware, proprietary LLMs, inference servers, app development frameworks, and frontend considerations. Complement your learning with additional resources, including a related blog post and links to other helpful content on AI, data science, and LLM development.
Syllabus
Intro
The Ecosystem
All about LLMs
Infrastructure & Hardware
Hugging Face
Proprietary LLMs (OpenAI)
Inference Servers
App Dev Frameworks
Frontend
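
The components named in the syllabus tend to appear together in everyday LLM application code. As a rough illustration of how those pieces fit, and not an excerpt from the lecture itself, the sketch below runs an open model locally through the Hugging Face transformers library and then calls a proprietary OpenAI model through LangChain. The model names are placeholder assumptions, the packages (transformers, torch, langchain-openai) must be installed, and the second part assumes a valid OPENAI_API_KEY in the environment.

# Illustrative sketch only; model names are placeholder assumptions.
# Requires: pip install transformers torch langchain-openai

# Hugging Face: run a small open model locally (hardware permitting)
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # tiny model chosen purely for demo purposes
print(generator("The LLM ecosystem includes", max_new_tokens=20)[0]["generated_text"])

# Proprietary LLMs (OpenAI) accessed through an app dev framework (LangChain)
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # hosted model, so no local GPU is needed
response = llm.invoke("In one sentence, what does an inference server do?")
print(response.content)

The local pipeline shows why the lecture pairs infrastructure and hardware with Hugging Face (you host the weights yourself), while the hosted call shows the trade-off with proprietary APIs, where the provider runs the inference server for you.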
Taught by
Data Centric
Related Courses
Building Document Intelligence Applications with Azure Applied AI and Azure Cognitive Services (Microsoft via YouTube)
Unlocking the Power of OpenAI for Startups - Microsoft for Startups (Microsoft via YouTube)
AI Show - Ignite Recap: Arc-Enabled ML, Language Services, and OpenAI (Microsoft via YouTube)
Building Intelligent Applications with World-Class AI (Microsoft via YouTube)
Build an AI Image Generator with OpenAI & Node.js (Traversy Media via YouTube)