Guide to the LLM Ecosystem: Hugging Face, GPUs, OpenAI, LangChain and More - Lecture 2
Offered By: Data Centric via YouTube
Course Description
Overview
Dive into the complex world of Large Language Models (LLMs) with this comprehensive lecture from the AI Engineering Take-off Course. Gain clarity on key concepts in the LLM ecosystem, including Hugging Face, GPU infrastructure, OpenAI, and LangChain. Explore the fundamentals of how LLMs work, understand the role of different components in the ecosystem, and learn essential knowledge for developing LLM applications. Follow along with detailed chapters covering topics such as infrastructure and hardware, proprietary LLMs, inference servers, app development frameworks, and frontend considerations. Complement your learning with additional resources, including a related blog post and links to other helpful content on AI, Data Science, and LLM development.
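The lecture positions app-dev frameworks like LangChain as the glue between prompts, models, and outputs. As a rough illustration of the pattern such frameworks abstract, here is a minimal, self-contained Python sketch; `fake_llm` is a hypothetical stand-in for a real model call (an OpenAI API request or a Hugging Face pipeline), used so the example runs without API keys or model downloads.

```python
# Sketch of the template -> model -> parse pipeline that frameworks
# such as LangChain package up. Everything here is a simplified,
# illustrative stand-in, not the real library API.

def fake_llm(prompt: str) -> str:
    # Hypothetical model call: a real implementation would send `prompt`
    # to a hosted API (e.g. OpenAI) or a local inference server.
    return f"ANSWER: {prompt.splitlines()[-1]}"

def build_prompt(template: str, **kwargs) -> str:
    # Prompt templating: substitute user input into a reusable template.
    return template.format(**kwargs)

def chain(question: str) -> str:
    # "Chain" the steps: fill the template, call the model, parse output.
    template = "You are a helpful assistant.\nQuestion: {question}"
    prompt = build_prompt(template, question=question)
    raw = fake_llm(prompt)
    return raw.removeprefix("ANSWER: ")

print(chain("What does an inference server do?"))
```

In a real application, the stand-in model call would hit GPU-backed infrastructure, which is why the lecture treats hardware, inference servers, and frameworks as parts of one ecosystem.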
Syllabus
Intro
The Ecosystem
All about LLMs
Infrastructure & Hardware
Hugging Face
Proprietary LLMs: OpenAI
Inference Server
App Dev Frameworks
Frontend
Taught by
Data Centric
Related Courses
Build a Natural Language Processing Solution with Microsoft Azure - Pluralsight
Challenges and Solutions in Industry Scale Data and AI Systems - Yangqing Jia - Association for Computing Machinery (ACM) via YouTube
An AI Engineer Guide to Comet Model Registry Platform - Prodramp via YouTube
An AI Engineer Guide to Model Monitoring with Comet ML Platform - Prodramp via YouTube
An AI Engineer's Guide to Machine Learning with Keras - Prodramp via YouTube