Guide to the LLM Ecosystem: Hugging Face, GPUs, OpenAI, LangChain and More - Lecture 2
Offered By: Data Centric via YouTube
Course Description
Overview
Dive into the complex world of Large Language Models (LLMs) with this comprehensive lecture from the AI Engineering Take-off Course. Gain clarity on key concepts in the LLM ecosystem, including Hugging Face, GPU infrastructure, OpenAI, and LangChain. Explore the fundamentals of how LLMs work, understand the role each component plays in the ecosystem, and build the essential knowledge needed to develop LLM applications. Follow along with detailed chapters covering infrastructure and hardware, proprietary LLMs, inference servers, app development frameworks, and frontend considerations. Complement your learning with additional resources, including a related blog post and links to other helpful content on AI, Data Science, and LLM development.
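As a quick illustration of one piece of the ecosystem the lecture covers (not taken from the lecture itself), the minimal sketch below pulls an open model from the Hugging Face Hub and generates text locally with the transformers library. The model name "gpt2" and the prompt are placeholder choices; any causal language model on the Hub could be substituted, and a GPU only becomes necessary for larger models.

```python
# Minimal sketch (illustrative only): run an open model from the Hugging Face Hub.
# "gpt2" is an arbitrary small placeholder model, not one named in the lecture.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
output = generator("Large Language Models are", max_new_tokens=20)
print(output[0]["generated_text"])
```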
Syllabus
Intro
The Ecosystem
All about LLMs
Infrastructure & Hardware
Hugging Face
Proprietary LLMs (OpenAI)
Inference Servers
App Dev Frameworks
Frontend
Taught by
Data Centric
Related Courses
Hugging Face on Azure - Partnership and Solutions Announcement (Microsoft via YouTube)
Question Answering in Azure AI - Custom and Prebuilt Solutions - Episode 49 (Microsoft via YouTube)
Open Source Platforms for MLOps (Duke University via Coursera)
Masked Language Modelling - Retraining BERT with Hugging Face Trainer - Coding Tutorial (rupert ai via YouTube)
Masked Language Modelling with Hugging Face - Microsoft Sentence Completion - Coding Tutorial (rupert ai via YouTube)