Large Language Models: Application through Production
Offered By: Databricks via edX
Course Description
Overview
This course is aimed at developers, data scientists, and engineers looking to build LLM-centric applications with the latest and most popular frameworks. You will use Hugging Face to solve natural language processing (NLP) problems, leverage LangChain to perform complex, multi-stage tasks, and dive deep into prompt engineering. You will use data embeddings and vector databases to augment LLM pipelines. Additionally, you will fine-tune LLMs with domain-specific data to improve performance and reduce cost, and weigh the benefits and drawbacks of proprietary models. You will assess the societal, safety, and ethical considerations of using LLMs. Finally, you will learn how to deploy your models at scale, leveraging LLMOps best practices.
By the end of this course, you will have built an end-to-end LLM workflow that is ready for production!
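To give a concrete sense of the kind of Hugging Face workflow the course starts from, here is a minimal, illustrative sketch (not taken from the course materials; the summarization task and the model checkpoint are assumptions) of solving an NLP problem with a pretrained pipeline:

    # Minimal sketch: solving an NLP task with a pretrained Hugging Face pipeline.
    # The task and model checkpoint below are illustrative assumptions, not course code.
    from transformers import pipeline

    # Load a pretrained summarization pipeline from the Hugging Face Hub.
    summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

    text = (
        "Large language models can be adapted to many natural language processing "
        "tasks, such as summarization and question answering, without training a "
        "model from scratch."
    )

    # Generate a short summary of the input text.
    print(summarizer(text, max_length=30, min_length=10)[0]["summary_text"])

Later modules extend this kind of single-call pipeline with embeddings and vector search, LangChain-style multi-stage chains, fine-tuning, and LLMOps deployment practices.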
Syllabus
- Module 1 - Applications with LLMs
- Module 2 - Embeddings, Vector Databases, and Search
- Module 3 - Multi-stage Reasoning
- Module 4 - Fine-tuning and Evaluating LLMs
- Module 5 - Society and LLMs: Bias and Safety
- Module 6 - LLMOps
Taught by
Matei Zaharia, Sam Raymond, Chengyin Eng, and Joseph Bradley
Related Courses
- Amazon SageMaker JumpStart Foundations (Japanese), Amazon Web Services via AWS Skill Builder
- AWS Flash - Generative AI with Diffusion Models, Amazon Web Services via AWS Skill Builder
- AWS Flash - Operationalize Generative AI Applications (FMOps/LLMOps), Amazon Web Services via AWS Skill Builder
- AWS SimuLearn: Automate Fine-Tuning of an LLM, Amazon Web Services via AWS Skill Builder
- AWS SimuLearn: Fine-Tune a Base Model with RLHF, Amazon Web Services via AWS Skill Builder