End to End LLMs with Azure
Offered By: Duke University via Coursera
Course Description
Overview
This comprehensive course equips you with the skills to build and deploy Large Language Model (LLM) applications on Azure. Learn to use Azure OpenAI Service for deploying LLMs, calling inference APIs, and integrating with Python. Explore architectural patterns such as Retrieval-Augmented Generation (RAG) and Azure services such as Azure Search for building robust applications. Gain insight into streamlining deployments with GitHub Actions. Apply your knowledge by implementing RAG with Azure Search, creating GitHub Actions workflows, and deploying end-to-end LLM applications. Develop a deep understanding of Azure's ecosystem for LLM solutions, from model deployment to architectural patterns and deployment pipelines.
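As a concrete illustration of the Python integration the course covers, here is a minimal sketch of calling a model deployed in Azure OpenAI Service through its inference API. The endpoint, key, API version, and deployment name are placeholder assumptions, not values from the course.

```python
# Minimal sketch: calling a model deployed in Azure OpenAI Service from Python.
# The endpoint, API key, API version, and deployment name below are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com/
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-gpt-4o-deployment",  # name of your Azure deployment, not the base model
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what Retrieval-Augmented Generation is."},
    ],
)
print(response.choices[0].message.content)
```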
Syllabus
- Get Started with LLMs in Azure
- This week, you will explore architectural patterns and deployment of large language model applications. By studying RAG, Azure services, and GitHub Actions, you will learn how to build robust applications. You will apply what you learn by implementing RAG with Azure Search, creating GitHub Actions workflows, and deploying an end-to-end application (see the sketch below).
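A minimal sketch of the RAG pattern with Azure Search, assuming an existing search index and an Azure OpenAI chat deployment. The endpoints, keys, the "docs" index, its "content" field, and the deployment name are illustrative placeholders, not specifics from the course.

```python
# Minimal RAG sketch: retrieve documents from Azure Search, then ground the LLM answer on them.
# All endpoint names, keys, index/field names, and the deployment name are placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="docs",
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def answer(question: str) -> str:
    # 1. Retrieve: fetch the top matching documents for the question.
    hits = search_client.search(search_text=question, top=3)
    context = "\n\n".join(doc["content"] for doc in hits)
    # 2. Generate: ask the deployed model to answer using only the retrieved context.
    response = llm.chat.completions.create(
        model="my-gpt-4o-deployment",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How do I deploy an LLM application to Azure?"))
```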
Taught by
Noah Gift and Alfredo Deza
Related Courses
- Docker Mastery: with Kubernetes +Swarm from a Docker Captain (Udemy)
- Deploy Infra in the Cloud using Terraform (Udemy)
- Integrating Appium into a DevOps Pipeline (Pluralsight)
- Microsoft DevOps Solutions: Designing a Sensitive Information Strategy (Pluralsight)
- Testing and Deploying GatsbyJS Applications: Playbook (Pluralsight)