Large Language Models with Azure
Offered By: Pragmatic AI Labs via edX
Course Description
Overview
Master Large Language Model Operations on Azure
- Unlock Azure's full potential for deploying & optimizing Large Language Models (LLMs)
- Build robust LLM applications leveraging Azure Machine Learning & OpenAI Service
- Implement architectural patterns & GitHub Actions workflows for streamlined MLOps
Course Highlights:
- Explore Azure AI services and LLM capabilities
- Mitigate risks with foundational strategies
- Leverage Azure ML for model deployment & management
- Optimize GPU quotas for performance & cost-efficiency
- Craft advanced queries for enriched LLM interactions
- Implement Semantic Kernel for enhanced query results
- Dive into architectural patterns like retrieval-augmented generation (RAG) for scalable applications
- Build end-to-end LLM apps using Azure services & GitHub Actions
Ideal for data professionals, AI enthusiasts & Azure users looking to harness cutting-edge language AI capabilities. Gain practical MLOps skills through tailored modules & hands-on projects.
Syllabus
Week 1: Introduction to LLMOps with Azure
- Discover pre-trained LLMs in Azure and deploy basic LLM endpoints
- Identify strategies for mitigating risks when using LLMs
- Explain how large language models work and their potential benefits and risks
- Describe the core Azure services and tools for working with AI solutions, such as Azure ML and the Azure OpenAI Service
Week 2: LLMs with Azure
- Use Azure Machine Learning, including GPU quota management, compute resource creation, model deployment, and utilization of the inference API
- Use the Azure OpenAI Service and its playground by deploying models and creating required resources
- Apply keys, endpoints, and Python code examples to integrate the Azure OpenAI APIs, monitor usage, and clean up resources properly
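The Week 2 topics above — keys, endpoints, and calling a deployed model from Python — can be sketched with a minimal REST client. This is an illustrative sketch, not course material: the endpoint and key environment variables, the deployment name, and the `api-version` value are placeholders you would replace with your own Azure OpenAI resources.

```python
import json
import urllib.request


def build_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Construct the Azure OpenAI chat-completions REST URL for a deployment."""
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )


def chat(endpoint: str, deployment: str, api_key: str, messages: list,
         api_version: str = "2024-02-01") -> str:
    """Send a chat request to the deployed model and return the reply text.

    Azure OpenAI authenticates with an ``api-key`` header rather than a
    Bearer token, and routes requests by *deployment* name, not model name.
    """
    req = urllib.request.Request(
        build_chat_url(endpoint, deployment, api_version),
        data=json.dumps({"messages": messages}).encode("utf-8"),
        headers={"api-key": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Example call (requires a real endpoint, key, and deployment; names here
# are hypothetical):
# reply = chat("https://my-resource.openai.azure.com", "my-gpt-deployment",
#              api_key="...",
#              messages=[{"role": "user", "content": "What is LLMOps?"}])
```

The official `openai` Python SDK wraps the same REST surface; the sketch uses only the standard library to make the key/endpoint/deployment mechanics visible.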
Week 3: Extending with Functions and Plugins
- Use Semantic Kernel to create advanced, context-aware prompts for large language models
- Define custom functions to extend system capabilities
- Build a microservice for reusable functions to streamline system extensions
- Implement functions using external APIs and microservices to customize model behavior
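The Week 3 pattern — registering named custom functions that the system can invoke to extend model behavior — can be illustrated with a toy registry. This is not the real Semantic Kernel API; it is a minimal sketch of the plugin/function idea the course covers, and `get_weather` is a hypothetical stub standing in for an external API or microservice.

```python
from typing import Callable, Dict


class ToyKernel:
    """Toy sketch of the plugin pattern: native functions are registered
    under a plugin namespace and invoked by qualified name, which is how
    a kernel lets a model call out to external capabilities."""

    def __init__(self) -> None:
        self._functions: Dict[str, Callable[..., str]] = {}

    def register(self, plugin: str, name: str, fn: Callable[..., str]) -> None:
        # Store the function under "plugin.name", grouping related
        # functions into a plugin.
        self._functions[f"{plugin}.{name}"] = fn

    def invoke(self, qualified_name: str, **kwargs) -> str:
        # Dispatch by qualified name with keyword arguments.
        return self._functions[qualified_name](**kwargs)


def get_weather(city: str) -> str:
    # Hypothetical custom function; a real one would call an external
    # API or microservice.
    return f"Sunny in {city}"


kernel = ToyKernel()
kernel.register("weather", "current", get_weather)
print(kernel.invoke("weather.current", city="Seattle"))  # Sunny in Seattle
```

In Semantic Kernel itself, plugins play this role: the kernel resolves a function by name and supplies arguments, so new capabilities can be added without changing the calling code.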
Week 4: Building an End-to-End LLM application in Azure
- Understand architectural patterns like RAG for building LLM applications
- Use Azure AI Search to create search indexes and embeddings to power RAG
- Build GitHub Actions workflows to automate testing and deployment of LLM apps
- Deploy an end-to-end LLM application leveraging RAG, Azure, and GitHub Actions
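The RAG pattern from Week 4 — embed documents, retrieve the most relevant ones for a query, and ground the prompt in that context — can be sketched without any Azure services. This toy version uses bag-of-words vectors and cosine similarity purely for illustration; a real application would use an embedding model and Azure AI Search vector indexes instead.

```python
import math
import re
from collections import Counter
from typing import Dict, List


def embed(text: str) -> Dict[str, int]:
    """Toy bag-of-words 'embedding'; stands in for a real embedding model."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))


def cosine(a: Dict[str, int], b: Dict[str, int]) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, docs: List[str], k: int = 2) -> List[str]:
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]


def build_prompt(query: str, docs: List[str]) -> str:
    """Ground the model's answer in the retrieved context (the 'A' in RAG)."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


docs = [
    "Azure AI Search builds search indexes over your documents.",
    "GitHub Actions automates testing and deployment workflows.",
    "RAG retrieves relevant documents to ground model answers.",
]
print(build_prompt("How does RAG ground answers?", docs))
```

The resulting prompt would then be sent to a deployed model (as in the Week 2 example); in the course's stack, Azure AI Search supplies the retrieval step and GitHub Actions automates testing and deployment of the whole pipeline.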
Taught by
Noah Gift and Alfredo Deza
Related Courses
- DP-100 Part 3 - Deployment and Working with SDK (A Cloud Guru)
- AI in Healthcare Capstone (Stanford University via Coursera)
- Amazon SageMaker: Build an Object Detection Model Using Images Labeled with Ground Truth (Simplified Chinese) (Amazon Web Services via AWS Skill Builder)
- Amazon SageMaker: Build an Object Detection Model Using Images Labeled with Ground Truth (French) (Amazon Web Services via AWS Skill Builder)
- Getting Started with Generative AI with Amazon SageMaker JumpStart (Japanese only) (Amazon Web Services via AWS Skill Builder)