YoVDO

Generative AI and LLMs on AWS

Offered By: Pragmatic AI Labs via edX

Tags

Amazon Web Services (AWS) Courses · Compliance Courses · Generative AI Courses · Differential Privacy Courses · Cost Optimization Courses · CI/CD Pipelines Courses · Auto-scaling Courses

Course Description

Overview


Master deploying generative AI models like GPT on AWS through hands-on labs. Learn architecture selection, cost optimization, monitoring, CI/CD pipelines, and compliance best practices. Gain skills in operationalizing LLMs using Amazon Bedrock, auto-scaling, spot instances, and differential privacy techniques. Ideal for ML engineers, data scientists, and technical leaders.

Course Highlights:

  • Choose optimal LLM architectures for your applications
  • Optimize cost, performance, and scalability with auto-scaling and orchestration
  • Monitor LLM metrics and continuously improve model quality
  • Build secure CI/CD pipelines to train, deploy and update LLMs
  • Ensure regulatory compliance via differential privacy and controlled rollouts
  • Real-world, hands-on training for production-ready generative AI

Unlock the power of large language models on AWS. Master operationalization using cloud-native services through this comprehensive, practical training program.


Syllabus

Week 1: Getting Started with Developing on AWS for AI


  • Introduction to AWS Cloud Computing for AI, including the AWS Cloud Adoption Framework

  • Setting up AI-focused development environments using AWS services like Cloud9, SageMaker, and Lightsail

  • Developing serverless solutions for data, ML, and AI using Amazon Bedrock and Rust


Week 2: AI Pair Programming from CodeWhisperer to Prompt Engineering


  • Learning prompt engineering techniques to guide large language models

  • Using Amazon CodeWhisperer as an AI pair programming assistant

  • Leveraging CodeWhisperer CLI to automate tasks and build efficient Bash scripts

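The prompt engineering techniques in Week 2 can be illustrated with a minimal sketch. This is not course material, just a hypothetical example of one common pattern (few-shot prompting), where labeled examples are placed before the new input to steer the model toward a fixed output format:

```python
# Hypothetical sketch of a few-shot prompt template; the task (sentiment
# classification) and labels are illustrative, not from the course.
def few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: an instruction, labeled examples,
    then the new query left open for the model to complete."""
    lines = ["Classify the sentiment of each review as positive or negative."]
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}")
    # Leave the final label blank so the model fills it in.
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)
```

Fed to any instruction-following model, a prompt built this way tends to yield a single-word answer in the demonstrated format rather than free-form prose.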

Week 3: Amazon Bedrock


  • Key capabilities and components of Amazon Bedrock

  • Accessing and invoking Bedrock foundation models using AWS CLI, Boto3 Python SDK, and Rust SDK

  • Prompt engineering and model evaluation to optimize Bedrock model performance

  • Customizing models with fine-tuning and knowledge bases

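Invoking a Bedrock foundation model from the Boto3 Python SDK, as covered in Week 3, looks roughly like the sketch below. The model ID, region, and prompt are placeholder assumptions, and the request body follows the Anthropic Messages format used by Claude models on Bedrock:

```python
# Minimal sketch of calling a Bedrock foundation model via Boto3's
# "bedrock-runtime" client. Model ID and region are assumptions.
import json

def build_request(prompt: str, max_tokens: int = 256) -> str:
    """Build an Anthropic Messages API request body for Bedrock."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

def invoke(prompt: str,
           model_id: str = "anthropic.claude-3-haiku-20240307-v1:0") -> str:
    import boto3  # requires AWS credentials and Bedrock model access
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    resp = client.invoke_model(modelId=model_id, body=build_request(prompt))
    payload = json.loads(resp["body"].read())
    return payload["content"][0]["text"]

if __name__ == "__main__":
    print(invoke("Summarize AWS auto-scaling in one sentence."))
```

The same `invoke_model` call works for other Bedrock model families; only the JSON body schema changes per provider, which is why the request builder is kept separate from the client call.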

Week 4: Project Challenges


  • Applying course concepts to build an end-to-end AI workflow

  • Developing Rust functions for Bedrock agents and integrating into an orchestration flow

  • Debugging, benchmarking, and prompt engineering to optimize a deployed AI application on AWS


By the end of this course, you will have gained hands-on experience with cutting-edge AWS AI/ML services such as Amazon Bedrock and CodeWhisperer, along with Rust for building performant tooling. You'll be able to build and deploy efficient, serverless AI applications in production.


Taught by

Noah Gift and Alfredo Deza

Related Courses

Communicating Data Science Results
University of Washington via Coursera
Cloud Computing Applications, Part 2: Big Data and Applications in the Cloud
University of Illinois at Urbana-Champaign via Coursera
Cloud Computing Infrastructure
University System of Maryland via edX
Google Cloud Platform for AWS Professionals
Google via Coursera
Introduction to Apache Spark and AWS
University of London International Programmes via Coursera