Lab - Monitor a Model for Data Drift
Offered By: Amazon Web Services via AWS Skill Builder
Course Description
Overview
In this lab, you create a production endpoint with data capture enabled, generate baseline statistics and constraints by running a SageMaker Model Monitor baseline processing job, set up CloudWatch alarms for SageMaker Model Monitor, view model performance, and launch an automated workflow to retrain the model.
Objectives:
- Activate data capture for a real-time endpoint.
- Generate baseline statistics and constraints.
- Create a SageMaker Model Monitor baseline processing job.
- Set up alerts and notifications for data drift.
- Start an automatic model retraining workflow.
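To make the objectives concrete, the core idea behind data-drift detection is comparing live feature statistics against a recorded baseline. The sketch below is purely illustrative (it is not the lab's code, and the feature names, statistics, and threshold are hypothetical); SageMaker Model Monitor performs a more sophisticated version of this comparison inside its scheduled monitoring jobs.

```python
# Illustrative only: a minimal drift check of the kind Model Monitor
# automates. Feature names, values, and the 3-sigma threshold are
# hypothetical placeholders.

def detect_drift(baseline_stats, live_stats, threshold=3.0):
    """Flag features whose live mean deviates from the baseline mean
    by more than `threshold` baseline standard deviations."""
    violations = []
    for feature, base in baseline_stats.items():
        live_mean = live_stats[feature]["mean"]
        # Z-score of the live mean relative to the baseline distribution
        z = abs(live_mean - base["mean"]) / base["std"]
        if z > threshold:
            violations.append(feature)
    return violations

baseline = {
    "age":    {"mean": 40.0, "std": 10.0},
    "income": {"mean": 55000.0, "std": 8000.0},
}
live = {
    "age":    {"mean": 41.5},       # within 3 std devs: no drift
    "income": {"mean": 90000.0},    # more than 3 std devs away: drift
}
print(detect_drift(baseline, live))  # → ['income']
```

In the lab, the equivalent of `violations` appears in the monitoring job's constraint-violations report, which is what the CloudWatch alarms and the retraining workflow react to.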
Prerequisites:
- Basic navigation of the AWS Management Console
- Basic familiarity with Machine Learning concepts
Outline:
- Task 1: Set up the environment
- Task 2: Model monitoring
- Task 3: Review the automatic model retraining workflow
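For orientation, the first two tasks map roughly onto the SageMaker Python SDK as sketched below. This is a hedged, untested sketch, not the lab's actual code: it assumes a live AWS account, an already-built `model` object, an execution role, and S3 paths, all of which appear here as placeholders.

```python
# Sketch only: requires a live AWS account and a deployed SageMaker model.
# All bucket names, role ARNs, and the `model` variable are placeholders.
from sagemaker.model_monitor import (
    DataCaptureConfig, DefaultModelMonitor, CronExpressionGenerator
)
from sagemaker.model_monitor.dataset_format import DatasetFormat

# Task 1 (sketch): deploy a real-time endpoint with data capture enabled.
capture_config = DataCaptureConfig(
    enable_capture=True,
    sampling_percentage=100,
    destination_s3_uri="s3://my-bucket/data-capture",     # placeholder
)
predictor = model.deploy(                                 # `model` assumed to exist
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
    data_capture_config=capture_config,
)

# Task 2 (sketch): run a baseline processing job, then schedule hourly monitoring.
monitor = DefaultModelMonitor(
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
)
monitor.suggest_baseline(
    baseline_dataset="s3://my-bucket/train/train.csv",    # placeholder
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://my-bucket/baseline",
)
monitor.create_monitoring_schedule(
    monitor_schedule_name="drift-monitor",
    endpoint_input=predictor.endpoint_name,
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    output_s3_uri="s3://my-bucket/monitoring-reports",
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)
```

The monitoring job writes statistics and constraint-violation reports to S3; CloudWatch alarms on those metrics are what trigger the retraining workflow reviewed in Task 3. (No test is included because the sketch only runs against live AWS resources.)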
Related Courses
- How to Detect Silent Failures in ML Models - Data Science Dojo via YouTube
- Dataset Management for Computer Vision - Important Component to Delivering Computer Vision Solutions - Open Data Science via YouTube
- Testing ML Models in Production - Detecting Data and Concept Drift - Databricks via YouTube
- Ekya - Continuous Learning of Video Analytics Models on Edge Compute Servers - USENIX via YouTube
- Building and Maintaining High-Performance AI - Data Science Dojo via YouTube