Automated Machine Learning Performance Evaluation

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Conference Talks Courses, Kubernetes Courses, Argo Courses, Benchmarking Courses

Course Description

Overview

Explore automated machine learning performance evaluation in this 26-minute conference talk from KubeCon + CloudNativeCon North America 2021. Dive into the intricacies of benchmarking deployed production machine learning models in cloud native infrastructure. Learn about the theory behind ML model benchmarking, including key parameters like latency, throughput, and performance percentiles. Follow a hands-on example using Argo, Kubernetes, and Seldon Core to benchmark a model across multiple parameters for optimal hardware performance. Gain insights into workflow management, reusability, and best practices for evaluating ML models in various deployment scenarios.
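
The measurement idea outlined above, recording latency percentiles and throughput against a deployed model, can be sketched in a few lines of Python. The endpoint URL, payload shape, and request count below are illustrative placeholders rather than the talk's actual setup; the payload only follows the general shape of a Seldon Core v1 REST prediction request.

import time
import numpy as np
import requests

# Hypothetical Seldon Core prediction endpoint and input payload; substitute
# the URL and tensor shape of the model actually being benchmarked.
ENDPOINT = "http://localhost:8080/api/v1.0/predictions"
PAYLOAD = {"data": {"ndarray": [[0.1, 0.2, 0.3, 0.4]]}}
N_REQUESTS = 200

latencies = []
start = time.perf_counter()
for _ in range(N_REQUESTS):
    t0 = time.perf_counter()
    resp = requests.post(ENDPOINT, json=PAYLOAD, timeout=10)
    resp.raise_for_status()
    latencies.append(time.perf_counter() - t0)
elapsed = time.perf_counter() - start

# Report throughput plus the latency percentiles discussed in the talk.
lat_ms = np.array(latencies) * 1000.0
print(f"throughput: {N_REQUESTS / elapsed:.1f} req/s")
for p in (50, 90, 95, 99):
    print(f"p{p} latency: {np.percentile(lat_ms, p):.1f} ms")

A real benchmark run would use a dedicated load generator to drive concurrent traffic rather than this sequential loop; the sketch only illustrates which metrics are collected.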

Syllabus

Introduction
CIFAR-10 classifier
What is Seldon Core
Deploying a model
Extra complexity
Best practices
Benchmark types
Benchmark tools
Automating the evaluation (sketched in Python after this syllabus)
Workflow managers
Workflows
Argo Workflow
Reusability
Output
Resources
Wrap up
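
The chapters on automating the evaluation and on Argo Workflows cover fanning a benchmark out over many deployment configurations. A plain-Python reduction of that idea is sketched below; the configuration axes and the run_benchmark stub are hypothetical placeholders, standing in for whatever deployment parameters and load test the actual workflow varies.

import itertools
import json

# Hypothetical configuration axes to sweep; an Argo Workflow would typically
# fan these out as parallel steps rather than running a sequential loop.
REPLICAS = [1, 2, 4]
CPU_LIMITS = ["500m", "1", "2"]
BATCH_SIZES = [1, 8, 32]

def run_benchmark(replicas: int, cpu: str, batch_size: int) -> dict:
    """Stub for one benchmark run: in a real workflow this step would deploy
    the model with the given resources, drive load against it, and return
    the measured latency percentiles and throughput."""
    return {"p50_latency_ms": None, "p99_latency_ms": None, "throughput_rps": None}

results = []
for replicas, cpu, batch_size in itertools.product(REPLICAS, CPU_LIMITS, BATCH_SIZES):
    metrics = run_benchmark(replicas, cpu, batch_size)
    results.append(
        {"replicas": replicas, "cpu": cpu, "batch_size": batch_size, **metrics}
    )

# Persisting the whole grid makes it easy to pick the configuration with the
# best latency/throughput trade-off for the target hardware.
with open("benchmark_results.json", "w") as f:
    json.dump(results, f, indent=2)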


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Advanced R Programming
Johns Hopkins University via Coursera
Analyse comparative des volumes Amazon EBS (Français) | Benchmarking Amazon EBS Volumes (French)
Amazon Web Services via AWS Skill Builder
Analyzing the Internal/External Competitive Profile Matrix
Coursera Project Network via Coursera
Assessing Cultural Climate
Rice University via Coursera
AWS Foundations: Cost Management (Simplified Chinese)
Amazon Web Services via AWS Skill Builder