
Extinguishing the Garbage Fire of ML Testing - Improving Reliability and Quality in Machine Learning

Offered By: Data Council via YouTube

Tags

MLOps Courses, Data Science Courses, Quality Assurance Courses, pytest Courses, Software Engineering Courses, Continuous Integration Courses, Observability Courses

Course Description

Overview

Explore innovative approaches to machine learning testing in this 22-minute conference talk from Data Council. Discover how to extinguish the "garbage fire" of traditional ML testing by abstracting, decoupling, and separating concerns; limiting pytest usage; leveraging observability; and applying data reliability practices. Gain insights into honoring data scientists' mental models and working styles to make ML testing more efficient. Delve into topics such as testing probabilistic code, pursuing reliability for business value, implementing pre-prod environments, and utilizing ML observability. Consider the controversial idea of data scientists participating in on-call rotations. Learn from Emily Curtin, a Staff MLOps Engineer at Intuit Mailchimp, as she shares strategies for helping data scientists produce higher-quality work more quickly and intuitively.
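The talk stays at the level of strategy, but as a minimal illustrative sketch (not taken from the talk) of one common way to test probabilistic code with pytest: pin the random seed for reproducibility, and assert on statistical properties within tolerances rather than exact values. The function noisy_estimate and the tolerances below are hypothetical stand-ins for a real model component.

    import random
    import statistics

    import pytest

    def noisy_estimate(n: int, rng: random.Random) -> float:
        # Hypothetical model stand-in: the mean of n draws from N(0, 1).
        return statistics.fmean(rng.gauss(0.0, 1.0) for _ in range(n))

    def test_reproducible_with_fixed_seed():
        # Pinning the seed turns a probabilistic function into a deterministic test.
        a = noisy_estimate(1_000, random.Random(42))
        b = noisy_estimate(1_000, random.Random(42))
        assert a == b

    def test_converges_within_tolerance():
        # Assert a statistical property (mean near 0) instead of an exact value;
        # the 0.05 tolerance is an assumption chosen for illustration.
        estimate = noisy_estimate(100_000, random.Random(7))
        assert estimate == pytest.approx(0.0, abs=0.05)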

Syllabus

Intro
ML Testing is a garbage fire
Testing Probabilistic Code
Why Test?
Pursuing Reliability for Business Value
Pre-prod Environments
ML Observability
Production Readiness Score
Hot Take: Data Scientists should have an on-call rotation
Better ML Reliability through...


Taught by

Data Council

Related Courses

Machine Learning Operations (MLOps): Getting Started
Google Cloud via Coursera
Design and Implementation of Machine Learning Systems
Higher School of Economics via Coursera
Demystifying Machine Learning Operations (MLOps)
Pluralsight
Machine Learning Engineer with Microsoft Azure
Microsoft via Udacity
Machine Learning Engineering for Production (MLOps)
DeepLearning.AI via Coursera