Is It Too Much to Ask for a Stable Baseline? - Evaluation and Monitoring in Machine Learning Systems
Offered By: MLOps World: Machine Learning in Production via YouTube
Course Description
Overview
Explore the challenges of establishing stable baselines in machine learning systems in this 41-minute conference talk from MLOps World: Machine Learning in Production. Join D. Sculley, CEO of Kaggle, as he delves into the critical role of evaluation and monitoring in reliable ML systems. Examine the difficulties of finding stable reference points, reliable comparison baselines, and effective performance metrics in an environment characterized by changing conditions, feedback loops, and shifting distributions. Investigate how these challenges manifest in traditional settings like click-through prediction and consider their potential impact on emerging fields such as productionized LLMs and generative models.
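As a rough illustration of the kind of problem the talk addresses (this sketch is not taken from the talk itself), the snippet below simulates a fixed click-through-rate predictor evaluated week over week: even though the model never changes, its measured log-loss drifts as the incoming traffic's true click rate shifts, so any single historical number makes an unstable baseline. All values here are made up for demonstration.

```python
# Hypothetical illustration: a fixed model's metric drifts under distribution shift,
# so yesterday's evaluation number is a shaky reference baseline.
import numpy as np

rng = np.random.default_rng(0)

def log_loss(y_true, p):
    """Mean binary cross-entropy, clipped for numerical stability."""
    p = np.clip(p, 1e-7, 1 - 1e-7)
    return float(-np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))

# "Model": always predicts the historical base click-through rate (assumed value).
BASELINE_CTR = 0.05

# Simulated weekly traffic whose true click rate drifts over time.
for week, true_ctr in enumerate([0.05, 0.05, 0.04, 0.06, 0.08], start=1):
    clicks = rng.binomial(1, true_ctr, size=10_000)
    preds = np.full(clicks.shape, BASELINE_CTR)
    print(f"week {week}: true CTR={true_ctr:.2f}  baseline log-loss={log_loss(clicks, preds):.4f}")
```

Running this shows the metric moving purely because the data distribution moved, which is the core difficulty in choosing a stable comparison point for monitoring.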
Syllabus
Is it too much to ask for a stable baseline?
Taught by
MLOps World: Machine Learning in Production
Related Courses
Pensamiento sistémico - Universidad Nacional Autónoma de México via Coursera
Coaching Conversations - University of California, Davis via Coursera
Application Monitoring and Feedback Loops - Microsoft via edX
Introduction to System Dynamics Modeling - Indian Institute of Technology Bombay via Swayam
FoundX Startup School Course - University of Tokyo via Coursera