MLflow Model Serving - Methods and Best Practices
Offered By: Databricks via YouTube
Course Description
Overview
Explore various methods of model serving with MLflow in this comprehensive one-hour video. Gain insights into both open-source MLflow and Databricks-managed MLflow approaches to serving models. Learn the fundamental differences between batch scoring and real-time scoring, with a particular focus on Databricks' upcoming production-ready model serving. Dive into topics such as the prediction overview, vocabulary, online scoring, MLflow Model Server deployment options, container types, and scoring with the JSON split-oriented format. Discover an end-to-end ML pipeline example, deployment plugins, and MLflow Ray resources. Examine Keras/TensorFlow model formats and a run-models example. Finally, explore model serving on Databricks, including production-grade model serving and the Databricks Model Serving launch.
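As a concrete illustration of the batch-versus-real-time distinction covered in the video, the sketch below shows both paths with open-source MLflow. It is a minimal, hypothetical example: the model URI, feature names, and port are placeholders, and the exact shape of the split-oriented JSON payload depends on the MLflow version (releases before 2.0 accept a bare columns/data document with the pandas-split content type, while 2.x wraps it in a dataframe_split key, as shown here).

```python
import mlflow.pyfunc
import pandas as pd
import requests

# Placeholder model URI; point this at a run or Model Registry entry of your own.
MODEL_URI = "models:/my-model/Production"

# Batch scoring: load the model in-process and predict over a DataFrame.
model = mlflow.pyfunc.load_model(MODEL_URI)
batch = pd.DataFrame({"feature_a": [1.0, 2.0], "feature_b": [3.0, 4.0]})
print(model.predict(batch))

# Real-time scoring: assumes a local MLflow Model Server was started with
#   mlflow models serve -m models:/my-model/Production -p 5001
# and uses the JSON split-oriented format (MLflow 2.x style).
payload = {
    "dataframe_split": {
        "columns": list(batch.columns),
        "data": batch.values.tolist(),
    }
}
resp = requests.post(
    "http://127.0.0.1:5001/invocations",
    json=payload,
    headers={"Content-Type": "application/json"},
)
print(resp.json())
```

For the deployment-plugin portion of the talk, the same model URI can typically be handed to a plugin target through the deployments CLI (for example, mlflow deployments create -t <target> -m <model URI> --name <name>), with the target name depending on the plugin installed.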
Syllabus
Intro
Prediction overview
Vocabulary - Synonyms
Online Scoring
MLflow Model Server deployment options
MLflow Model Server container types
Score with JSON split-oriented format
Python container
End-to-end ML Pipeline Example with MLflow
MLflow Deployment Plugins
MLflow Deployment Plugin Examples
MLflow Ray Resources
Keras/TensorFlow Model Formats
MLflow Keras/TensorFlow Run Models Example
Model Serving on Databricks
Databricks Production-grade Model Serving
Databricks Model Serving Launch
Taught by
Databricks
Related Courses
Feature Engineering (Google Cloud via Coursera)
TensorFlow on Google Cloud (Google Cloud via Coursera)
Deep Learning Fundamentals with Keras (IBM via edX)
Intro to TensorFlow, Japanese version (Google Cloud via Coursera)
Feature Engineering, Japanese version (Google Cloud via Coursera)