Scaling Up AI Research to Production with PyTorch and MLflow
Offered By: Databricks via YouTube
Course Description
Overview
Explore the latest advancements in PyTorch and MLflow for scaling AI research to production in this 44-minute video presentation by Databricks. Dive into key developments, including model-parallel distributed training, model optimization, and on-device deployment, and learn about the newest libraries that support production-scale deployment in conjunction with MLflow. Discover how PyTorch's evolution since version 1.0 has accelerated the workflow from research to production. Gain insights into simplicity over complexity, community involvement, Papers with Code, challenges in AI development, and code walkthroughs. Understand how model size drives compute needs, and explore techniques such as pruning and quantization. Examine strategies for training models at scale, deploying on heterogeneous hardware, and managing large models with remote procedure calls. Review the API overview, deployment at scale using TorchServe and MLflow, and the latest PyTorch features and domain-specific libraries. Find resources for further education, including books and channels, to strengthen your AI research and production skills.
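As a taste of the techniques covered in the talk, the sketch below (not taken from the presentation itself) shows how pruning and dynamic quantization might be applied to a small PyTorch model before logging it to MLflow for versioning and deployment; the toy model, pruning amount, and run name are illustrative assumptions.

```python
# Minimal sketch, assuming a toy model: pruning, dynamic quantization,
# and MLflow logging. Names and values are illustrative, not from the talk.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
import mlflow
import mlflow.pytorch

# Toy model standing in for a real research model.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Prune 30% of the smallest-magnitude weights in the first linear layer.
prune.l1_unstructured(model[0], name="weight", amount=0.3)
prune.remove(model[0], "weight")  # make the pruning permanent

# Dynamic quantization: store nn.Linear weights as int8 to shrink the model.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Log the compressed model to MLflow so it can be tracked and deployed later.
with mlflow.start_run(run_name="pytorch-quantized-demo"):
    mlflow.log_param("pruning_amount", 0.3)
    mlflow.pytorch.log_model(quantized, artifact_path="model")
```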
Syllabus
Introduction
Agenda
Simplicity over Complexity
Community
Papers with Code
Facebook
Challenges
Dev Acts
Code Walkthrough
PyTorch Libraries
Model Size and Compute Needs
Pruning
Quantization
Quantization API
Quantization Results
Training Models at Scale
Deploy Heterogeneous Hardware
Ad Hoc Jobs
PyTorch Elastic
Large Models
Remote Procedure Call
API Overview
Deployment at Scale
TorchServe
MLflow
PyTorch Update
Domain Libraries
Getting Educated
Books
Channels
Taught by
Databricks
Related Courses
Bayes Classifier on Dataproc (Google via Google Cloud Skills Boost)
Llama for Python Programmers (University of Michigan via Coursera)
Quantization Fundamentals with Hugging Face (DeepLearning.AI via Coursera)
Quantization in Depth (DeepLearning.AI via Coursera)
Working with Llama 3 (DataCamp)