Scaling AI Applications with Ray - Richard Liaw & Eric Liang | ODSC East 2019
Offered By: Open Data Science via YouTube
Course Description
Overview
Explore scaling AI applications using Ray in this conference talk from ODSC East 2019. Learn about Ray's high-performance distributed execution engine and its libraries for AI workloads from Richard Liaw and Eric Liang of UC Berkeley's RISELab. Discover how Ray's API enables seamless scaling from interactive development to production clusters, covering Tune for hyperparameter optimization and RLlib for reinforcement learning. Gain insights into Ray's architecture, use cases, and performance benefits for developing next-generation AI applications that continuously interact with and learn from their environment.
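To make the scaling claim concrete, the sketch below shows the core Ray API the talk builds on: remote tasks and stateful actors. It is a minimal, illustrative example (the function and class names are placeholders, not taken from the talk), and it assumes a local Ray installation; the same code runs on a cluster by pointing ray.init() at the cluster's head node.

    import ray

    ray.init()  # starts a local Ray instance; on a cluster, connect to the head node instead

    # A remote task: calls return futures immediately and execute in parallel.
    @ray.remote
    def square(x):
        return x * x

    futures = [square.remote(i) for i in range(4)]
    print(ray.get(futures))  # [0, 1, 4, 9]

    # A remote actor: a stateful worker process addressed through a handle.
    @ray.remote
    class Counter:
        def __init__(self):
            self.count = 0

        def increment(self):
            self.count += 1
            return self.count

    counter = Counter.remote()
    print(ray.get(counter.increment.remote()))  # 1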
Syllabus
Preparation
The Big Picture
A Growing Number of Use Cases
Ray API
Ray Architecture
What is Tune?
Why a framework for tuning hyperparameters?
Tune is built with Deep Learning as a priority.
Tune is simple to use.
What is RLlib?
Background: What is reinforcement learning?
Growing number of RL applications
A scalable, unified library for reinforcement learning
Reference Algorithms
Performance
Exercises
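For the "What is Tune?" and "What is RLlib?" sections above, the sketch below shows how the two libraries are typically combined: Tune launching a small hyperparameter sweep over RLlib's PPO implementation. It is a sketch assuming a Ray release from around the time of the talk (roughly Ray 0.7/1.x), where experiments are launched with tune.run; newer releases have since moved to the Tuner and algorithm-config APIs. The environment, learning rates, and stopping criterion are illustrative choices, not values from the talk.

    import ray
    from ray import tune

    ray.init()

    # Grid-search PPO's learning rate on CartPole. Tune schedules the trials
    # across whatever resources the Ray cluster provides, while RLlib's PPO
    # handles the distributed rollouts within each trial.
    tune.run(
        "PPO",
        config={
            "env": "CartPole-v0",
            "num_workers": 2,
            "lr": tune.grid_search([5e-5, 5e-4, 5e-3]),
        },
        stop={"episode_reward_mean": 150},
    )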
Taught by
Open Data Science
Related Courses
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization - DeepLearning.AI via Coursera
How to Win a Data Science Competition: Learn from Top Kagglers - Higher School of Economics via Coursera
Predictive Modeling and Machine Learning with MATLAB - MathWorks via Coursera
Machine Learning Rapid Prototyping with IBM Watson Studio - IBM via Coursera
Hyperparameter Tuning with Neural Network Intelligence - Coursera Project Network via Coursera