Inference on KubeEdge
Offered By: Linux Foundation via YouTube
Course Description
Overview
Explore edge computing and machine learning inference with KubeEdge in this conference talk. Discover how to use KubeEdge to deploy and manage machine learning models at the edge for efficient, low-latency inference. Learn how Seldon, an open-source machine learning deployment platform, integrates with KubeEdge, and pick up practical guidance for implementing inference solutions in edge computing environments. Understand the challenges and opportunities of running machine learning workloads on edge devices, and how KubeEdge addresses them to give developers and data scientists a seamless experience.
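To make the deployment scenario concrete, the sketch below shows what an inference request against a Seldon Core v1 model served on a KubeEdge-managed node might look like from a client. This is an illustration only, not code from the talk: the deployment name iris-model, the seldon namespace, and the edge-gateway.local:8080 address are all assumptions.

import requests

# Hypothetical endpoint for a SeldonDeployment named "iris-model" in the
# "seldon" namespace, reached through an assumed edge gateway address.
URL = "http://edge-gateway.local:8080/seldon/seldon/iris-model/api/v1.0/predictions"

# Seldon Core v1 protocol payload: an ndarray of feature rows.
payload = {"data": {"ndarray": [[5.1, 3.5, 1.4, 0.2]]}}

response = requests.post(URL, json=payload, timeout=5)
response.raise_for_status()

# The response mirrors the request structure, with predictions under "data".
print(response.json())

Because the model runs on the edge node itself, a request like this stays close to where the data is produced, which is the low-latency benefit the talk highlights.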
Syllabus
Inference on (the) KubeEdge - Adrian Gonzalez-Martin, Seldon
Taught by
Linux Foundation
Related Courses
Developing a Tabular Data Model (Microsoft via edX)
Data Science in Action - Building a Predictive Churn Model (SAP Learning)
Serverless Machine Learning with Tensorflow on Google Cloud Platform 日本語版 (Google Cloud via Coursera)
Intro to TensorFlow em Português Brasileiro (Google Cloud via Coursera)
Serverless Machine Learning con TensorFlow en GCP (Google Cloud via Coursera)