Inference on KubeEdge
Offered By: Linux Foundation via YouTube
Course Description
Overview
Explore edge computing and machine learning inference with KubeEdge in this conference talk. Discover how to use KubeEdge to deploy and manage machine learning models at the edge, enabling efficient, low-latency inference close to where data is produced. Learn about the integration between Seldon, a popular machine learning deployment platform, and KubeEdge, and gain practical insights into implementing inference solutions in edge computing environments. Understand the challenges and opportunities of running machine learning workloads on edge devices, and how KubeEdge addresses these issues to provide a seamless experience for developers and data scientists.
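Seldon serves models over the open V2 inference protocol, so a client on or near the edge node can query a deployed model over plain HTTP. The sketch below builds such a request payload in Python; the model name and feature layout are hypothetical examples, not taken from the talk, and a real deployment would POST this JSON to the model's /v2/models/<name>/infer endpoint.

```python
import json

# Hypothetical model name for illustration; a Seldon model deployed to a
# KubeEdge node would expose an inference endpoint on that node's service.
MODEL_NAME = "iris-classifier"

def build_v2_request(rows):
    """Build a V2 inference protocol payload for a batch of feature rows.

    Each row is a list of float features; the shape is [batch, features].
    """
    return {
        "inputs": [
            {
                "name": "predict",
                "shape": [len(rows), len(rows[0])],
                "datatype": "FP32",
                "data": rows,
            }
        ]
    }

# One sample with four features (e.g. iris measurements).
payload = build_v2_request([[5.1, 3.5, 1.4, 0.2]])
print(json.dumps(payload))
```

Keeping the client this thin matters at the edge: the device only needs an HTTP stack and JSON, while model management stays with KubeEdge and Seldon on the cluster side.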
Syllabus
Inference on (the) KubeEdge - Adrian Gonzalez-Martin, Seldon
Taught by
Linux Foundation
Related Courses
Introduction to AWS Inferentia and Amazon EC2 Inf1 Instances - Pluralsight
Introduction to AWS Inferentia and Amazon EC2 Inf1 Instances (Korean) - Amazon Web Services via AWS Skill Builder
Introduction to Amazon Elastic Inference - Amazon Web Services via AWS Skill Builder
TensorFlow Lite - Solution for Running ML On-Device - TensorFlow via YouTube
Deep Learning Neural Network Acceleration at the Edge - Linux Foundation via YouTube