Inference on KubeEdge
Offered By: Linux Foundation via YouTube
Course Description
Overview
Explore edge computing and machine learning inference with KubeEdge in this conference talk. Discover how to leverage KubeEdge's capabilities for deploying and managing machine learning models at the edge, enabling efficient, low-latency inference. Learn about the integration between Seldon, a popular machine learning deployment platform, and KubeEdge, and gain practical guidance on implementing inference solutions in edge computing environments. Understand the challenges and opportunities of running machine learning workloads on edge devices, and how KubeEdge addresses them to provide a seamless experience for developers and data scientists.
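As a rough illustration of the kind of model serving the talk covers, below is a minimal Python model class in the shape Seldon Core's Python wrapper expects (a class exposing a `predict(X, feature_names)` method). The model logic itself is a hypothetical stand-in, not from the talk:

```python
import numpy as np


class EdgeModel:
    """Minimal model class in the shape used by Seldon Core's Python
    wrapper: the wrapper calls predict(X, feature_names) per request.
    The 'model' here is a hypothetical stand-in (simple input scaling)."""

    def __init__(self):
        # A real deployment would load trained weights here, e.g. from
        # a file baked into the container image pushed to the edge node.
        self.scale = 2.0

    def predict(self, X, feature_names=None):
        # The request payload arrives as an array-like; return an
        # array with the same outer shape.
        return np.asarray(X, dtype=float) * self.scale
```

Packaged into a container, a class like this could then be scheduled onto edge nodes by KubeEdge just like any other Kubernetes workload.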
Syllabus
Inference on (the) KubeEdge - Adrian Gonzalez-Martin, Seldon
Taught by
Linux Foundation
Related Courses
From Ground to Space: Cloud-Native Edge Machine Learning Case Studies with KubeEdge-Sedna
Linux Foundation via YouTube
RecoverMonitor: Edge-enabled Open-source Wearable Devices for Monitoring Healthcare Remote Patient Recovery
Linux Foundation via YouTube
Choosing from the Many Flavors of Edge Computing - KubeEdge, OpenYurt, K3S, and K8S
Linux Foundation via YouTube
Building Edge AI Stack and AI-as-a-Service in Cloud Native Way
Linux Foundation via YouTube
IoT Application Running on KubeEdge and ARM Platform
Linux Foundation via YouTube