Building Edge AI Stack and AI-as-a-Service in Cloud Native Way
Offered By: Linux Foundation via YouTube
Course Description
Overview
Explore the architecture and implementation of edge AI stack and AI-as-a-Service in a cloud-native environment. Delve into the KubeEdge architecture, Akraino KubeEdge Edge Service Blueprint, and ML offloading functional block diagram. Examine use cases for device app ML model inference offloading workflows and address edge AI challenges. Learn about the KubeEdge-AIService architecture and edge-cloud collaborative techniques such as joint inference, incremental learning, and federated learning. Gain insights into developer perspectives on joint inference and federated learning, as well as resource information for building robust edge AI solutions.
Syllabus
Intro
Problem and Challenges
Background
KubeEdge Architecture
Akraino KubeEdge Edge Service Blueprint
KubeEdge ML Offloading Functional Block Diagram
Use Case: Device App ML model inference offloading workflow
Edge AI Challenges
KubeEdge-AIService Architecture
Edge-cloud Collaborative JOINT INFERENCE: Improve inference performance when edge resources are limited
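The idea behind joint inference is that a small on-device model handles most samples, and only "hard" (low-confidence) samples are offloaded to a larger cloud model. A minimal sketch of that decision logic is below; the stub models, the sample format, and the `CONFIDENCE_THRESHOLD` value are illustrative assumptions, not KubeEdge-Sedna's actual API.

```python
CONFIDENCE_THRESHOLD = 0.85  # assumption: tunable per deployment

def edge_model(sample):
    # Stand-in for a small on-device model returning (label, confidence).
    # A real deployment would run a compact/quantized model here.
    return "cat", sample["edge_conf"]

def cloud_model(sample):
    # Stand-in for a large, more accurate cloud-side model.
    return "dog"

def joint_inference(sample):
    """Run the edge model first; offload only low-confidence ("hard")
    samples to the cloud model."""
    label, conf = edge_model(sample)
    if conf >= CONFIDENCE_THRESHOLD:
        return label, "edge"      # served locally, low latency
    return cloud_model(sample), "cloud"  # offloaded hard example
```

With this split, easy samples never leave the device, which is what makes the approach attractive when edge resources are limited.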
Edge-cloud Collaborative INCREMENTAL LEARNING: The more the models are used, the smarter they become
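"The more the models are used, the smarter they become" means the deployed model keeps updating from the stream of new samples it encounters. A generic illustration of such an online update (a perceptron-style step, chosen for brevity and not taken from the talk) looks like this:

```python
def predict(w, x):
    # Linear classifier: sign of the dot product.
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1

def incremental_update(w, x, y, lr=0.1):
    # Adjust weights only when the current model misclassifies the
    # newly arrived sample, so the model improves as it is used.
    if predict(w, x) != y:
        w = [wi + lr * y * xi for wi, xi in zip(w, x)]
    return w

# Simulated stream of new samples arriving at the edge over time.
stream = [([1.0, 1.0], 1), ([-1.0, -1.0], -1), ([2.0, 0.5], 1)]
weights = [0.0, 0.0]
for x, y in stream:
    weights = incremental_update(weights, x, y)
```

In an edge-cloud setup, the heavier retraining would typically run in the cloud, with updated weights pushed back to the edge.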
Edge-cloud Collaborative FEDERATED LEARNING: Raw data is not transmitted out of the edge; the model is generated by aggregating locally trained updates
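The aggregation step at the heart of federated learning can be sketched with federated averaging (FedAvg): each edge node trains on its own private data and uploads only model weights, which the cloud averages into a global model. The node weights below are made-up example values; this is a sketch of the technique, not Sedna's implementation.

```python
def fed_avg(client_weights):
    """Average the model weights reported by each edge node.
    Only weights cross the network; raw data never leaves the edge."""
    n = len(client_weights)
    return [sum(layer) / n for layer in zip(*client_weights)]

# Weights reported by three hypothetical edge nodes after local training.
node_weights = [
    [0.2, 1.0, -0.5],
    [0.4, 0.8, -0.3],
    [0.6, 1.2, -0.1],
]
global_model = fed_avg(node_weights)  # broadcast back to every node
```

Each round, the averaged model is redistributed to the nodes for the next round of local training, which is what preserves data privacy at the edge.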
Developer perspective: JOINT INFERENCE
Developer perspective: FEDERATED LEARNING
Resource Information
Taught by
Linux Foundation
Related Courses
From Ground to Space: Cloud-Native Edge Machine Learning Case Studies with KubeEdge-Sedna (Linux Foundation via YouTube)
RecoverMonitor: Edge-enabled Open-source Wearable Devices for Monitoring Healthcare Remote Patient Recovery (Linux Foundation via YouTube)
Choosing from the Many Flavors of Edge Computing - KubeEdge, OpenYurt, K3S, and K8S (Linux Foundation via YouTube)
Inference on KubeEdge (Linux Foundation via YouTube)
IoT Application Running on KubeEdge and ARM Platform (Linux Foundation via YouTube)