Building Edge AI Stack and AI-as-a-Service in Cloud Native Way
Offered By: Linux Foundation via YouTube
Course Description
Overview
Explore the architecture and implementation of an edge AI stack and AI-as-a-Service in a cloud-native environment. Delve into the KubeEdge architecture, the Akraino KubeEdge Edge Service Blueprint, and the ML offloading functional block diagram. Examine use cases for device-app ML model inference offloading workflows and address edge AI challenges. Learn about the KubeEdge-AIService architecture and edge-cloud collaborative techniques such as joint inference, incremental learning, and federated learning. Gain insights into developer perspectives on joint inference and federated learning, as well as resource information for building robust edge AI solutions.
Syllabus
Intro
Problem and Challenges
Background
KubeEdge Architecture
Akraino KubeEdge Edge Service Blueprint
KubeEdge ML Offloading Functional Block Diagram
Use Case: Device App ML model inference offloading workflow
Edge AI Challenges
KubeEdge-AIService Architecture
Edge-cloud Collaborative JOINT INFERENCE: Improve the inference performance when edge resources are limited
Edge-cloud Collaborative INCREMENTAL LEARNING: The more the models are used, the smarter they become
Edge-cloud Collaborative FEDERATED LEARNING: Raw data is not transmitted out of the edge, and the global model is generated by aggregating locally trained updates
Developer perspective: JOINT INFERENCE
Developer perspective: FEDERATED LEARNING
Resource Information
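The joint inference item in the syllabus describes serving predictions from a small edge model and offloading to a larger cloud model only when the edge result is not confident enough. A minimal sketch of that idea follows; the function names (`run_edge_model`, `run_cloud_model`) and the fixed return values are illustrative stand-ins, not the KubeEdge or Sedna API.

```python
# Hedged sketch of edge-cloud joint inference via confidence thresholding.
# run_edge_model / run_cloud_model are hypothetical stand-ins for real models.

def run_edge_model(sample):
    # Stand-in for a small on-device model: returns (label, confidence).
    return ("cat", 0.62)

def run_cloud_model(sample):
    # Stand-in for a large cloud-side model: returns (label, confidence).
    return ("dog", 0.97)

def joint_inference(sample, threshold=0.8):
    """Serve from the edge when confident enough; otherwise offload to the cloud."""
    label, confidence = run_edge_model(sample)
    if confidence >= threshold:
        return label, "edge"
    cloud_label, _ = run_cloud_model(sample)
    return cloud_label, "cloud"
```

Lowering the threshold keeps more traffic on the edge (saving bandwidth and latency) at the cost of accuracy, which is the trade-off the talk frames as "improving inference performance when edge resources are limited."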
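The federated learning item notes that raw data never leaves the edge; only locally trained model weights are sent up for aggregation. A minimal sketch of the aggregation step, assuming plain unweighted federated averaging (FedAvg) over weight vectors represented as Python lists:

```python
# Hedged sketch of federated averaging: each edge node trains on its own
# raw data locally, and only the resulting weight vectors are aggregated
# in the cloud. This is unweighted FedAvg, not the KubeEdge/Sedna API.

def federated_average(local_weights):
    """Element-wise average of per-node weight vectors into a global model."""
    n = len(local_weights)
    dim = len(local_weights[0])
    return [sum(w[i] for w in local_weights) / n for i in range(dim)]
```

For example, averaging two nodes' weights `[1.0, 2.0]` and `[3.0, 4.0]` yields the global vector `[2.0, 3.0]`; production FedAvg would typically weight each node's contribution by its local sample count.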
Taught by
Linux Foundation
Tags
Related Courses
Secure and Private AI - Facebook via Udacity
Advanced Deployment Scenarios with TensorFlow - DeepLearning.AI via Coursera
Big Data for Reliability and Security - Purdue University via edX
MLOps for Scaling TinyML - Harvard University via edX
Edge Analytics: IoT and Data Science - LinkedIn Learning