TensorFlow Lite - Solution for Running ML On-Device
Offered By: TensorFlow via YouTube
Course Description
Overview
Explore TensorFlow Lite, Google's lightweight cross-platform solution for machine learning on mobile and embedded devices, in this 34-minute conference talk from TF World '19. Discover how to enable on-device machine learning inference with low latency, high performance, and a small binary size. Learn about MobileBERT, TensorFlow Lite models, the Support Library, performance techniques, op coverage, selective registration, microcontrollers, the interpreter, Arduino applications, and person detection. Presented by Pete Warden and Nupur Garg, the talk positions TensorFlow Lite as the standard solution at Google and the primary inference framework for all on-device use cases.
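As context for the interpreter and model topics the talk covers, the sketch below shows the typical way to run inference on an already-converted model with the TensorFlow Lite Python Interpreter API. The model path and the all-zeros input are placeholders for illustration, not artifacts from the talk.

import numpy as np
import tensorflow as tf

# Load a converted .tflite model (path is a placeholder) and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input that matches the model's expected shape and dtype.
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)

# Run inference and read back the result.
interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]["index"])
print(output_data)

The same load / allocate / invoke flow applies to the C++ and microcontroller interpreters discussed later in the talk, with the model compiled into the binary instead of loaded from disk.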
Syllabus
Intro
Slow Motion
MobileBERT
TensorFlow Lite Models
Support Library
Performance
Techniques
Op Coverage
Selective Registration
Microcontrollers
Interpreter
Arduino
Applications
Person Detection
Conclusion
Taught by
TensorFlow
Related Courses
Introduction to AWS Inferentia and Amazon EC2 Inf1 Instances
Pluralsight
Introduction to AWS Inferentia and Amazon EC2 Inf1 Instances (Korean)
Amazon Web Services via AWS Skill Builder
Introduction to Amazon Elastic Inference
Amazon Web Services via AWS Skill Builder
Inference on KubeEdge
Linux Foundation via YouTube
Deep Learning Neural Network Acceleration at the Edge
Linux Foundation via YouTube