
TensorFlow Lite for Edge Devices - Tutorial

Offered By: freeCodeCamp

Tags

TensorFlow Courses, Deep Learning Courses, Quantization Courses, Edge Computing Courses, Model Deployment Courses, Model Compression Courses

Course Description

Overview

Dive into the world of TensorFlow Lite for edge devices in this comprehensive tutorial. Explore why TensorFlow Lite and edge computing matter, why they are gaining popularity, and the challenges of deploying models on edge devices. Learn the TensorFlow Lite workflow, from creating TensorFlow or Keras models to converting them to the TFLite format. Discover techniques for validating model performance, understanding quantization, and compressing TFLite models for greater efficiency. Gain hands-on experience with the provided code examples and benefit from expert insights on running deep learning inference on resource-constrained devices.
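The core workflow described above (build a TensorFlow/Keras model, then convert it to the TFLite flatbuffer format) can be sketched roughly as follows. The tiny model architecture and the model.tflite file name are illustrative placeholders, not taken from the tutorial itself.

import tensorflow as tf

# Build a small stand-in Keras model (hypothetical architecture, not the course's model)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Convert the Keras model to the TFLite flatbuffer format
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Write the converted model to disk for deployment on an edge device
with open("model.tflite", "wb") as f:
    f.write(tflite_model)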

Syllabus

Introduction
Why do we need TensorFlow Lite?
What is Edge Computing?
Why is Edge Computing gaining popularity?
Challenges in deploying models on Edge devices
What is TensorFlow Lite or TFLite?
TensorFlow Lite Workflow
Creating a TensorFlow or Keras model
Converting a TensorFlow or Keras model to TFLite
Validating the TFLite model performance
What is Quantization?
Compressing the TFLite model further
Compressing the TFLite model even further
Validating the most compressed TFLite model performance
Thank You
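
For the quantization, compression, and validation steps listed in the syllabus above, a minimal sketch along these lines applies: post-training dynamic-range quantization via the converter's standard optimization flag, followed by inference through tf.lite.Interpreter. The toy model and random input below are placeholders, not the course's actual model or data.

import numpy as np
import tensorflow as tf

# Stand-in Keras model (hypothetical; in practice, reuse your trained model)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# Post-training dynamic-range quantization: weights are stored as 8-bit integers,
# which typically shrinks the model to roughly a quarter of its float32 size
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quantized_model = converter.convert()

# Validate the compressed model by running inference with the TFLite Interpreter
interpreter = tf.lite.Interpreter(model_content=quantized_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

sample = np.random.rand(1, 4).astype(np.float32)  # placeholder input batch
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))

Comparing these predictions against the original Keras model's outputs on the same inputs is the usual way to confirm that quantization has not noticeably hurt accuracy.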


Taught by

freeCodeCamp.org

Related Courses

Few-Shot Learning in Production
HuggingFace via YouTube
TinyML Talks Germany - Neural Network Framework Using Emerging Technologies for Screening Diabetic
tinyML via YouTube
TinyML for All: Full-stack Optimization for Diverse Edge AI Platforms
tinyML via YouTube
TinyML Talks - Software-Hardware Co-design for Tiny AI Systems
tinyML via YouTube
On-Device Neural End-to-End Speech Recognition and Synthesis Algorithms - A Review
tinyML via YouTube