TensorFlow Lite for Edge Devices - Tutorial
Offered By: freeCodeCamp
Course Description
Overview
Dive into the world of TensorFlow Lite for edge devices in this comprehensive tutorial. Explore the importance of TensorFlow Lite and edge computing, understanding their growing popularity and the challenges of deploying models on edge devices. Learn the TensorFlow Lite workflow, from creating TensorFlow or Keras models to converting them to TFLite format. Discover techniques for validating model performance, understanding quantization, and compressing TFLite models for optimal efficiency. Gain hands-on experience with provided code examples and benefit from expert insights on implementing deep learning inference on resource-constrained devices.
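As a taste of the workflow the overview describes, here is a minimal sketch of converting a Keras model to TFLite and validating it with the TFLite interpreter. The tiny model and the random test input are placeholders for illustration, not the tutorial's own example.

import numpy as np
import tensorflow as tf

# Placeholder Keras model standing in for whatever model the tutorial builds.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert the Keras model to the TFLite flatbuffer format and save it.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# Validate the converted model by running inference with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

sample = np.random.rand(1, 28, 28).astype(np.float32)  # dummy input
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))

In practice you would compare the interpreter's outputs against the original Keras model's predictions on the same inputs to confirm the conversion preserved accuracy.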
Syllabus
Introduction
Why do we need TensorFlow Lite?
What is Edge Computing?
Why is Edge Computing gaining popularity?
Challenges in deploying models on Edge devices
What is TensorFlow Lite or TFLite?
TensorFlow Lite Workflow
Creating a TensorFlow or Keras model
Converting a TensorFlow or Keras model to TFLite
Validating the TFLite model performance
What is Quantization? (see the sketch after this syllabus)
Compressing the TFLite model further
Compressing the TFLite model even further
Validating the most compressed TFLite model performance
Thank You
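The quantization and compression steps in the syllabus typically correspond to TFLite's post-training quantization options. The sketch below is a hedged illustration of two of them, dynamic-range and float16 quantization, applied at conversion time; the placeholder model repeats the one from the earlier sketch, and the exact techniques and size savings in the tutorial may differ.

import tensorflow as tf

# Placeholder Keras model (stand-in for the tutorial's own model).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Post-training dynamic-range quantization: weights stored as 8-bit integers.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
dynamic_quant = converter.convert()

# Float16 quantization: weights stored as 16-bit floats, roughly halving model size.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
fp16_quant = converter.convert()

print("dynamic-range quantized size:", len(dynamic_quant), "bytes")
print("float16 quantized size:", len(fp16_quant), "bytes")

After compressing, the quantized models should be validated the same way as the unquantized one, since quantization trades a small amount of accuracy for a smaller, faster model.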
Taught by
freeCodeCamp.org
Related Courses
DP-100 Part 3 - Deployment and Working with SDK
A Cloud Guru
AI in Healthcare Capstone
Stanford University via Coursera
Amazon SageMaker: Build an Object Detection Model Using Images Labeled with Ground Truth (Simplified Chinese)
Amazon Web Services via AWS Skill Builder
Amazon SageMaker: Build an Object Detection Model Using Images Labeled with Ground Truth (French)
Amazon Web Services via AWS Skill Builder
Generative AI with Amazon SageMaker JumpStart (Japanese only)
Amazon Web Services via AWS Skill Builder