YoVDO

How to Use Hardware Acceleration for Machine Learning Inference on Android

Offered By: Android Makers via YouTube

Tags

Machine Learning Courses Inference Courses Hardware Acceleration Courses

Course Description

Overview

Explore hardware acceleration techniques for machine learning inference on Android devices in this 21-minute conference talk by Adrien Couque at Android Makers by droidcon 2023 in Paris. Discover how the increasing processing power of Android devices enables on-device machine learning capabilities previously thought impossible. Learn to leverage specialized hardware such as GPUs, DSPs, and NPUs to accelerate ML workloads and enable real-time experiences like augmented reality filters and live speech transcription. Gain insights into using acceleration delegates and the new TensorFlow Lite Android APIs to implement ML hardware acceleration effectively. Delve into TensorFlow Lite on Android and unlock the potential for high-performance machine learning applications on mobile devices.
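To make the delegate idea concrete, here is a minimal Kotlin sketch of the approach the description refers to. It assumes the standard TensorFlow Lite runtime (org.tensorflow.lite) with the GPU delegate artifact on the classpath; the helper name buildInterpreter, the thread count, and the fallback policy are illustrative choices, not details taken from the talk.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.CompatibilityList
import org.tensorflow.lite.gpu.GpuDelegate
import java.nio.MappedByteBuffer

// Build an Interpreter that offloads inference to the GPU delegate when the
// device supports it, and falls back to multi-threaded CPU execution otherwise.
fun buildInterpreter(modelBuffer: MappedByteBuffer): Interpreter {
    val options = Interpreter.Options()
    val compatList = CompatibilityList()
    if (compatList.isDelegateSupportedOnThisDevice) {
        // Use the delegate options TensorFlow Lite recommends for this device.
        // Note: the GpuDelegate should be closed when the interpreter is no longer needed.
        options.addDelegate(GpuDelegate(compatList.bestOptionsForThisDevice))
    } else {
        // CPU fallback; 4 threads is an arbitrary example value.
        options.setNumThreads(4)
    }
    return Interpreter(modelBuffer, options)
}
```

The same pattern applies to other delegates (for example an NNAPI or vendor-specific delegate): probe for support, attach the delegate to Interpreter.Options, and keep a CPU path as a fallback so the app still runs on devices without the accelerator.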

Syllabus

How to use hardware acceleration for Machine Learning inference on Android - Adrien Couque


Taught by

Android Makers

Related Courses

FPGA computing systems: Partial Dynamic Reconfiguration
Politecnico di Milano via Polimi OPEN KNOWLEDGE
Introduction to Amazon Elastic Inference
Pluralsight
FPGA computing systems: Partial Dynamic Reconfiguration
Politecnico di Milano via Coursera
Introduction to Amazon Elastic Inference (Traditional Chinese)
Amazon Web Services via AWS Skill Builder
Introduction to Amazon Elastic Inference (Portuguese)
Amazon Web Services via AWS Skill Builder