LLMOps: Intel OpenVino Toolkit Inference on CPU and GPU for Transformers
Offered By: The Machine Learning Engineer via YouTube
Course Description
Overview
Learn how to leverage the Intel OpenVINO toolkit for efficient inference of Transformer models on both CPU and GPU. This 28-minute video tutorial walks through installing the Intel OpenVINO Runtime, converting a Transformer model to the OpenVINO format, and running inference on CPU and GPU. Gain practical insight into optimizing machine learning inference workflows for data science applications. The accompanying notebook is available on GitHub for hands-on experience with the demonstrated techniques.
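As a rough illustration of the workflow covered in the video (convert a Transformer, then run it on CPU and GPU), the sketch below uses the Hugging Face optimum-intel integration with OpenVINO. The model checkpoint and the specific API calls are assumptions chosen for illustration, not taken from the video's notebook, and GPU inference additionally requires an Intel GPU with its OpenVINO plugin and drivers installed.

# Minimal sketch: export a Hugging Face Transformer to OpenVINO IR and run
# inference on CPU, then on an Intel GPU. Model name is an illustrative choice.
# Assumed setup: pip install "optimum[openvino]" transformers

from optimum.intel import OVModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example checkpoint

# export=True converts the PyTorch checkpoint to OpenVINO IR on the fly
model = OVModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# CPU inference (OpenVINO's default device)
clf = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(clf("OpenVINO makes Transformer inference fast on Intel hardware."))

# Switch the compiled model to the Intel GPU device and recompile,
# then reuse the same pipeline for GPU inference
model.to("GPU")
model.compile()
print(clf("The same model now runs on the GPU device."))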
Syllabus
LLMOps: Intel OpenVino toolkit Inference CPU and GPU Transformers #datascience #machinelearning
Taught by
The Machine Learning Engineer
Related Courses
4.0 Shades of Digitalisation for the Chemical and Process Industries (University of Padova via FutureLearn)
A Day in the Life of a Data Engineer (Amazon Web Services via AWS Skill Builder)
FinTech for Finance and Business Leaders (ACCA via edX)
Accounting Data Analytics (University of Illinois at Urbana-Champaign via Coursera)
Accounting Data Analytics (Coursera)