LLMOps: Intel OpenVINO Toolkit Inference on CPU and GPU for Transformers

Offered By: The Machine Learning Engineer via YouTube

Tags

Machine Learning Courses, Transformers Courses, Inference Courses, OpenVINO Courses, LLMOps Courses

Course Description

Overview

Learn how to leverage the Intel OpenVINO toolkit for efficient inference of Transformer models on both CPU and GPU. This 28-minute video tutorial walks through installing the OpenVINO Runtime, converting a Transformer model to OpenVINO format, and performing inference on CPU and GPU. Gain practical insights into optimizing machine learning inference for data science applications. Access the accompanying notebook on GitHub for hands-on experience with the demonstrated techniques.

Syllabus

LLMOps: Intel OpenVino toolkit Inference CPU and GPU Transformers #datascience #machinelearning


Taught by

The Machine Learning Engineer

Related Courses

Intel® Edge AI Fundamentals with OpenVINO™
Intel via Udacity
tinyML Vision Challenge - Intel-Luxonis DepthAI Platform Overview
tinyML via YouTube
Machine Learning in Fastly's Compute@Edge
Linux Foundation via YouTube
End-to-End AI Developer Journey with Containerized Assets Using Intel DevCatalog and DevCloud
Docker via YouTube
Accelerate Your Deep Learning Inferencing with the Intel DL Boost Technology
EuroPython Conference via YouTube