YoVDO

LLMOps: Intel OpenVINO Toolkit Inference on CPU and GPU for Transformers

Offered By: The Machine Learning Engineer via YouTube

Tags

Machine Learning Courses Transformers Courses Inference Courses OpenVINO Courses LLMOps Courses

Course Description

Overview

Learn how to leverage the Intel OpenVINO toolkit for efficient inference of Transformer models on both CPU and GPU. This 28-minute video tutorial walks through installing the Intel OpenVINO Runtime, converting a Transformer model to the OpenVINO format, and running inference on CPU and GPU. Gain practical insights into optimizing machine learning inference workflows for data science applications. The accompanying notebook is available on GitHub for hands-on experience with the demonstrated techniques.
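The workflow described above (convert a Transformer, then pick CPU or GPU as the inference target) can be sketched with Optimum Intel's OpenVINO integration. This is a minimal illustration, not the video's exact notebook: it assumes `optimum[openvino]` and `transformers` are installed, and the model id and task below are example choices rather than the ones used in the tutorial.

```python
# Minimal sketch: export a Hugging Face Transformer to OpenVINO and run it
# on CPU, then on an Intel GPU. Model id and task are illustrative.
from optimum.intel import OVModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example model

# export=True converts the PyTorch checkpoint to OpenVINO IR on the fly
model = OVModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Inference on CPU (the default OpenVINO device)
clf = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(clf("OpenVINO makes Transformer inference fast on Intel hardware."))

# Switch the same model to an Intel GPU, if one is available
model.to("GPU")   # OpenVINO device names: "CPU", "GPU", ...
model.compile()   # recompile the model for the new target device
print(clf("Now the forward pass runs on the GPU."))
```

The same two-step pattern (one-time conversion, then per-device compilation) is what lets a single exported model serve both CPU-only and GPU-equipped machines.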

Syllabus

LLMOps: Intel OpenVINO Toolkit Inference CPU and GPU Transformers #datascience #machinelearning


Taught by

The Machine Learning Engineer

Related Courses

Large Language Models: Application through Production
Databricks via edX
LLMOps - LLM Bootcamp
The Full Stack via YouTube
MLOps: Why DevOps Solutions Fall Short in the Machine Learning World
Linux Foundation via YouTube
Quick Wins Across the Enterprise with Responsible AI
Microsoft via YouTube
End-to-End AI App Development: Prompt Engineering to LLMOps
Microsoft via YouTube