YoVDO

CPU Inference with ViT ONNX Model in Azure ML Managed Endpoint - AKS

Offered By: The Machine Learning Engineer via YouTube

Tags

Azure Machine Learning Courses Machine Learning Courses MLOps Courses Inference Courses ONNX Courses Vision Transformers Courses

Course Description

Overview

Learn how to perform CPU inference on Azure Kubernetes Service (AKS) by creating a Managed Endpoint in Azure Machine Learning Studio. Explore the process of converting a Vision Transformer (ViT) model to ONNX format and running it with onnxruntime via the Python Azure ML SDK v2. This 49-minute video tutorial walks through setting up and deploying a machine learning model for efficient inference in a cloud environment, demonstrating essential MLOps practices for data scientists and machine learning engineers.
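The ONNX conversion and CPU inference flow described above can be sketched as follows. This is a minimal illustration, not the exact code from the video: the checkpoint name, file path, and opset version are assumptions, and the heavy export/inference steps are gated behind the main guard so the pure-Python post-processing helper stands on its own.

```python
import math

def softmax(logits):
    """Convert raw ViT logits to class probabilities (pure Python, numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

if __name__ == "__main__":
    # Hypothetical export/inference flow; requires torch, transformers, onnxruntime.
    import torch
    from transformers import ViTForImageClassification
    import onnxruntime as ort

    # 1. Export a ViT checkpoint to ONNX (checkpoint name is an assumption).
    model = ViTForImageClassification.from_pretrained("google/vit-base-patch16-224")
    model.eval()
    dummy = torch.randn(1, 3, 224, 224)  # ViT-Base expects 224x224 RGB input
    torch.onnx.export(
        model, dummy, "vit.onnx",
        input_names=["pixel_values"], output_names=["logits"],
        dynamic_axes={"pixel_values": {0: "batch"}, "logits": {0: "batch"}},
        opset_version=14,
    )

    # 2. Run CPU inference with onnxruntime.
    session = ort.InferenceSession("vit.onnx", providers=["CPUExecutionProvider"])
    logits = session.run(["logits"], {"pixel_values": dummy.numpy()})[0]
    probs = softmax(logits[0].tolist())
    print("top-1 class index:", probs.index(max(probs)))
```

Exporting with a dynamic batch axis lets the same ONNX graph serve both single images and batched requests at the endpoint.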

Syllabus

MLOps: CPU Inference ViT ONNX Model in Azure ML Managed EndPoint (AKS) #machinelearning #datascience
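The endpoint deployment step covered in the video uses the Python Azure ML SDK v2; a rough sketch of deploying an ONNX model to an attached AKS cluster might look like the following. All names, paths, the container image, and the compute target are illustrative assumptions, not values from the video.

```python
# Sketch: deploy an ONNX model to an attached AKS cluster with Azure ML SDK v2.
# Requires azure-ai-ml and azure-identity; placeholder IDs must be filled in.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import (
    KubernetesOnlineEndpoint,
    KubernetesOnlineDeployment,
    Model,
    Environment,
    CodeConfiguration,
)
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Endpoint bound to a previously attached AKS compute target (name is an assumption).
endpoint = KubernetesOnlineEndpoint(
    name="vit-onnx-endpoint",
    compute="aks-compute",
    auth_mode="key",
)
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Deployment: ONNX model plus a scoring script that runs onnxruntime on CPU.
deployment = KubernetesOnlineDeployment(
    name="blue",
    endpoint_name=endpoint.name,
    model=Model(path="./model/vit.onnx"),
    environment=Environment(
        image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest",
        conda_file="./environment/conda.yaml",
    ),
    code_configuration=CodeConfiguration(code="./src", scoring_script="score.py"),
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```

The scoring script referenced here would load the ONNX model with onnxruntime's `CPUExecutionProvider` and expose `init()`/`run()` hooks, following the standard Azure ML online-endpoint contract.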


Taught by

The Machine Learning Engineer

Related Courses

Vision Transformers Explained + Fine-Tuning in Python
James Briggs via YouTube
ConvNeXt - A ConvNet for the 2020s - Paper Explained
Aleksa Gordić - The AI Epiphany via YouTube
Do Vision Transformers See Like Convolutional Neural Networks? - Paper Explained
Aleksa Gordić - The AI Epiphany via YouTube
Stable Diffusion and Friends - High-Resolution Image Synthesis via Two-Stage Generative Models
HuggingFace via YouTube
Intro to Dense Vectors for NLP and Vision
James Briggs via YouTube