YoVDO

MLOps: OpenVINO Quantized Pipeline for Grammatical Error Correction

Offered By: The Machine Learning Engineer via YouTube

Tags

MLOps Courses, Quantization Courses, OpenVINO Courses, RoBERTa Courses, Flan-T5 Courses

Course Description

Overview

Explore the process of building a grammatical error correction (GEC) pipeline with OpenVINO quantization in this 55-minute video tutorial. Learn to construct a two-component model: a RoBERTa Base error detector trained on the CoLA dataset and a Flan-T5 Large grammar corrector fine-tuned on the JFLEG dataset. Convert both components to the OpenVINO IR format and quantize the correction component for optimized performance. Gain hands-on experience implementing CPU inference and follow along with the accompanying notebook. Enhance your skills in MLOps, data science, and machine learning through this end-to-end demonstration of an efficient grammatical error correction pipeline.
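
For a concrete feel of the two-component pipeline described above, here is a minimal Python sketch that places a RoBERTa acceptability detector in front of a Flan-T5 corrector using Optimum Intel, which exports both checkpoints to OpenVINO IR and runs them on CPU. The model IDs, the CoLA label mapping, and the "grammar:" prompt prefix are illustrative assumptions, not the exact checkpoints or settings used in the video, and the quantization of the corrector is omitted here.

from transformers import AutoTokenizer
from optimum.intel import OVModelForSequenceClassification, OVModelForSeq2SeqLM

# Stage 1: RoBERTa-based acceptability detector (CoLA-style labels).
# Checkpoint ID is an assumption for illustration.
detector_id = "textattack/roberta-base-CoLA"
det_tokenizer = AutoTokenizer.from_pretrained(detector_id)
detector = OVModelForSequenceClassification.from_pretrained(detector_id, export=True)

# Stage 2: Flan-T5 grammar corrector; placeholder checkpoint, not the one from the video.
# It could additionally be quantized (e.g. with optimum-intel's OVQuantizer or NNCF)
# before inference; that step is not shown here.
corrector_id = "pszemraj/flan-t5-large-grammar-synthesis"
cor_tokenizer = AutoTokenizer.from_pretrained(corrector_id)
corrector = OVModelForSeq2SeqLM.from_pretrained(corrector_id, export=True)

def correct(sentence: str) -> str:
    # Run the detector; label 1 is assumed to mean "grammatically acceptable".
    inputs = det_tokenizer(sentence, return_tensors="pt")
    logits = detector(**inputs).logits
    if logits.argmax(dim=-1).item() == 1:
        return sentence  # already acceptable, skip the more expensive corrector
    # Otherwise, generate a corrected version with the seq2seq model.
    # The "grammar:" prefix is an assumed prompt format.
    gen_inputs = cor_tokenizer("grammar: " + sentence, return_tensors="pt")
    out = corrector.generate(**gen_inputs, max_new_tokens=64)
    return cor_tokenizer.decode(out[0], skip_special_tokens=True)

print(correct("He go to school every days."))
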

Syllabus

MLOps: OpenVINO Quantized Pipeline for Grammatical Error Correction #datascience #machinelearning


Taught by

The Machine Learning Engineer

Related Courses

Multi-Label Classification on Unhealthy Comments - Finetuning RoBERTa with PyTorch - Coding Tutorial
rupert ai via YouTube
Hugging Face Transformers - The Basics - Practical Coding Guides - NLP Models (BERT/RoBERTa)
rupert ai via YouTube
Programming Language of the Future: AI in Your Native Language
Linux Foundation via YouTube
Pre-training and Pre-trained Models in Advanced NLP - Lecture 5
Graham Neubig via YouTube
Fine-tuning LLMs Without Maxing Out Your GPU - LoRA for Parameter-Efficient Training
Data Centric via YouTube