From Zero to SOTA: Fine-Tuning BERT Using Hugging Face for World-Class Training Performance
Offered By: Data Science Festival via YouTube
Course Description
Overview
Explore a comprehensive conference talk on fine-tuning BERT using Hugging Face for state-of-the-art training performance. Learn how Graphcore accelerates AI development through its Intelligence Processing Unit (IPU) hardware and easy-to-integrate examples. Discover how BERT-Large is implemented and optimized for IPU systems, with industry-leading performance results. Follow a step-by-step demonstration of accessing IPUs via Spell's cloud MLOps platform, work through a BERT fine-tuning notebook tutorial on the SQuADv1 dataset, and run an inference question-answering task with the Hugging Face inference API. Gain insights into training transformer models faster with the Hugging Face Optimum toolkit, and understand the significance of BERT in industries undergoing AI transformation, such as legal, banking and finance, and healthcare.
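The fine-tuning workflow described above relies on Graphcore's Optimum integration, which substitutes IPU-aware classes for the stock Hugging Face Trainer. As a rough illustration of the same SQuADv1 flow, the sketch below assumes a plain transformers/datasets install and the public bert-large-uncased checkpoint rather than the talk's IPU-specific setup: it tokenizes question/context pairs, maps character-level answer spans to token positions, and trains a question-answering head.

```python
from datasets import load_dataset
from transformers import (AutoModelForQuestionAnswering, AutoTokenizer,
                          Trainer, TrainingArguments, default_data_collator)

model_name = "bert-large-uncased"  # stand-in; the talk targets BERT-Large on IPUs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

squad = load_dataset("squad")  # SQuADv1

def preprocess(examples):
    # Tokenize question/context pairs, truncating only the context.
    inputs = tokenizer(
        examples["question"],
        examples["context"],
        max_length=384,
        truncation="only_second",
        padding="max_length",
        return_offsets_mapping=True,
    )
    start_positions, end_positions = [], []
    for i, offsets in enumerate(inputs["offset_mapping"]):
        answer = examples["answers"][i]
        start_char = answer["answer_start"][0]
        end_char = start_char + len(answer["text"][0])
        seq_ids = inputs.sequence_ids(i)
        # Locate the token span covering the context.
        ctx_start = seq_ids.index(1)
        ctx_end = len(seq_ids) - 1 - seq_ids[::-1].index(1)
        if offsets[ctx_start][0] > start_char or offsets[ctx_end][1] < end_char:
            # Answer was truncated away; point both labels at token 0.
            start_positions.append(0)
            end_positions.append(0)
        else:
            # Walk inward to the tokens bracketing the answer characters.
            tok = ctx_start
            while tok <= ctx_end and offsets[tok][0] <= start_char:
                tok += 1
            start_positions.append(tok - 1)
            tok = ctx_end
            while tok >= ctx_start and offsets[tok][1] >= end_char:
                tok -= 1
            end_positions.append(tok + 1)
    inputs["start_positions"] = start_positions
    inputs["end_positions"] = end_positions
    inputs.pop("offset_mapping")
    return inputs

train_set = squad["train"].map(
    preprocess, batched=True, remove_columns=squad["train"].column_names
)

args = TrainingArguments(output_dir="bert-squad",
                         per_device_train_batch_size=8,
                         num_train_epochs=2,
                         learning_rate=3e-5)
trainer = Trainer(model=model, args=args, train_dataset=train_set,
                  data_collator=default_data_collator)
trainer.train()
```

The inference question-answering step can be approximated with the transformers question-answering pipeline. The checkpoint name here is a stand-in public model; the talk runs inference against its own fine-tuned BERT.

```python
from transformers import pipeline

# Stand-in SQuAD-tuned checkpoint; substitute the model fine-tuned above.
qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

result = qa(
    question="What hardware does Graphcore build?",
    context="Graphcore designs the Intelligence Processing Unit (IPU), "
            "a processor built for machine intelligence workloads.",
)
print(result)  # dict with 'answer', 'score', and character 'start'/'end'
```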
Syllabus
From Zero to SOTA: How to Fine-Tune BERT Using Hugging Face for World-Class Training Performance
Taught by
Data Science Festival
Related Courses
TensorFlow: Working with NLP (LinkedIn Learning)
Introduction to Video Editing - Video Editing Tutorials (Great Learning via YouTube)
HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning (Python Engineer via YouTube)
GPT3 and Finetuning the Core Objective Functions - A Deep Dive (David Shapiro ~ AI via YouTube)
How to Build a Q&A AI in Python - Open-Domain Question-Answering (James Briggs via YouTube)