From Zero to SOTA: Fine-Tuning BERT Using Hugging Face for World-Class Training Performance
Offered By: Data Science Festival via YouTube
Course Description
Overview
Explore a comprehensive conference talk on fine-tuning BERT using Hugging Face for state-of-the-art training performance. Learn how Graphcore accelerates AI development through its Intelligence Processing Unit (IPU) hardware and easy-to-integrate examples. Discover how BERT-Large was implemented and optimized for IPU systems, with industry-leading performance results. Follow a step-by-step demonstration of accessing IPUs via Spell's Cloud MLOps platform, work through a BERT fine-tuning notebook tutorial using the SQuADv1 dataset, and run an inference question-answering task with the Hugging Face inference API. Gain insights into training transformer models faster using the Hugging Face Optimum toolkit, and understand the significance of BERT in industries undergoing AI transformation, such as legal, banking and finance, and healthcare.
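The inference question-answering step described above can be sketched with the standard Hugging Face transformers question-answering pipeline. This is a minimal illustration, not the talk's exact notebook code: the checkpoint name and the example context are assumptions (any SQuAD-fine-tuned model would work), and the talk itself runs the model on IPU hardware rather than the default CPU/GPU backend shown here.

```python
# Minimal sketch of extractive question answering with a SQuAD-fine-tuned
# model, using the transformers pipeline API. Checkpoint name is assumed.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",  # assumed SQuAD checkpoint
)

# Hypothetical context paragraph for illustration.
context = (
    "Graphcore builds the Intelligence Processing Unit (IPU), a processor "
    "designed from the ground up for machine intelligence workloads."
)

result = qa(question="What does Graphcore build?", context=context)
# result is a dict with the extracted answer span, a confidence score,
# and the start/end character offsets into the context.
print(result["answer"], result["score"])
```

The same dictionary shape (`answer`, `score`, `start`, `end`) is what the hosted Hugging Face inference API returns for question-answering models, so a notebook can swap between local and hosted inference with little change.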
Syllabus
From Zero to SOTA: How to fine tune BERT using Hugging Face for world class training performance
Taught by
Data Science Festival
Related Courses
Sentiment Analysis with Deep Learning using BERT (Coursera Project Network via Coursera)
Natural Language Processing with Attention Models (DeepLearning.AI via Coursera)
Fine Tune BERT for Text Classification with TensorFlow (Coursera Project Network via Coursera)
Deploy a BERT question answering bot on Django (Coursera Project Network via Coursera)
Generating discrete sequences: language and music (Ural Federal University via edX)