From Zero to SOTA: Fine-Tuning BERT Using Hugging Face for World-Class Training Performance
Offered By: Data Science Festival via YouTube
Course Description
Overview
Explore a comprehensive conference talk on fine-tuning BERT using Hugging Face for state-of-the-art training performance. Learn how Graphcore accelerates AI development through its Intelligence Processing Unit (IPU) hardware and easy-to-integrate examples. Discover the implementation and optimization of BERT-Large for IPU systems, showcasing industry-leading performance results. Follow a step-by-step demonstration of accessing IPUs via Spell's Cloud MLOps platform, work through a BERT fine-tuning notebook tutorial using the SQuADv1 dataset, and run an inference question-answering task with the Hugging Face inference API. Gain insights into training transformer models faster using the Hugging Face Optimum toolkit, and understand the significance of BERT for industries undergoing AI transformation such as legal, banking and finance, and healthcare.
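The question-answering inference step described above can be sketched with the Hugging Face `transformers` pipeline API. This is a minimal illustration, not the talk's actual notebook; the checkpoint name and example text are illustrative choices (a BERT-Large model fine-tuned on SQuAD, matching the talk's theme):

```python
from transformers import pipeline

# Build a question-answering pipeline. The checkpoint below is a public
# BERT-Large model fine-tuned on SQuAD; it is an illustrative choice,
# not necessarily the one demonstrated in the talk.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "Graphcore's Intelligence Processing Unit (IPU) is a processor "
    "designed specifically for machine learning workloads."
)

# The pipeline returns a dict with the extracted answer span,
# its character offsets, and a confidence score.
result = qa(question="What is the IPU designed for?", context=context)
print(result["answer"])
```

The same pipeline interface works with any extractive QA checkpoint on the Hugging Face Hub, so a model fine-tuned on IPUs with Optimum can be swapped in by changing the `model` argument.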
Syllabus
From Zero to SOTA: How to fine-tune BERT using Hugging Face for world-class training performance
Taught by
Data Science Festival
Related Courses
Sequence Models - DeepLearning.AI via Coursera
Modern Natural Language Processing in Python - Udemy
Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3 - Stanford University via YouTube
Long Form Question Answering in Haystack - James Briggs via YouTube
Spotify's Podcast Search Explained - James Briggs via YouTube