Masked Language Modelling - Retraining BERT with Hugging Face Trainer - Coding Tutorial
Offered By: rupert ai via YouTube
Course Description
Overview
Learn how to retrain a masked language model using Hugging Face Trainer to improve performance on a specific dataset in this Python coding tutorial. Follow along as the instructor demonstrates loading the Microsoft Research Sentence Completion Challenge dataset, utilizing Hugging Face's dataset tools, implementing model training, and evaluating the results. Gain practical experience with transformer models and Hugging Face libraries while focusing on implementation rather than theory. Access the provided Colab notebook to code along and explore topics such as dataset preparation, model fine-tuning, and performance assessment.
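The core idea of masked language modelling described above can be seen directly with Hugging Face's data collator, which randomly replaces tokens with `[MASK]` and sets training labels only at masked positions. This is a minimal sketch, not the tutorial's own notebook; it assumes the standard `bert-base-uncased` tokenizer and an arbitrary example sentence.

```python
import torch
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

torch.manual_seed(0)  # make the random masking reproducible for the demo

# Assumption: the standard BERT tokenizer (the tutorial fine-tunes a BERT model)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

enc = tokenizer("Masked language modelling hides random tokens from the model.")

# mlm=True with mlm_probability=0.15 is the classic BERT masking setup
collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)
batch = collator([enc])

# Some tokens may now appear as [MASK] in the decoded text
print(tokenizer.decode(batch["input_ids"][0]))

# Labels are -100 (ignored by the loss) everywhere except at masked positions,
# so the model is trained only to recover the hidden tokens
print(batch["labels"][0])
```

Because special tokens such as `[CLS]` and `[SEP]` are never masked, their label entries are always -100; with a short sentence it is possible that no token is masked on a given run.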
Syllabus
Intro and outline
Loading dataset
Hugging Face dataset
Hugging Face model and training
Evaluating the model
Taught by
rupert ai
Related Courses
Sentiment Analysis with Deep Learning using BERT (Coursera Project Network via Coursera)
Natural Language Processing with Attention Models (DeepLearning.AI via Coursera)
Fine Tune BERT for Text Classification with TensorFlow (Coursera Project Network via Coursera)
Deploy a BERT question answering bot on Django (Coursera Project Network via Coursera)
Generating discrete sequences: language and music (Ural Federal University via edX)