Masked Language Modelling - Retraining BERT with Hugging Face Trainer - Coding Tutorial
Offered By: rupert ai via YouTube
Course Description
Overview
Learn how to retrain a masked language model using Hugging Face Trainer to improve performance on a specific dataset in this Python coding tutorial. Follow along as the instructor demonstrates loading the Microsoft Research Sentence Completion Challenge dataset, utilizing Hugging Face's dataset tools, implementing model training, and evaluating the results. Gain practical experience with transformer models and Hugging Face libraries while focusing on implementation rather than theory. Access the provided Colab notebook to code along and explore topics such as dataset preparation, model fine-tuning, and performance assessment.
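The core technique the tutorial covers is masked language modelling: a fraction of input tokens is hidden and the model is trained to recover them. A minimal sketch of BERT-style masking (the 80/10/10 rule), assuming the standard bert-base-uncased vocabulary ids; in practice the tutorial's Hugging Face setup handles this via a data collator:

```python
import random

MASK_ID = 103       # [MASK] id in the bert-base-uncased vocab (assumption)
VOCAB_SIZE = 30522  # bert-base-uncased vocabulary size (assumption)

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """BERT-style masking: select ~15% of positions; of those,
    80% become [MASK], 10% a random token, 10% stay unchanged.
    Labels are -100 (ignored by the loss) at unselected positions."""
    rng = rng or random.Random(0)
    inputs, labels = list(token_ids), []
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels.append(tok)                      # model must recover the original token
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = MASK_ID                 # 80%: replace with [MASK]
            elif roll < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # remaining 10%: leave the token as-is
        else:
            labels.append(-100)                     # position not scored by the loss
    return inputs, labels
```

Hugging Face's `DataCollatorForLanguageModeling` applies equivalent masking dynamically per batch, so the model sees different masks each epoch.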
Syllabus
Intro and outline
Loading the dataset
Hugging Face Dataset
Hugging Face model and training
Evaluating the model
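The evaluation step in a masked-language-modelling run is usually reported as cross-entropy loss over the masked positions, often converted to perplexity by exponentiating the mean loss. A minimal sketch, assuming per-token losses in nats and the Hugging Face convention of marking unscored positions with label -100 (the helper name is illustrative):

```python
import math

def masked_perplexity(token_losses, labels):
    """Perplexity over scored positions only: exp of the mean
    cross-entropy at positions whose label is not -100."""
    scored = [loss for loss, y in zip(token_losses, labels) if y != -100]
    return math.exp(sum(scored) / len(scored))

# a uniform loss of 2.0 nats on every scored position gives perplexity e^2 ~ 7.39
```

Lower perplexity on the target dataset after retraining indicates the fine-tuned model assigns higher probability to the held-out masked tokens.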
Taught by
rupert ai
Related Courses
Natural Language Processing with Attention Models (DeepLearning.AI via Coursera)
Large Language Models: Foundation Models from the Ground Up (Databricks via edX)
Artificial Intelligence in Social Media Analytics (Johns Hopkins University via Coursera)
Chatbots (Johns Hopkins University via Coursera)
Embedding Models: From Architecture to Implementation (DeepLearning.AI via Coursera)