Training BERT - Train With Next Sentence Prediction
Offered By: James Briggs via YouTube
Course Description
Overview
Learn how to implement Next Sentence Prediction (NSP), a crucial component of BERT model training, in this 37-minute tutorial video. Explore the process of using unstructured text to pre-train BERT models for improved language understanding in specific use cases. Discover how to apply NSP alongside Masked Language Modeling (MLM) to enhance BERT's performance. Follow along with the provided Jupyter Notebook and dataset to gain hands-on experience implementing NSP for BERT pre-training. Access additional resources, including a detailed Medium article and a discounted NLP course, to further expand your knowledge of BERT and transformer models in natural language processing.
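To give a flavor of what NSP training involves, below is a minimal, hypothetical sketch of a single NSP training step using the Hugging Face transformers library (BertForNextSentencePrediction). The model name, example sentences, and optimizer settings are assumptions for illustration and may differ from the notebook used in the video.

```python
# A minimal, hypothetical sketch of one NSP training step with Hugging Face
# transformers; the video's notebook may structure this differently.
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# NSP trains on sentence pairs: label 0 means sentence B really follows
# sentence A in the source text, label 1 means B was sampled at random.
sentence_a = "The cat sat on the mat."
sentence_b = "It then fell asleep in the sun."
labels = torch.LongTensor([0])  # B is the true next sentence

inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt")
outputs = model(**inputs, labels=labels)

outputs.loss.backward()   # cross-entropy over the two NSP classes
optimizer.step()
optimizer.zero_grad()
```

In full BERT pre-training, this NSP loss is typically combined with the MLM loss over batches of pairs drawn from unstructured text rather than single hand-written sentences.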
Syllabus
Training BERT #4 - Train With Next Sentence Prediction (NSP)
Taught by
James Briggs
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent