Hugging Face Transformers - The Basics - Practical Coding Guides - NLP Models (BERT/RoBERTa)
Offered By: rupert ai via YouTube
Course Description
Overview
Dive into the fundamentals of Hugging Face Transformers in this 30-minute video tutorial, the first episode of a practical coding guide series. Explore the basics of the Hugging Face Transformers Library, including its purpose, functionality, and applications. Navigate through high-level concepts, learn to use the Transformers documentation, and implement out-of-the-box functionality. Install the library, utilize pre-defined pipelines, and implement a model through PyTorch. Gain insights into tokenizers, token IDs, and attention masks, and understand model outputs. Perfect for those interested in Natural Language Processing (NLP) models like BERT and RoBERTa, this casual guide focuses on implementation rather than theory, providing a solid foundation for future episodes on retraining models for multi-label classification tasks.
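The out-of-the-box pipeline usage described above can be sketched in a few lines. This is a minimal illustration of the library's `pipeline` API, not code taken from the video; the pipeline picks its own default checkpoint when none is specified.

```python
from transformers import pipeline

# A pre-defined sentiment-analysis pipeline; on first use it downloads
# a default pretrained model and tokenizer from the Hugging Face Hub.
classifier = pipeline("sentiment-analysis")

# The pipeline handles tokenization, inference, and post-processing.
result = classifier("Hugging Face Transformers makes NLP easy!")
print(result)  # a list of dicts with 'label' and 'score' keys
```

Installing the library first (`pip install transformers`) is all the setup the pipeline needs.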
Syllabus
Intro
What is Hugging Face's Transformer Library
Hugging Face models
Navigating the Transformers documentation
Coding with Transformers - installation
Using pre-defined pipelines
Implementing a model through PyTorch
Tokenisers, Token IDs and Attention Masks
Output from the model
Outro
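The last three syllabus steps (implementing a model through PyTorch, tokenisers/token IDs/attention masks, and model outputs) can be sketched as follows. The checkpoint name here is an assumption for illustration, not necessarily the one used in the video.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed checkpoint for illustration; the tutorial may use a different one.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# The tokenizer returns token IDs plus an attention mask as PyTorch tensors.
inputs = tokenizer("I love this library!", return_tensors="pt")
print(inputs["input_ids"])       # integer token IDs
print(inputs["attention_mask"])  # 1 = real token, 0 = padding

# Forward pass: the model's output exposes raw logits, not probabilities.
with torch.no_grad():
    outputs = model(**inputs)
probs = torch.softmax(outputs.logits, dim=-1)
print(probs)
```

Softmaxing the logits turns the raw model output into class probabilities, which is the post-processing a pipeline would otherwise do for you.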
Taught by
rupert ai
Related Courses
Sentiment Analysis with Deep Learning using BERT (Coursera Project Network via Coursera)
Natural Language Processing with Attention Models (DeepLearning.AI via Coursera)
Fine Tune BERT for Text Classification with TensorFlow (Coursera Project Network via Coursera)
Deploy a BERT question answering bot on Django (Coursera Project Network via Coursera)
Generating discrete sequences: language and music (Ural Federal University via edX)