Hugging Face Transformers - The Basics - Practical Coding Guides - NLP Models (BERT/RoBERTa)
Offered By: rupert ai via YouTube
Course Description
Overview
Dive into the fundamentals of Hugging Face Transformers in this 30-minute video tutorial, the first episode of a practical coding guide series. Explore the basics of the Hugging Face Transformers Library, including its purpose, functionality, and applications. Navigate through high-level concepts, learn to use the Transformers documentation, and implement out-of-the-box functionality. Install the library, utilize pre-defined pipelines, and implement a model through PyTorch. Gain insights into tokenizers, token IDs, and attention masks, and understand model outputs. Perfect for those interested in Natural Language Processing (NLP) models like BERT and RoBERTa, this casual guide focuses on implementation rather than theory, providing a solid foundation for future episodes on retraining models for multi-label classification tasks.
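As a taste of the out-of-the-box usage the video walks through, here is a minimal sketch (illustrative only, not taken from the video itself) that installs the library and runs a pre-defined sentiment-analysis pipeline; the example sentence and printed output are assumptions for illustration.

# Minimal pipeline sketch (illustrative, not from the video).
# Install the library and a PyTorch backend first, e.g.:  pip install transformers torch

from transformers import pipeline

# A pre-defined pipeline downloads a default pre-trained model on first use.
classifier = pipeline("sentiment-analysis")

# The pipeline handles tokenisation, the forward pass, and post-processing in one call.
result = classifier("Hugging Face Transformers makes NLP models easy to use.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]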
Syllabus
Intro
What is Hugging Face's Transformer Library
Hugging Face models
Navigating the Transformers documentation
Coding with Transformers - installation
Using pre-defined pipelines
Implementing a model through PyTorch (see the sketch after this syllabus)
Tokenisers, Token IDs and Attention Masks
Output from the model
Outro
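The later syllabus topics - implementing a model through PyTorch, tokenisers, token IDs, attention masks, and model outputs - map roughly onto a workflow like the sketch below; the checkpoint name and example text are assumptions chosen for illustration, not details confirmed by the video.

# Illustrative tokenizer/model sketch (not from the video).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Example checkpoint; the video may use a different BERT/RoBERTa model.
model_name = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# The tokeniser turns raw text into token IDs plus an attention mask
# (1 for real tokens, 0 for padding), packed here as PyTorch tensors.
inputs = tokenizer("Transformers are surprisingly easy to use.", return_tensors="pt")
print(inputs["input_ids"])
print(inputs["attention_mask"])

# The model output carries logits; softmax converts them to class probabilities.
with torch.no_grad():
    outputs = model(**inputs)
print(torch.softmax(outputs.logits, dim=-1))

Later episodes in the series build on this kind of workflow by retraining a model for multi-label classification tasks.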
Taught by
rupert ai
Related Courses
Getting Started with AI Powered Q&A Using Hugging Face Transformers - HuggingFace Tutorial (Chris Hay via YouTube)
Build a Deep Q&A Web App with Transformers and Anvil - Python Deep Learning App (Nicholas Renotte via YouTube)
Build a Simple Language Translation App Using Python for Beginners (Nicholas Renotte via YouTube)
Automate Stocks and Crypto Research with Python and Deep Learning - Full Python Project (Nicholas Renotte via YouTube)
Generate Blog Posts with GPT2 and Hugging Face Transformers - AI Text Generation GPT2-Large (Nicholas Renotte via YouTube)