Hugging Face Transformers - The Basics - Practical Coding Guides - NLP Models (BERT/RoBERTa)

Offered By: rupert ai via YouTube

Tags

Natural Language Processing (NLP) Courses, PyTorch Courses, BERT Courses, Attention Mechanisms Courses, Hugging Face Transformers Courses, RoBERTa Courses

Course Description

Overview

Dive into the fundamentals of Hugging Face Transformers in this 30-minute video tutorial, the first episode of a practical coding guide series. Explore the basics of the Hugging Face Transformers Library, including its purpose, functionality, and applications. Navigate through high-level concepts, learn to use the Transformers documentation, and implement out-of-the-box functionality. Install the library, utilize pre-defined pipelines, and implement a model through PyTorch. Gain insights into tokenizers, token IDs, and attention masks, and understand model outputs. Perfect for those interested in Natural Language Processing (NLP) models like BERT and RoBERTa, this casual guide focuses on implementation rather than theory, providing a solid foundation for future episodes on retraining models for multi-label classification tasks.
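The out-of-the-box usage the overview describes can be sketched as follows. This is a minimal illustration, not the video's exact code: it assumes the library is installed (`pip install transformers torch`) and that the default checkpoint for the task is acceptable; the task name and input sentence are placeholders.

```python
# Minimal sketch of "pre-defined pipeline" usage with Hugging Face Transformers.
# Assumes: pip install transformers torch
from transformers import pipeline

# pipeline() downloads a default fine-tuned model for the task on first use
classifier = pipeline("sentiment-analysis")

result = classifier("Hugging Face Transformers makes NLP easy!")
print(result)  # a list of {'label': ..., 'score': ...} dicts, one per input
```

The pipeline wraps tokenization, model inference, and post-processing in one call, which is why the video treats it as the quickest way to try a model before dropping down to the PyTorch level.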

Syllabus

Intro
What is Hugging Face's Transformer Library
Hugging Face models
Navigating the Transformers documentation
Coding with Transformers - installation
Using pre-defined pipelines
Implementing a model through PyTorch
Tokenisers, Token IDs and Attention Masks
Output from the model
Outro


Taught by

rupert ai

Related Courses

Deep Learning with Python and PyTorch
IBM via edX
Introduction to Machine Learning
Duke University via Coursera
How Google does Machine Learning em Português Brasileiro
Google Cloud via Coursera
Intro to Deep Learning with PyTorch
Facebook via Udacity
Secure and Private AI
Facebook via Udacity