Building a Pipeline for State-of-the-Art NLP Using Hugging Face Tools
Offered By: Databricks via YouTube
Course Description
Overview
Explore the cutting-edge world of Natural Language Processing (NLP) in this 49-minute talk from Databricks. Dive into the transformative impact of transformer networks since 2017, examining models like BERT, XLNet, ALBERT, and ELECTRA. Learn how to build a comprehensive NLP pipeline using Hugging Face tools, from text tokenization with huggingface/tokenizers to generating predictions with huggingface/transformers. Discover the power of transfer learning, understand the intricacies of tokenization methods like BPE and BBPE, and gain insights into transformer architecture. Get hands-on with code examples for various NLP tasks including sentiment analysis, question answering, language modeling, and sequence classification. Master the art of defining classes, training models, and leveraging the Model Hub for state-of-the-art NLP applications.
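For a feel of the pipeline abstraction the talk demonstrates, the sketch below shows the kind of short usage involved for two of the tasks listed above (sentiment analysis and question answering). This is a minimal illustration, not code from the talk; it assumes the default checkpoints that huggingface/transformers downloads for each task, and the printed outputs are indicative only.

```python
from transformers import pipeline

# Sentiment analysis: the pipeline handles tokenization, inference, and
# post-processing, returning a label and confidence score per input string.
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face makes transfer learning straightforward."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]

# Extractive question answering over a short context passage.
qa = pipeline("question-answering")
result = qa(
    question="When did transformer networks take off?",
    context="Transformer networks have reshaped NLP since 2017, "
            "with models such as BERT, XLNet, ALBERT, and ELECTRA.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': '2017'}
```

The same abstraction covers the other tasks mentioned above by swapping the task string, for example "fill-mask" for masked language modeling or "text-classification" for sequence classification.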
Syllabus
Introduction
About Hugging Face
What is Transfer Learning
Transformer Networks
Transfer Learning Pipeline
Tokenization
Words
BPE
Results
BBPE
Why tokenization
Pipeline
Code
Tokenization
Normalizer
Transformer Architecture
Abstract Classes
Model Hub
Pipeline abstraction
Sentiment analysis
Question answering
Language modeling
Sequence classification
Defining classes
Training Models
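As a companion to the Tokenization, BPE, and Normalizer items above, the sketch below trains a small BPE tokenizer with huggingface/tokenizers. It is illustrative only: the vocabulary size, special tokens, and corpus.txt path are placeholder assumptions, not values given in the talk.

```python
from tokenizers import Tokenizer, normalizers, pre_tokenizers
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer

# Build a BPE tokenizer with a normalization step (Unicode NFD, lowercasing,
# accent stripping) ahead of whitespace pre-tokenization.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.normalizer = normalizers.Sequence(
    [normalizers.NFD(), normalizers.Lowercase(), normalizers.StripAccents()]
)
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()

# Train on a local text corpus; vocab size and special tokens are placeholders.
trainer = BpeTrainer(
    vocab_size=30000,
    special_tokens=["[UNK]", "[CLS]", "[SEP]", "[PAD]", "[MASK]"],
)
tokenizer.train(files=["corpus.txt"], trainer=trainer)

print(tokenizer.encode("Transfer learning with transformers").tokens)
```

The Abstract Classes, Model Hub, and Sequence classification items, in turn, correspond to the Auto* classes in huggingface/transformers, which pull a pretrained checkpoint from the Model Hub and attach a task head. The checkpoint name and label count below are illustrative choices, not ones prescribed by the talk.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a pretrained encoder from the Model Hub and add a 2-label
# classification head; fine-tune before relying on its predictions.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

inputs = tokenizer("A short example sentence.", return_tensors="pt")
logits = model(**inputs).logits  # shape: [1, 2]
```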
Taught by
Databricks
Related Courses
Sentiment Analysis with Deep Learning using BERT (Coursera Project Network via Coursera)
Natural Language Processing with Attention Models (DeepLearning.AI via Coursera)
Fine Tune BERT for Text Classification with TensorFlow (Coursera Project Network via Coursera)
Deploy a BERT question answering bot on Django (Coursera Project Network via Coursera)
Generating discrete sequences: language and music (Ural Federal University via edX)