YoVDO

Building a Pipeline for State-of-the-Art NLP Using Hugging Face Tools

Offered By: Databricks via YouTube

Tags

Natural Language Processing (NLP) Courses Sentiment Analysis Courses ELECTRA Courses BERT Courses Transfer Learning Courses Hugging Face Courses

Course Description

Overview

Explore the cutting-edge world of Natural Language Processing (NLP) in this 49-minute talk from Databricks. Dive into the transformative impact of transformer networks since 2017, examining models like BERT, XLNet, ALBERT, and ELECTRA. Learn how to build a comprehensive NLP pipeline using Hugging Face tools, from text tokenization with huggingface/tokenizers to generating predictions with huggingface/transformers. Discover the power of transfer learning, understand the intricacies of tokenization methods like BPE and BBPE, and gain insights into transformer architecture. Get hands-on with code examples for various NLP tasks including sentiment analysis, question answering, language modeling, and sequence classification. Master the art of defining classes, training models, and leveraging the Model Hub for state-of-the-art NLP applications.
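The overview mentions the BPE tokenization method covered in the talk. As a rough illustration of the idea (not code from the talk), the sketch below learns byte-pair-encoding merge rules from a toy word-frequency corpus: repeatedly find the most frequent adjacent symbol pair and merge it into a new subword unit.

```python
from collections import Counter

def bpe_train(word_freqs, num_merges):
    """Toy BPE training: learn merge rules from a word -> frequency dict."""
    # Start with each word represented as a tuple of single characters.
    vocab = {tuple(word): freq for word, freq in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        merges.append(best)
        # Apply the merge everywhere it occurs.
        new_vocab = {}
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] = freq
        vocab = new_vocab
    return merges, vocab

# Hypothetical mini-corpus; frequent endings like "es" and "est" merge first.
corpus = {"low": 5, "lower": 2, "newest": 6, "widest": 3}
merges, vocab = bpe_train(corpus, num_merges=4)
```

The production implementation in huggingface/tokenizers is written in Rust and far more involved (byte-level BPE, caching, pre-tokenization), but the merge loop above is the core of the algorithm.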

Syllabus

Introduction
About Hugging Face
What is Transfer Learning
Transformer Networks
Transfer Learning Pipeline
Tokenization
Words
BPE
Results
BBPE
Why tokenization
Pipeline
Code
Tokenization
Normalizer
Transformer Architecture
Abstract Classes
Model Hub
Pipeline abstraction
Sentiment analysis
Question answering
Language modeling
Sequence classification
Defining classes
Training Models
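The syllabus walks through the stages of a tokenization pipeline, including a normalizer step. As a conceptual sketch only (the names and steps below are illustrative, not the talk's code), a tokenizer typically normalizes text first (unicode normalization, lowercasing) and then pre-tokenizes it into word-level pieces before subword encoding:

```python
import unicodedata

def normalize(text):
    # Normalizer stage: apply unicode NFKC normalization (e.g. turning a
    # non-breaking space into a plain space) and lowercase the text.
    return unicodedata.normalize("NFKC", text).lower()

def pre_tokenize(text):
    # Pre-tokenization stage: split normalized text on whitespace;
    # a subword model such as BPE would then encode each piece.
    return text.split()

# "\u00a0" is a non-breaking space; normalization makes it splittable.
sentence = "State-of-the-Art NLP\u00a0with Café examples"
tokens = pre_tokenize(normalize(sentence))
```

In huggingface/tokenizers these stages are configurable components (normalizers, pre-tokenizers, models, post-processors) chained into a single `Tokenizer` object.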


Taught by

Databricks

Related Courses

Hugging Face on Azure - Partnership and Solutions Announcement
Microsoft via YouTube
Question Answering in Azure AI - Custom and Prebuilt Solutions - Episode 49
Microsoft via YouTube
Open Source Platforms for MLOps
Duke University via Coursera
Masked Language Modelling - Retraining BERT with Hugging Face Trainer - Coding Tutorial
rupert ai via YouTube
Masked Language Modelling with Hugging Face - Microsoft Sentence Completion - Coding Tutorial
rupert ai via YouTube