Build a Transformer for Language Classification in TensorFlow

Offered By: James Briggs via YouTube

Tags

TensorFlow Courses
Python Courses
Sentiment Analysis Courses
Data Preparation Courses
Transformer Models Courses
Model Training Courses

Course Description

Overview

Learn how to build a transformer model for sentiment analysis using HuggingFace's Transformers library with TensorFlow 2 and Python. Follow the complete process from data acquisition to model construction and training in this 38-minute tutorial. Explore multi-class classification by combining TensorFlow and Transformers to build a sentiment classifier. Gain insight into leveraging cutting-edge transformer models such as BERT for natural language processing tasks. Discover how to initialize and use pre-trained models, prepare data through tokenization, and implement train-validation splits. Finally, define and train your own transformer-based classification model, a technique applicable to language tasks across many industries.
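
To show how the steps above fit together, here is a minimal sketch of the workflow in TensorFlow 2 with HuggingFace Transformers: load a pre-trained BERT encoder and tokenizer, tokenize the text, split into train and validation sets, attach a small Keras classification head, and train. The dataset file, column names, number of classes, and hyperparameters are illustrative assumptions and are not taken from the course itself.

```python
import pandas as pd
import tensorflow as tf
from sklearn.model_selection import train_test_split
from transformers import BertTokenizer, TFBertModel

MODEL_NAME = "bert-base-cased"
NUM_CLASSES = 5          # e.g. five sentiment ratings (assumed)
SEQ_LEN = 128

# Load a labelled dataset (hypothetical CSV with 'text' and 'label' columns).
df = pd.read_csv("train.csv")
texts, labels = df["text"].tolist(), df["label"].values

# Tokenize: pad/truncate every example to a fixed length.
tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
enc = tokenizer(texts, max_length=SEQ_LEN, truncation=True,
                padding="max_length", return_tensors="np")

# One-hot encode labels for multi-class classification.
y = tf.keras.utils.to_categorical(labels, num_classes=NUM_CLASSES)

# Train-validation split.
ids_tr, ids_va, mask_tr, mask_va, y_tr, y_va = train_test_split(
    enc["input_ids"], enc["attention_mask"], y,
    test_size=0.1, random_state=42)

# Define the model: pre-trained BERT encoder plus a dense classification head.
bert = TFBertModel.from_pretrained(MODEL_NAME)
input_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype=tf.int32, name="attention_mask")
hidden_states = bert(input_ids, attention_mask=attention_mask)[0]  # last hidden states
x = tf.keras.layers.GlobalMaxPooling1D()(hidden_states)
x = tf.keras.layers.Dense(128, activation="relu")(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])

# Train with the held-out validation split.
model.fit([ids_tr, mask_tr], y_tr,
          validation_data=([ids_va, mask_va], y_va),
          batch_size=16, epochs=2)
```

A small learning rate (here 1e-5) is typical when fine-tuning a pre-trained transformer, since larger steps can quickly degrade the pre-trained weights; the exact pooling layer, head size, and training schedule used in the video may differ.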

Syllabus

How-to Build a Transformer for Language Classification in TensorFlow


Taught by

James Briggs

Related Courses

Sequence Models (DeepLearning.AI via Coursera)
Modern Natural Language Processing in Python (Udemy)
Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3 (Stanford University via YouTube)
Long Form Question Answering in Haystack (James Briggs via YouTube)
Spotify's Podcast Search Explained (James Briggs via YouTube)