Multi-Class Language Classification With BERT in TensorFlow

Offered By: James Briggs via YouTube

Tags

BERT, Deep Learning, TensorFlow, Transformers, Data Preprocessing, Model Training

Course Description

Overview

Learn how to build a multi-class language classification model using BERT and TensorFlow in this comprehensive 43-minute tutorial. Explore the power of transformers in natural language processing as you work through each step of the process, from data preprocessing to model training and prediction. Follow along with clearly defined chapters for each section, including data input pipeline creation, model definition, and saving/loading techniques. Gain insights into the significance of transformers in deep learning and their dominance in NLP benchmarks. Utilize the HuggingFace transformers library to create an efficient and high-performing solution for multi-class text classification tasks.
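The preprocessing and input-pipeline steps described above can be sketched roughly as follows. This is a minimal illustration, not the tutorial's exact code: the five-class sentiment setup, the `bert-base-cased` checkpoint, and the sequence length of 512 are assumptions, and `build_dataset` expects plain lists of texts and integer labels.

```python
def one_hot(label, num_classes):
    """Encode an integer class label as a one-hot float vector."""
    vec = [0.0] * num_classes
    vec[label] = 1.0
    return vec


def build_dataset(texts, labels, num_classes=5, seq_len=512, batch_size=16):
    """Tokenize texts with a BERT tokenizer and wrap the result in a
    tf.data pipeline. Heavy dependencies are imported locally because
    the tokenizer downloads pretrained vocab files on first use."""
    import tensorflow as tf
    from transformers import BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
    tokens = tokenizer(
        list(texts),
        max_length=seq_len,
        truncation=True,
        padding="max_length",
        return_tensors="tf",
    )
    y = tf.constant([one_hot(lbl, num_classes) for lbl in labels])
    ds = tf.data.Dataset.from_tensor_slices(
        ({"input_ids": tokens["input_ids"],
          "attention_mask": tokens["attention_mask"]}, y)
    )
    # Shuffle, then batch, so each epoch sees a different ordering.
    return ds.shuffle(10_000).batch(batch_size)
```

Feeding the model a dict keyed by `input_ids` and `attention_mask` lets Keras match each tensor to the correspondingly named `Input` layer when the model is defined later.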

Syllabus

Intro
Pulling Data
Preprocessing
Data Input Pipeline
Defining Model
Model Training
Saving and Loading Models
Making Predictions
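The model-definition, training, saving/loading, and prediction steps in the syllabus can be sketched as below. This is an assumed architecture (pretrained BERT encoder plus a dense softmax head), not necessarily the tutorial's exact layer sizes or hyperparameters; the `bert-base-cased` checkpoint, the 1024-unit hidden layer, and the learning rate are illustrative choices.

```python
def build_model(num_classes=5, seq_len=512):
    """Pretrained BERT encoder with a softmax classification head.
    Imports are local because loading the checkpoint downloads weights."""
    import tensorflow as tf
    from transformers import TFAutoModel

    bert = TFAutoModel.from_pretrained("bert-base-cased")
    input_ids = tf.keras.layers.Input(
        shape=(seq_len,), dtype="int32", name="input_ids")
    mask = tf.keras.layers.Input(
        shape=(seq_len,), dtype="int32", name="attention_mask")
    # Index 1 of the encoder output is the pooled [CLS] representation.
    embeddings = bert.bert(input_ids, attention_mask=mask)[1]
    x = tf.keras.layers.Dense(1024, activation="relu")(embeddings)
    outputs = tf.keras.layers.Dense(
        num_classes, activation="softmax", name="outputs")(x)
    model = tf.keras.Model(inputs=[input_ids, mask], outputs=outputs)
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model


def predicted_class(probs):
    """Map a softmax probability vector to its class index."""
    return max(range(len(probs)), key=lambda i: probs[i])


# Typical usage (not run here; training needs the dataset and, realistically,
# a GPU). The "sentiment_model" path is a hypothetical example:
#   model = build_model()
#   model.fit(train_ds, validation_data=val_ds, epochs=3)
#   model.save("sentiment_model")  # writes a SavedModel directory
#   model = tf.keras.models.load_model("sentiment_model")
#   probs = model.predict(batch)[0]
#   label = predicted_class(probs)
```

Saving with `model.save` and restoring with `tf.keras.models.load_model` keeps the architecture, weights, and optimizer state together, so the reloaded model can predict (or resume training) without rebuilding the graph.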


Taught by

James Briggs

Related Courses

Sentiment Analysis with Deep Learning using BERT
Coursera Project Network via Coursera
Natural Language Processing with Attention Models
DeepLearning.AI via Coursera
Fine Tune BERT for Text Classification with TensorFlow
Coursera Project Network via Coursera
Deploy a BERT question answering bot on Django
Coursera Project Network via Coursera
Generating discrete sequences: language and music
Ural Federal University via edX