Getting Started with AI Powered Q&A Using Hugging Face Transformers - HuggingFace Tutorial

Offered By: Chris Hay via YouTube

Tags

Natural Language Processing (NLP) Courses, Artificial Intelligence Courses, TensorFlow Courses, Jupyter Notebooks Courses, BERT Courses, Transfer Learning Courses, Hugging Face Transformers Courses

Course Description

Overview

Explore AI-powered Q&A using Hugging Face Transformers in this comprehensive tutorial video. Learn how to leverage pre-trained AI models for your own solutions and data. Discover Hugging Face's AI Model Hub and test Q&A functionality with custom content. Gain insights into Transfer Learning and BERT, a state-of-the-art Natural Language Processing model. Understand BERT's pre-training process using Wikipedia and BookCorpus, and its fine-tuning on the Stanford SQuAD 2.0 dataset for question-answering capabilities. Follow along as Python code is implemented in Jupyter Notebooks hosted on Google Colab, demonstrating the ease of integrating pre-trained AI models into your projects. Topics covered include an introduction to Hugging Face, using BERT models, transfer learning concepts, BERT architecture, dataset exploration, and practical coding examples using HuggingFace Pipelines and TensorFlow.
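To give a flavor of what the tutorial covers, here is a minimal sketch of question answering with a Hugging Face pipeline. The model name is an assumption for illustration, not necessarily the checkpoint used in the video; any BERT-style model fine-tuned on SQuAD 2.0 from the Model Hub would work the same way.

```python
from transformers import pipeline

# Build a question-answering pipeline. "deepset/bert-base-cased-squad2" is
# an assumed example checkpoint (BERT fine-tuned on SQuAD 2.0); substitute
# any Q&A model from the Hugging Face Model Hub.
qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")

# Custom content to ask questions about.
context = (
    "BERT was pre-trained on Wikipedia and BookCorpus, then fine-tuned "
    "on the Stanford SQuAD 2.0 dataset for question answering."
)

# The pipeline returns a dict with the extracted answer span and a score.
result = qa(question="What dataset was BERT fine-tuned on?", context=context)
print(result["answer"], result["score"])
```

This is exactly the transfer-learning idea the overview describes: the heavy pre-training and fine-tuning are already done, and you only supply your own context and questions.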

Syllabus

- Intro
- Hugging Face Model Hub
- Using a BERT model on HuggingFace
- Introduction to Transfer Learning
- Understanding BERT
- Datasets used to pre-train BERT: Wikipedia and BookCorpus
- Fine-tuning BERT for Q&A with SQuAD 2.0
- Coding our model with HuggingFace Pipelines using Google Colab
- Coding our model with TensorFlow using Google Colab
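For the TensorFlow-based step in the syllabus, the standard pattern is to load a tokenizer and a `TFAutoModelForQuestionAnswering`, then decode the answer span from the start/end logits yourself. The checkpoint name below is an assumption for illustration; the video may use a different one.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering

# Assumed example checkpoint (BERT fine-tuned on SQuAD 2.0); add
# from_pt=True if the checkpoint ships only PyTorch weights.
model_name = "deepset/bert-base-cased-squad2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = TFAutoModelForQuestionAnswering.from_pretrained(model_name)

question = "What dataset was BERT fine-tuned on?"
context = (
    "BERT was pre-trained on Wikipedia and BookCorpus, then fine-tuned "
    "on the Stanford SQuAD 2.0 dataset for question answering."
)

# Tokenize question and context together as one BERT input pair.
inputs = tokenizer(question, context, return_tensors="tf")
outputs = model(inputs)

# The model emits per-token logits for the answer's start and end;
# take the argmax of each and decode the token span between them.
start = int(tf.argmax(outputs.start_logits, axis=1)[0])
end = int(tf.argmax(outputs.end_logits, axis=1)[0])
answer = tokenizer.decode(inputs["input_ids"][0, start : end + 1])
print(answer)
```

Compared with the pipeline version, this exposes what the pipeline hides: tokenization, the forward pass, and span decoding, which is useful if you want to fine-tune or customize the model further.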


Taught by

Chris Hay

Related Courses

Creative Applications of Deep Learning with TensorFlow
Kadenze
Creative Applications of Deep Learning with TensorFlow III
Kadenze
Creative Applications of Deep Learning with TensorFlow II
Kadenze
6.S191: Introduction to Deep Learning
Massachusetts Institute of Technology via Independent
Learn TensorFlow and deep learning, without a Ph.D.
Google via Independent