Getting Started with AI-Powered Q&A Using Hugging Face Transformers - HuggingFace Tutorial
Offered By: Chris Hay via YouTube
Course Description
Overview
Explore AI-powered Q&A using Hugging Face Transformers in this comprehensive tutorial video. Learn how to leverage pre-trained AI models for your own solutions and data. Discover Hugging Face's AI Model Hub and test Q&A functionality with custom content. Gain insights into Transfer Learning and BERT, a state-of-the-art Natural Language Processing model. Understand BERT's pre-training process using Wikipedia and BookCorpus, and its fine-tuning on the Stanford SQuAD 2.0 dataset for question-answering capabilities. Follow along as Python code is implemented in Jupyter Notebooks hosted on Google Colab, demonstrating the ease of integrating pre-trained AI models into your projects. Topics covered include an introduction to Hugging Face, using BERT models, transfer learning concepts, BERT architecture, dataset exploration, and practical coding examples using HuggingFace Pipelines and TensorFlow.
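The Q&A workflow described above can be sketched in a few lines with the `transformers` library. This is a minimal illustration, not the tutorial's exact code: the model name (`distilbert-base-cased-distilled-squad`, a BERT-family model fine-tuned on SQuAD) and the example context are assumptions for demonstration.

```python
from transformers import pipeline

# Load a question-answering pipeline backed by a BERT-family model
# fine-tuned on SQuAD (model choice is an assumption for this sketch).
qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

# Supply your own custom content as the context.
context = (
    "Hugging Face is a company based in New York City that maintains "
    "the Transformers library and the AI Model Hub."
)

result = qa(question="Where is Hugging Face based?", context=context)
print(result["answer"], result["score"])
```

The pipeline returns a dict with the extracted answer span, a confidence score, and the start/end character offsets into the context.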
Syllabus
- Intro
- Hugging Face Model Hub
- Using a BERT model on HuggingFace
- Introduction to Transfer Learning
- Understanding BERT
- Datasets used to pre-train BERT: Wikipedia and BookCorpus
- Fine-tuning BERT for Q&A with SQuAD 2.0
- Coding our model with HuggingFace Pipelines using Google Colab
- Coding our model with TensorFlow using Google Colab
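For the TensorFlow approach in the last syllabus item, the general pattern is to run the model directly and decode the answer span from the start/end logits yourself. A minimal sketch, assuming the same SQuAD-fine-tuned checkpoint as above (the actual model and text used in the video may differ):

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering

# Assumed checkpoint: a BERT-family model fine-tuned on SQuAD.
model_name = "distilbert-base-cased-distilled-squad"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = TFAutoModelForQuestionAnswering.from_pretrained(model_name)

question = "What dataset was BERT fine-tuned on?"
context = (
    "BERT was fine-tuned on the Stanford SQuAD 2.0 dataset "
    "to give it question-answering capabilities."
)

# Tokenize the question/context pair into TensorFlow tensors.
inputs = tokenizer(question, context, return_tensors="tf")
outputs = model(inputs)

# The model scores every token as a possible answer start and end;
# take the highest-scoring positions and decode that span.
start = int(tf.argmax(outputs.start_logits, axis=-1)[0])
end = int(tf.argmax(outputs.end_logits, axis=-1)[0])
answer = tokenizer.decode(inputs["input_ids"][0, start : end + 1])
print(answer)
```

This is the same extractive-Q&A logic the pipeline wraps for you, made explicit so you can see where the start/end logits come from.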
Taught by
Chris Hay
Related Courses
- Introduction to Data Science in Python (University of Michigan via Coursera)
- Julia Scientific Programming (University of Cape Town via Coursera)
- Python for Data Science (University of California, San Diego via edX)
- Probability and Statistics in Data Science using Python (University of California, San Diego via edX)
- Introduction to Python: Fundamentals (Microsoft via edX)