TAPAS - Weakly Supervised Table Parsing via Pre-training
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a comprehensive explanation of the TAPAS model, which tackles table parsing and question answering without relying on logical forms. Dive into the approach that extends BERT's architecture to encode tables as input and trains with weak supervision. Learn how TAPAS outperforms traditional semantic parsing models, improving state-of-the-art accuracy on several datasets. Discover the model's ability to select table cells and apply aggregation operators to answer complex questions about tabular data. Gain insights into the input encoding and loss-engineering techniques that enable TAPAS to handle diverse tables and compute answers not explicitly present in the data. Understand the benefits of transfer learning in this simplified model architecture and its potential applications in natural language processing and information retrieval.
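The core idea described above, predicting a set of table cells plus an aggregation operator, can be illustrated with a minimal sketch. This is not the actual TAPAS implementation (which predicts cells and operators with a BERT-based model); the function name and operator labels below are illustrative assumptions, showing how aggregation lets the model return answers that never appear verbatim in the table.

```python
def answer(selected_cells, operator):
    """Toy sketch of the TAPAS output step: combine the selected cells
    either directly (NONE) or via an aggregation operator."""
    if operator == "NONE":      # answer is the selected cells themselves
        return selected_cells
    if operator == "COUNT":     # number of selected cells
        return len(selected_cells)
    values = [float(c) for c in selected_cells]  # numeric cells assumed
    if operator == "SUM":
        return sum(values)
    if operator == "AVERAGE":
        return sum(values) / len(values)
    raise ValueError(f"unknown operator: {operator}")

# e.g. "How many medals in total?" -> select the medal column, predict SUM
print(answer(["2", "1", "3"], "SUM"))  # prints 6.0
```

In the real model, the cell-selection probabilities and the choice of operator are both produced by output heads on top of the BERT encoder and are trained end-to-end from question/answer pairs alone, without gold logical forms.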
Syllabus
TAPAS: Weakly Supervised Table Parsing via Pre-training (Paper Explained)
Taught by
Yannic Kilcher
Related Courses
Sentiment Analysis with Deep Learning using BERT (Coursera Project Network via Coursera)
Natural Language Processing with Attention Models (DeepLearning.AI via Coursera)
Fine Tune BERT for Text Classification with TensorFlow (Coursera Project Network via Coursera)
Deploy a BERT question answering bot on Django (Coursera Project Network via Coursera)
Generating discrete sequences: language and music (Ural Federal University via edX)