NLP Research Group Colloquium - Advances in Natural Language Processing

Offered By: Paul G. Allen School via YouTube

Tags

Machine Learning Courses, Deep Learning Courses, Text Analysis Courses, Computational Linguistics Courses, Sequence Modeling Courses

Course Description

Overview

Explore cutting-edge research in Natural Language Processing through a series of graduate student presentations from the UW Paul G. Allen School of Computer Science & Engineering. The colloquium covers four topics:

Julian Michael on representing meaning with question-answer pairs
Antoine Bosselut on commonsense transformers for automatic knowledge graph construction
Lucy Lin on analyzing religiosity and public policy in Congress with scalable NLP methods
Sachin Mehta on efficient machine learning techniques for visual and textual data

Gain insights into innovative NLP techniques, including semantic representation, commonsense reasoning, large-scale text analysis, and efficient deep learning models for sequence modeling. Recorded on November 7, 2019, this 51-minute colloquium offers a comprehensive overview of current NLP research trends and their applications across domains.

Syllabus

UW Allen School Colloquium: NLP Research Group


Taught by

Paul G. Allen School

Related Courses

Applied Deep Learning: Build a Chatbot - Theory, Application
Udemy
Can Wikipedia Help Offline Reinforcement Learning? - Paper Explained
Yannic Kilcher via YouTube
Infinite Memory Transformer - Research Paper Explained
Yannic Kilcher via YouTube
Recurrent Neural Networks and Transformers
Alexander Amini via YouTube
MIT 6.S191 - Recurrent Neural Networks
Alexander Amini via YouTube