
End-to-End Deep Learning for Broad Coverage Semantics - Semantic Role Labeling, Coreference Resolution, and Beyond

Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube

Tags

Machine Learning Courses, Deep Learning Courses, Supervised Learning Courses, Neural Networks Courses, Crowdsourcing Courses

Course Description

Overview

Explore cutting-edge research in computational semantics in this conference talk by Luke Zettlemoyer. Delve into the application of end-to-end deep learning techniques to semantic role labeling and coreference resolution, two classic challenges in the field. Learn how simple deep neural network approaches that require no preprocessing achieve significant performance gains, with over 20% relative error reductions compared to non-neural methods. Discover ongoing efforts to crowdsource large datasets for training these models, potentially enabling high-quality semantic analysis across a variety of domains. Gain insights from Zettlemoyer's work with collaborators Luheng He, Kenton Lee, and Mike Lewis, and understand the potential impact of combining advanced deep learning techniques with extensive training data on the future of computational semantics.

Syllabus

End-to-End Deep Learning for Broad Coverage Semantics: SRL, Coreference, and Beyond - Luke Zettlemoyer


Taught by

Center for Language & Speech Processing (CLSP), JHU

Related Courses

Machine Learning
University of Washington via Coursera
Machine Learning
Stanford University via Coursera
Machine Learning
Georgia Institute of Technology via Udacity
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity