End-to-End Deep Learning for Broad Coverage Semantics - Semantic Role Labeling, Coreference Resolution, and Beyond
Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube
Course Description
Overview
Explore cutting-edge research in computational semantics through this conference talk by Luke Zettlemoyer. Delve into the application of end-to-end deep learning techniques for semantic role labeling and coreference resolution, two classic challenges in the field. Learn about the significant performance gains achieved using simple deep neural network approaches that require no preprocessing, resulting in over 20% relative error reductions compared to non-neural methods. Discover ongoing efforts to crowdsource large datasets for training these models, potentially enabling high-quality semantic analysis across various domains. Gain insights from Zettlemoyer's work with collaborators Luheng He, Kenton Lee, and Mike Lewis, and understand the potential impact of combining advanced deep learning techniques with extensive training data on the future of computational semantics.
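The talk's central point is that a single sequence model over raw tokens, with no syntactic preprocessing, can learn to predict semantic role structure directly. Below is a minimal illustrative sketch of that end-to-end idea, assuming PyTorch; the class name, dimensions, and BIO tag set are hypothetical and this is not the specific architecture presented in the talk.

```python
# A minimal sketch (not the talk's exact model): an end-to-end BIO-style
# semantic role tagger that maps raw tokens plus a predicate indicator
# directly to argument labels, with no syntactic preprocessing.
# Vocabulary size, dimensions, and tag count here are illustrative.
import torch
import torch.nn as nn

class BiLSTMSRLTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, word_dim=100, hidden_dim=256, layers=2):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # An extra embedded feature marks whether each token is the target predicate.
        self.pred_emb = nn.Embedding(2, 16)
        self.encoder = nn.LSTM(word_dim + 16, hidden_dim, num_layers=layers,
                               bidirectional=True, batch_first=True)
        self.scorer = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids, predicate_mask):
        # token_ids, predicate_mask: (batch, seq_len)
        x = torch.cat([self.word_emb(token_ids), self.pred_emb(predicate_mask)], dim=-1)
        hidden, _ = self.encoder(x)
        return self.scorer(hidden)  # (batch, seq_len, num_tags) BIO-tag scores

# Toy usage: one 5-token sentence whose second token is the predicate.
model = BiLSTMSRLTagger(vocab_size=10000, num_tags=9)
tokens = torch.randint(0, 10000, (1, 5))
pred_mask = torch.tensor([[0, 1, 0, 0, 0]])
tags = model(tokens, pred_mask).argmax(dim=-1)  # predicted role tag per token
print(tags.shape)  # torch.Size([1, 5])
```

The same encode-then-score pattern extends to coreference by scoring candidate spans and antecedent pairs instead of per-token tags, which is why the talk treats both tasks under one end-to-end framework.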
Syllabus
End-to-End Deep Learning for Broad Coverage Semantics: SRL, Coreference and Beyond - Luke Zettlemoyer
Taught by
Center for Language & Speech Processing (CLSP), JHU
Related Courses
Network Analysis in Systems Biology - Icahn School of Medicine at Mount Sinai via Coursera
TechniCity - Ohio State University via Coursera
Engaging Citizens: A Game Changer for Development? - The World Bank Online Learning Campus - World Bank Group via Coursera
Smart Cities - The Open University via FutureLearn
Social Computing - University of California, San Diego via Coursera