YoVDO

Towards Large Language Models as Proposal Functions in a Neuro-Symbolic Expert System - 2022

Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube

Tags

Prolog Courses
Transformer Models Courses

Course Description

Overview

Explore a cutting-edge reasoning system that constructs structured text-based proofs of science facts grounded in an expert-verified external factbase. Delve into the NELLIE inference engine, which combines neural language modeling, guided generation, and semiparametric dense retrieval to replace handcrafted rules in a Prolog-style system. Discover how NELLIE dynamically instantiates interpretable inference rules to capture and score entailment decompositions over natural language statements. Gain insights into the system's motivation and search procedure, with a focus on how Transformer-based sequence models infuse semantics and structure from the factbase into the dynamic rule generation process, optimizing proof tree search. Based on research presented in a 2022 paper, this 51-minute talk from the Center for Language & Speech Processing at JHU offers a deep dive into the intersection of large language models and neuro-symbolic expert systems.
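The described search procedure, a Prolog-style backward chainer whose rules are proposed dynamically rather than handcrafted, can be illustrated with a minimal sketch. All names, facts, and the `PROPOSALS` table below are hypothetical stand-ins: in NELLIE the decompositions would come from a neural language model and facts would be checked by dense retrieval against the expert-verified factbase, not by exact string match.

```python
# Hypothetical toy factbase standing in for the expert-verified external factbase.
FACTBASE = {
    "metals conduct electricity",
    "copper is a metal",
}

# Stand-in for the neural proposal function: maps a goal statement to
# candidate two-premise entailment decompositions (an LM would generate
# and a model would score these in the real system).
PROPOSALS = {
    "copper conducts electricity": [
        ("copper is a metal", "metals conduct electricity"),
    ],
}

def prove(goal, depth=2):
    """Return a proof tree (goal, subproofs) for `goal`, or None."""
    if goal in FACTBASE:                    # base case: goal is a retrieved fact
        return (goal, [])
    if depth == 0:                          # bound the proof tree search
        return None
    for p1, p2 in PROPOSALS.get(goal, []):  # dynamically instantiated rule
        t1, t2 = prove(p1, depth - 1), prove(p2, depth - 1)
        if t1 and t2:                       # both premises proved: goal entailed
            return (goal, [t1, t2])
    return None
```

The key design point the talk highlights is that the `PROPOSALS` lookup is replaced by guided generation from a Transformer sequence model, so the branching of the proof tree is shaped by semantics drawn from the factbase rather than by a fixed rule set.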

Syllabus

Towards Large Language Models as Proposal Functions in a Neuro-Symbolic Expert System - 2022


Taught by

Center for Language & Speech Processing (CLSP), JHU

Related Courses

Sequence Models
DeepLearning.AI via Coursera
Modern Natural Language Processing in Python
Udemy
Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3
Stanford University via YouTube
Long Form Question Answering in Haystack
James Briggs via YouTube
Spotify's Podcast Search Explained
James Briggs via YouTube