Neurosymbolic AI: Combining Large Language Models with Symbolic Methods
Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube
Course Description
Overview
Explore the intersection of neural networks and symbolic AI methods in this 59-minute lecture by Dr. Lara J. Martin from the Center for Language & Speech Processing at JHU. Delve into neurosymbolic approaches to story generation and understanding, with the ultimate goal of creating AI capable of playing Dungeons & Dragons. Learn about the limitations of large language models like ChatGPT and discover how combining neural networks with symbolic methods from early AI research can lead to more robust artificial intelligence. Gain insight into applications for accessible communication and how large language models can enhance such tools. Follow Dr. Martin's journey through various AI applications, including automated story generation, augmentative and alternative communication (AAC) tools, and AI for tabletop roleplaying games.
Syllabus
Introduction
What is GPT
Story with GPT
Storytelling
Generalized Sentences
Chain of Thought Prompting
What is Dungeons and Dragons
Challenges in Dungeons and Dragons
The Venture Zone
RNNs vs Large Language Models
What is AAC
Who uses AAC
Themes
Trans Text-to-Speech
Conclusion
Taught by
Center for Language & Speech Processing (CLSP), JHU
Related Courses
ChatGPT et IA : mode d'emploi pour managers et RH (ChatGPT and AI: a guide for managers and HR) (CNAM via France Université Numérique)
Generating New Recipes using GPT-2 (Coursera Project Network via Coursera)
Deep Learning NLP: Training GPT-2 from scratch (Coursera Project Network via Coursera)
Data Science A-Z: Hands-On Exercises & ChatGPT Prize [2024] (Udemy)
Deep Learning A-Z 2024: Neural Networks, AI & ChatGPT Prize (Udemy)