Effective Sequential Monte Carlo for Language Model Probabilistic Programs
Offered By: ACM SIGPLAN via YouTube
Course Description
Overview
Explore the potential of Sequential Monte Carlo (SMC) for efficient inference in language model probabilistic programs in this conference talk from ACM SIGPLAN's LAFI'24. Delve into the LLaMPPL library, which supports rapid exploration of SMC algorithms for language modeling tasks and automates an efficient implementation, including auto-batching of large language model calls. Gain insight into key design challenges in SMC, focusing on the design of intermediate targets and proposal distributions. Examine three example models that outperform state-of-the-art language models and constrained generation techniques on a range of tasks. Learn from Alexander K. Lew, Tan Zhi-Xuan, Gabriel Grand, Jacob Andreas, and Vikash K. Mansinghka as they discuss how probabilistic programming can be combined with large language models to encode complex distributions that go beyond traditional prompting.
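As a rough illustration of the ideas covered in the talk, the sketch below implements SMC steering of a toy next-token model under a hard constraint, in plain Python. It is not the LLaMPPL API: the names (toy_next_token_probs, constraint_potential, smc_steering), the toy distributions, and the example constraint are assumptions made for illustration only. Particles extend their token sequences step by step, an intermediate-target potential reweights them, and resampling concentrates computation on particles that satisfy the constraint.

```
import math
import random

# Toy stand-in for an LLM's next-token distribution over a tiny vocabulary.
# Everything here is illustrative; it is not the LLaMPPL API.
def toy_next_token_probs(prefix):
    """Next-token distribution given the current token prefix."""
    if len(prefix) >= 5:
        return {"<eos>": 1.0}
    return {"a": 0.4, "b": 0.3, "c": 0.2, "<eos>": 0.1}

def constraint_potential(prefix):
    """Intermediate-target potential: forbid prefixes containing 'c'.
    Returning 0.0 enforces a hard constraint; values in (0, 1) would soften it."""
    return 0.0 if "c" in prefix else 1.0

def smc_steering(num_particles=20, max_steps=6, ess_threshold=0.5):
    particles = [[] for _ in range(num_particles)]   # partial token sequences
    log_weights = [0.0] * num_particles              # importance weights (log scale)

    for _ in range(max_steps):
        # Extend each unfinished particle by one token and reweight it.
        for i, prefix in enumerate(particles):
            if prefix and prefix[-1] == "<eos>":
                continue
            probs = toy_next_token_probs(prefix)
            token = random.choices(list(probs), weights=list(probs.values()))[0]
            prefix.append(token)
            # The proposal is the model itself, so for a live particle the
            # incremental weight is just the potential of the extended prefix.
            pot = constraint_potential(prefix)
            log_weights[i] += math.log(pot) if pot > 0.0 else float("-inf")

        # Resample when the effective sample size (ESS) drops too low.
        max_lw = max(log_weights)
        if max_lw == float("-inf"):
            break  # every particle violates the constraint
        w = [math.exp(lw - max_lw) for lw in log_weights]
        ess = sum(w) ** 2 / sum(x * x for x in w)
        if ess < ess_threshold * num_particles:
            idx = random.choices(range(num_particles), weights=w, k=num_particles)
            particles = [list(particles[j]) for j in idx]
            log_weights = [0.0] * num_particles

    return particles, log_weights

if __name__ == "__main__":
    samples, _ = smc_steering()
    print(["".join(t for t in s if t != "<eos>") for s in samples[:5]])
```

In a realistic setting the proposal and intermediate targets would themselves query the language model, and batching those model calls across particles, which the talk notes LLaMPPL automates, is what makes the loop efficient.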
Syllabus
[LAFI'24] Effective Sequential Monte Carlo for Language Model Probabilistic Programs
Taught by
ACM SIGPLAN
Related Courses
Microsoft Bot Framework and Conversation as a Platform (Microsoft via edX)
Unlocking the Power of OpenAI for Startups - Microsoft for Startups (Microsoft via YouTube)
Improving Customer Experiences with Speech to Text and Text to Speech (Microsoft via YouTube)
Stanford Seminar - Deep Learning in Speech Recognition (Stanford University via YouTube)
Select Topics in Python: Natural Language Processing (Codio via Coursera)