In-Context Learning: How to Stop Worrying and Love Applied Information Retrieval - Lecture 1
Offered By: Association for Computing Machinery (ACM) via YouTube
Course Description
Overview
Explore a thought-provoking conference talk from SIGIR 2024 that delves into the concept of "In-Context Learning" and its implications for Applied Information Retrieval. Presented by authors Debasis Ganguly, Manish Chandra, and Andrew Parry, this 13-minute session examines the intersection of Large Language Models (LLMs) and search technologies. Gain insights into how in-context learning is reshaping the field of information retrieval and why embracing these advancements can lead to innovative solutions in search applications. Learn about the potential benefits and challenges of integrating LLMs with traditional search methods, and how this fusion is transforming the landscape of applied information retrieval.
Syllabus
SIGIR 2024 M1.1 [pp] "In-Context Learning" How to stop worrying & love Applied Information Retrieval
Taught by
Association for Computing Machinery (ACM)
Related Courses
CMU Advanced NLP: How to Use Pre-Trained Models - Graham Neubig via YouTube
Stanford Seminar 2022 - Transformer Circuits, Induction Heads, In-Context Learning - Stanford University via YouTube
Pretraining Task Diversity and the Emergence of Non-Bayesian In-Context Learning for Regression - Simons Institute via YouTube
In-Context Learning: A Case Study of Simple Function Classes - Simons Institute via YouTube
AI Mastery: Ultimate Crash Course in Prompt Engineering for Large Language Models - Data Science Dojo via YouTube