
Sequential Recommendation with Latent Relations Based on Large Language Models - M1.6

Offered By: Association for Computing Machinery (ACM) via YouTube

Tags

Recommendation Systems Courses, Machine Learning Courses, Information Retrieval Courses

Course Description

Overview

Explore a conference talk on sequential recommendation systems that build on large language models. Delve into the approach presented by researchers Shenghao Yang, Weizhi Ma, Peijie Sun, Qingyao Ai, Yiqun Liu, Mingchen Cai, and Min Zhang, and learn how they leverage latent relations derived from large language models to enhance sequential recommendation algorithms. Gain insights into the intersection of recommender systems and natural language processing, and discover potential applications for improving user experiences across digital platforms. This 14-minute presentation, part of the RecSys and LLMs session at SIGIR 2024, offers a concise overview of this research direction in information retrieval and recommendation systems.

Syllabus

SIGIR 2024 M1.6 [fp] Sequential Recommendation with Latent Relations based on Large Language Model


Taught by

Association for Computing Machinery (ACM)

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent