HOT: Higher-Order Dynamic Graph Representation Learning with Efficient Transformers

Offered By: Scalable Parallel Computing Lab, SPCL @ ETH Zurich via YouTube

Tags

Graph Theory Courses, Machine Learning Courses, Neural Networks Courses, Transformers Courses, Attention Mechanisms Courses, Dynamic Graphs Courses

Course Description

Overview

Explore dynamic graph representation learning with efficient transformers in this conference talk from the Second Learning on Graphs Conference (LoG'23). Dive into the HOT model, which enhances link prediction by leveraging higher-order graph structures. Discover how k-hop neighbors and subgraphs are encoded into the attention matrix of transformers to improve accuracy. Learn about the challenges of increased memory pressure and the innovative solutions using hierarchical attention matrices. Examine the model's architecture, including encoding higher-order structures, patching, alignment, concatenation, and the block recurrent transformer. Compare HOT's performance against other dynamic graph representation learning schemes and understand its potential applications in various dynamic graph learning workloads.
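To make the core idea concrete, here is a minimal sketch of how k-hop neighborhood structure can be encoded as an additive bias on a transformer attention matrix. This is an illustrative reconstruction, not the authors' code: the function names, the per-hop bias scheme, and the plain NumPy attention are all assumptions chosen for clarity.

```python
import numpy as np

def khop_attention_bias(adj: np.ndarray, k: int, bias_per_hop: float = 1.0) -> np.ndarray:
    """Build an [n, n] attention bias from graph structure (illustrative).

    Nodes first reached at hop h (h <= k) get bias_per_hop * (k - h + 1),
    so closer higher-order neighbors receive a larger bias; unreachable
    pairs (within k hops) get 0. The weighting scheme is an assumption.
    """
    n = adj.shape[0]
    reached = np.eye(n, dtype=bool)      # nodes already assigned a hop distance
    frontier = np.eye(n, dtype=bool)     # nodes reachable in the current number of hops
    bias = np.zeros((n, n))
    for hop in range(1, k + 1):
        frontier = (frontier @ adj) > 0          # expand reachability by one hop
        newly = frontier & ~reached              # first reached at exactly `hop`
        bias[newly] = bias_per_hop * (k - hop + 1)
        reached |= newly
    return bias

def attention_with_bias(q: np.ndarray, k_mat: np.ndarray,
                        v: np.ndarray, bias: np.ndarray) -> np.ndarray:
    """Scaled dot-product attention with the structural bias added to the scores."""
    scores = q @ k_mat.T / np.sqrt(q.shape[-1]) + bias
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

On a path graph 0-1-2-3 with k=2, node 0's direct neighbor 1 gets the largest bias, the 2-hop neighbor 2 a smaller one, and node 3 none — so attention is steered toward the higher-order neighborhood while remaining a standard softmax.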

Syllabus

Introduction: Link Prediction
Introduction: Higher-Order Graph Structures
Higher-Order Enhanced Pipeline
Temporal Higher-Order Structures
Formal Setting of Dynamic Link Prediction
Model Architecture: Encoding Higher-Order Structures
Model Architecture: Patching, Alignment and Concatenation
Model Architecture: Block Recurrent Transformer
Evaluation


Taught by

Scalable Parallel Computing Lab, SPCL @ ETH Zurich

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent