Understanding Oversmoothing in Graph Neural Networks (GNNs) - Insights from Two Theoretical Studies
Offered By: Google TechTalks via YouTube
Course Description
Overview
Explore the phenomenon of oversmoothing in Graph Neural Networks (GNNs) through two theoretical studies presented in this Google TechTalk by Xinyi Wu. Delve into the mechanisms behind oversmoothing, including the adverse mixing effect and the beneficial denoising effect of graph convolutions, and their impact on node representations. Examine the analysis of oversmoothing on random graphs drawn from the Contextual Stochastic Block Model (CSBM), and discover the critical depth at which oversmoothing sets in. Investigate how Personalized PageRank (PPR) propagation and initial residual connections mitigate oversmoothing. Learn about the study of oversmoothing in attention-based GNNs, such as Graph Attention Networks (GATs) and transformers, and understand why the graph attention mechanism cannot prevent oversmoothing. Gain insight into the exponential loss of expressive power in these models and the technical framework used to analyze asymmetric, state-dependent, and time-varying aggregation operators with various nonlinear activation functions.
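To make the two phenomena discussed above concrete, here is a minimal NumPy sketch contrasting plain mean aggregation, whose node representations collapse as depth grows, with PPR-style propagation using an initial residual connection (as in APPNP-like updates). The toy two-clique graph, the teleport value alpha, and the spread helper are illustrative choices for this sketch, not the CSBM setting or parameters from the talk.

import numpy as np

rng = np.random.default_rng(0)
n = 8

# Toy graph: two 4-node cliques joined by a single edge
# (hypothetical example, not the CSBM graphs analyzed in the talk).
A = np.zeros((n, n))
for block in (range(0, 4), range(4, 8)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[3, 4] = A[4, 3] = 1.0

# Row-normalized adjacency with self-loops: mean aggregation over neighbors.
A_hat = A + np.eye(n)
P = A_hat / A_hat.sum(axis=1, keepdims=True)

X0 = rng.normal(size=(n, 16))  # random initial node features

def spread(H):
    """Mean pairwise distance between node representations."""
    return np.mean([np.linalg.norm(H[i] - H[j])
                    for i in range(n) for j in range(i + 1, n)])

H = X0.copy()       # plain propagation: H_k = P^k X0
H_ppr = X0.copy()   # PPR-style propagation with initial residual
alpha = 0.1         # teleport probability (illustrative value)

for k in range(1, 33):
    # Plain mean aggregation: rows of P^k converge to the random walk's
    # stationary distribution, so all node embeddings collapse to the
    # same vector -- oversmoothing.
    H = P @ H
    # PPR-style update: H_{k+1} = (1 - alpha) * P @ H_k + alpha * X0.
    # The initial residual keeps node-specific information in the limit.
    H_ppr = (1 - alpha) * P @ H_ppr + alpha * X0
    if k in (1, 2, 4, 8, 16, 32):
        print(f"depth {k:2d}: plain spread = {spread(H):.4f}, "
              f"PPR spread = {spread(H_ppr):.4f}")

Running this, the plain spread shrinks toward zero with depth, while the PPR variant converges to the fixed point alpha * (I - (1 - alpha) * P)^{-1} @ X0, which remains bounded away from a constant embedding, matching the mitigation role of PPR and initial residual connections described in the overview.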
Syllabus
Understanding Oversmoothing in Graph Neural Networks (GNNs): Insights from Two Theoretical Studies
Taught by
Google TechTalks
Related Courses
Introduction to Dynamical Systems and Chaos - Santa Fe Institute via Complexity Explorer
Nonlinear Dynamics 1: Geometry of Chaos - Georgia Institute of Technology via Independent
Linear Differential Equations - Boston University via edX
Algorithmic Information Dynamics: From Networks to Cells - Santa Fe Institute via Complexity Explorer
Nonlinear Differential Equations: Order and Chaos - Boston University via edX