Understanding Oversmoothing in Graph Neural Networks (GNNs) - Insights from Two Theoretical Studies

Offered By: Google TechTalks via YouTube

Tags

Machine Learning, Theoretical Computer Science, Dynamical Systems, Attention Mechanisms

Course Description

Overview

Explore the phenomenon of oversmoothing in Graph Neural Networks (GNNs) through two theoretical studies presented in this Google TechTalk by Xinyi Wu. The first study analyzes oversmoothing in random graphs: delve into the mechanisms behind it, including the adverse mixing effect and the beneficial denoising effect, and their impact on node representations. Examine the analysis under the Contextual Stochastic Block Model (CSBM), discover the critical depth at which oversmoothing occurs, and investigate how Personalized PageRank (PPR) and initial residual connections mitigate it.

The second study covers oversmoothing in attention-based GNNs, such as Graph Attention Networks (GATs) and transformers, and explains why the graph attention mechanism cannot prevent oversmoothing. Gain insights into the exponential loss of expressive power in these models and into the technical framework used to analyze asymmetric, state-dependent, and time-varying aggregation operators with various nonlinear activation functions.
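To make the phenomenon concrete, here is a minimal, self-contained sketch (not from the talk): repeated GCN-style neighborhood averaging collapses node representations onto a single consensus direction, while an APPNP-style propagation with a Personalized PageRank initial residual keeps them apart. The random graph, feature dimension, and teleport weight alpha are illustrative choices, not the CSBM setting or parameters analyzed in the studies.

```python
import numpy as np

# Illustrative demo values only -- not the talk's CSBM analysis.
rng = np.random.default_rng(0)

# Random undirected graph on n nodes, self-loops added.
n, p = 50, 0.1
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T + np.eye(n)

# GCN-style symmetrically normalized adjacency D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))

# The dominant eigenvector of A_hat is proportional to sqrt(d); plain
# propagation collapses features onto this "consensus" direction.
v = np.sqrt(d)
v /= np.linalg.norm(v)

def dist_to_consensus(X):
    """Distance of node features from the consensus subspace;
    this shrinking toward 0 is a standard formalization of oversmoothing."""
    return np.linalg.norm(X - np.outer(v, v @ X))

X0 = rng.standard_normal((n, 4))   # initial node features
X_plain, X_ppr = X0.copy(), X0.copy()
alpha = 0.1                        # teleport / initial-residual weight

for depth in range(1, 33):
    X_plain = A_hat @ X_plain                           # plain GCN step
    X_ppr = (1 - alpha) * (A_hat @ X_ppr) + alpha * X0  # PPR residual step
    if depth in (1, 2, 4, 8, 16, 32):
        print(f"depth {depth:2d}: plain={dist_to_consensus(X_plain):8.4f}  "
              f"ppr={dist_to_consensus(X_ppr):8.4f}")
```

On a connected graph the plain distance decays geometrically with depth, mirroring the exponential convergence of representations discussed in the talk, while the PPR variant settles at a fixed point that still depends on the initial features, which is the intuition behind using initial residual connections to mitigate oversmoothing.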

Syllabus

Understanding Oversmoothing in Graph Neural Networks (GNNs): Insights from Two Theoretical Studies


Taught by

Google TechTalks

Related Courses

Automata Theory
Stanford University via edX
Intro to Theoretical Computer Science
Udacity
Computing: Art, Magic, Science
ETH Zurich via edX
理论计算机科学基础 | Introduction to Theoretical Computer Science
Peking University via edX
Quantitative Formal Modeling and Worst-Case Performance Analysis
EIT Digital via Coursera