Understanding Oversmoothing in Graph Neural Networks (GNNs) - Insights from Two Theoretical Studies

Offered By: Google TechTalks via YouTube

Tags

Machine Learning, Theoretical Computer Science, Dynamical Systems, Attention Mechanisms

Course Description

Overview

Explore the phenomenon of oversmoothing in Graph Neural Networks (GNNs) through two theoretical studies presented in this Google TechTalk by Xinyi Wu. Delve into the mechanisms behind oversmoothing, including adverse mixing and beneficial denoising effects, and their impact on node representations. Examine the analysis of oversmoothing in random graphs under the Contextual Stochastic Block Model (CSBM), and discover the critical depth at which oversmoothing sets in. Investigate how Personalized PageRank (PPR) and initial residual connections help mitigate oversmoothing.

Then turn to the study of oversmoothing in attention-based GNNs, such as Graph Attention Networks (GATs) and transformers, and understand why the graph attention mechanism cannot prevent oversmoothing. Gain insights into the exponential loss of expressive power in these models and the technical framework used to analyze asymmetric, state-dependent, and time-varying aggregation operators with various nonlinear activation functions.
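To make the propagation dynamics concrete, here is a minimal NumPy sketch. It is not taken from the talk: the random graph, the features, the mean-aggregation propagation rule, and the teleport value alpha are all illustrative assumptions. It contrasts plain repeated aggregation, where node features collapse toward a common vector, with a PPR-style initial residual connection of the kind the description mentions.

import numpy as np

# Toy sketch (illustrative, not from the talk): repeated neighborhood
# averaging drives all node features toward a common vector, i.e.
# oversmoothing, while a PPR-style initial residual connection
# keeps node representations separated.

rng = np.random.default_rng(0)

n, d, p = 20, 4, 0.3                     # nodes, feature dim, edge prob.
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T + np.eye(n)                  # undirected graph with self-loops
A_rw = A / A.sum(axis=1, keepdims=True)  # row-normalized (mean) aggregation

X0 = rng.standard_normal((n, d))         # random initial node features

def spread(X):
    # Mean distance of node features from their average row:
    # a simple oversmoothing measure (0 = all nodes identical).
    return np.linalg.norm(X - X.mean(axis=0), axis=1).mean()

alpha = 0.1                              # PPR teleport probability (assumed)
X_plain, X_ppr = X0.copy(), X0.copy()
for layer in range(1, 51):
    X_plain = A_rw @ X_plain                            # vanilla propagation
    X_ppr = (1 - alpha) * (A_rw @ X_ppr) + alpha * X0   # PPR residual
    if layer in (1, 10, 50):
        print(f"layer {layer:2d}: plain {spread(X_plain):.4f}  "
              f"ppr {spread(X_ppr):.4f}")

Running this, the plain spread decays toward zero as depth grows, while the PPR variant levels off at a positive value, mirroring the mitigation effect described above.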

Syllabus

Understanding Oversmoothing in Graph Neural Networks (GNNs): Insights from Two Theoretical Studies


Taught by

Google TechTalks

Related Courses

Deep Learning for Natural Language Processing
University of Oxford via Independent
Sequence Models
DeepLearning.AI via Coursera
Deep Learning Part 1 (IITM)
Indian Institute of Technology Madras via Swayam
Deep Learning - Part 1
Indian Institute of Technology, Ropar via Swayam
Deep Learning - IIT Ropar
Indian Institute of Technology, Ropar via Swayam