Neural Sheaf Diffusion: Graphs X Topology
Neural Sheaf Diffusion generalises the graph Laplacian and achieves state-of-the-art performance on heterophilic datasets!
It uses a concept called sheaves, which are defined over topological spaces; the graph is mapped onto such a topological space by attaching a vector space to each node and edge.
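In a nutshell, the sheaf Laplacian replaces each scalar entry of the ordinary graph Laplacian with a small matrix block built from restriction maps (roughly, in the paper's notation, where v ⊴ e means node v is incident to edge e):
L_F[v,v] = \sum_{v \trianglelefteq e} F_{v \trianglelefteq e}^\top F_{v \trianglelefteq e}, \qquad L_F[u,v] = -F_{u \trianglelefteq e}^\top F_{v \trianglelefteq e} \quad \text{for } e = \{u, v\}
When every restriction map is the identity, this reduces to the standard graph Laplacian.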
Key insights:
1) Restriction maps - they map node features into an edge embedding space
2) More flexibility in the graph Laplacian - it allows negative contributions from nodes of a different class, preventing the mixing of "bad" information (see the toy sketch after this list)
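Below is a minimal toy sketch (my own illustration, not the paper's implementation) of assembling a sheaf Laplacian from restriction maps and running one diffusion step. The restriction maps F are random here purely for illustration; Neural Sheaf Diffusion instead learns them from the node features.

import numpy as np

# Toy triangle graph with d-dimensional stalks on every node and edge.
d = 2
n = 3
edges = [(0, 1), (1, 2), (0, 2)]

rng = np.random.default_rng(0)
# One restriction map per (node, edge) incidence, mapping node stalk -> edge stalk.
# Random for illustration only; in the paper these are learnt from node features.
F = {(v, e): rng.standard_normal((d, d)) for e in edges for v in e}

# Assemble the sheaf Laplacian block by block (each block is d x d).
L = np.zeros((n * d, n * d))
for e in edges:
    u, v = e
    Fu, Fv = F[(u, e)], F[(v, e)]
    L[u*d:(u+1)*d, u*d:(u+1)*d] += Fu.T @ Fu   # diagonal block for u
    L[v*d:(v+1)*d, v*d:(v+1)*d] += Fv.T @ Fv   # diagonal block for v
    L[u*d:(u+1)*d, v*d:(v+1)*d] -= Fu.T @ Fv   # off-diagonal blocks: can flip signs,
    L[v*d:(v+1)*d, u*d:(u+1)*d] -= Fv.T @ Fu   # so a neighbour can contribute negatively

# One sheaf diffusion step on the stacked node features: x <- x - alpha * L x
x = rng.standard_normal((n * d, 1))
alpha = 0.1
x = x - alpha * (L @ x)

If every F were the identity matrix, L would reduce to the ordinary graph Laplacian, which can only average neighbours; learning the restriction maps is what lets nodes of different classes push each other's features apart instead of mixing them.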
Background:
Graph Neural Networks: • Graph Neural Networks (GNNs)
Useful Links:
AI Epiphany's explanation: • Neural Sheaf Diffusion: A Topological...
Cristian Bodnar's explanation: • AMMI 2022 Course "Geometric Deep Lear...
Neural Sheaf Diffusion Paper: https://arxiv.org/abs/2202.04579
Generalized Graph Convolutional Network (GGCN) Paper - this one has almost the same performance as Neural Sheaf Diffusion: https://arxiv.org/abs/2102.06462
Discord: https://discord.gg/fXCZCPYs
LinkedIn: https://www.linkedin.com/in/chong-min-tan-94652288/
Online AI blog: https://delvingintotech.wordpress.com/
Twitter: https://twitter.com/johntanchongmin
Try out my games here: https://simmer.io/@chongmin
0:00 Introduction
1:21 Node classification on graphs
2:37 GNN Background
6:59 GNNs as Heat Diffusion
11:03 Heterophily and Oversmoothing
13:00 Graphs x Topology
16:18 Key Insight: Neural Sheaf Diffusion construction
23:25 Sheaf Laplacian
36:23 Key Insight: Importance of negative signals from non-similar nodes
44:14 Contrastive Learning
46:24 Link to attention and contrastive learning
50:07 Node classification by sheaf diffusion
52:00 How to learn the right sheaf?
59:50 Results
1:03:45 Discussion