Geometric Deep Learning (Part 2)
A summary, with personal insights, of the Geometric Deep Learning proto-book (https://geometricdeeplearning.com/) by Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković
See Part 1 at:
https://www.youtube.com/watch?v=kQMbnPpoVBE
Slides can be found on:
https://github.com/tanchongmin/TensorFlow-Implementations
0:00 Recap on Graph Neural Networks
3:12 Mathematical Definition of a GNN
6:13 Mathematical Definition of 1-hop neighbourhood and features
8:22 Permutation Matrix
9:52 Permutation Invariance vs Equivariance
11:43 Generic GNN Equation
19:30 Elaboration of Permutation Equivariance
23:03 GNN Overall Architecture
27:07 GNN Architecture Explained
28:15 Three flavours of GNNs
36:10 Further elaboration of GNN feature updating process
42:00 GNNs are not very computationally efficient on current hardware
44:13 Transformers are a form of Graph Attentional Networks
48:38 Sequence Info as Positional Embedding in Transformers
51:40 Reinforcement Learning (RL) is very data hungry
52:41 Exploiting equivariance/invariance in data
55:00 Homomorphisms in Markov Decision Processes
55:25 Exploiting equivariance/invariance in data allows learning from fewer interactions
56:26 Summary and Insights!
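
The permutation-equivariance chapters above (Permutation Matrix, Permutation Invariance vs Equivariance, Generic GNN Equation) can be sketched numerically. Below is a minimal NumPy check, not taken from the video, assuming a single convolutional-flavour GNN layer of the form h = ReLU(A X W): permuting the graph's nodes permutes the layer's output features in exactly the same way, i.e. layer(P A Pᵀ, P X) = P · layer(A, X).

```python
import numpy as np

# Illustrative sketch (assumed layer form, not the video's exact equations):
# one convolutional-flavour message-passing step h = ReLU(A @ X @ W).

rng = np.random.default_rng(0)

def gnn_layer(A, X, W):
    """Aggregate neighbour features via A, transform with shared weights W."""
    return np.maximum(A @ X @ W, 0.0)  # elementwise ReLU

n, d_in, d_out = 5, 3, 4
A = rng.integers(0, 2, size=(n, n)).astype(float)  # random adjacency matrix
A = np.maximum(A, A.T)                             # symmetrize: undirected graph
X = rng.standard_normal((n, d_in))                 # node feature matrix
W = rng.standard_normal((d_in, d_out))             # weights shared across nodes

P = np.eye(n)[rng.permutation(n)]                  # random permutation matrix

# Permutation equivariance: relabelling nodes first, or applying the layer
# first and relabelling after, gives the same result.
permute_then_layer = gnn_layer(P @ A @ P.T, P @ X, W)
layer_then_permute = P @ gnn_layer(A, X, W)

print("Equivariant:", np.allclose(permute_then_layer, layer_then_permute))
```

This holds because P only reorders rows and the ReLU acts elementwise, so it commutes with the permutation; a permutation-invariant readout (e.g. summing node features) would instead give identical, unpermuted outputs.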