graph convolutional network

[[concept]]
Graph Convolutional Networks

Introduced by Kipf and Welling (2017), a graph convolutional network layer is given by

$$(x_\ell)_i = \sigma\!\left[\sum_{j \in \mathcal{N}(i)} \frac{(x_{\ell-1})_j}{|\mathcal{N}(i)|}\, H\right]$$

Note that the embeddings $(x_\ell)_i$ are row vectors, so the weight matrix $H$ acts on the right. We can think of each layer as a "degree-normalized aggregation"
^def
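A minimal NumPy sketch of one such layer, assuming a dense adjacency matrix and row-vector embeddings; the function name `gcn_layer`, the ReLU choice of $\sigma$, and the isolated-node guard are illustrative assumptions, not from the original:

```python
import numpy as np

def gcn_layer(X, A, H):
    """One degree-normalized aggregation step (sketch).

    X: (n, d) node embeddings as row vectors
    A: (n, n) adjacency matrix (dense, unweighted)
    H: (d, d') shared weight matrix
    """
    deg = A.sum(axis=1, keepdims=True)   # |N(i)| for each node
    deg = np.maximum(deg, 1.0)           # guard against isolated nodes (assumption)
    agg = (A @ X) / deg                  # mean of neighbor embeddings
    return np.maximum(agg @ H, 0.0)      # shared weights, ReLU as the nonlinearity sigma

# tiny usage example on a 3-node path graph
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.random.randn(3, 4)
H = np.random.randn(4, 2)
X_next = gcn_layer(X, A, H)              # shape (3, 2)
```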

Note

Advantages of GCNs:

  • One local diffusion step per layer
  • Simple and interpretable
  • Neighborhood-size normalization helps prevent the vanishing/exploding gradient problem, since each layer performs only one diffusion step

Disadvantages

  • no edge weights
  • only supports the plain adjacency matrix as the diffusion operator ($S = A$)
  • No self-loops unless they are present in $A$ (the embedding of node $i$ at layer $\ell$ is not informed by its own embedding at layer $\ell-1$). Even if self-loops are in the graph, there is no ability to weight $(x_{\ell-1})_i$ and $(x_{\ell-1})_j,\ j \in \mathcal{N}(i)$ differently (see the sketch below).
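One way to address the last point, sketched here as a worked equation, is to give the node's own embedding a separate weight matrix (a GraphSAGE-style variant, not part of the plain GCN layer above); the symbols $W_{\text{self}}$ and $W_{\text{nbr}}$ are illustrative:

$$(x_\ell)_i = \sigma\!\left[(x_{\ell-1})_i\, W_{\text{self}} + \frac{1}{|\mathcal{N}(i)|}\sum_{j \in \mathcal{N}(i)} (x_{\ell-1})_j\, W_{\text{nbr}}\right]$$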

see also GNN

Mentions

GCN layers can be written as graph convolutions
graph SAGE
2025-02-17 graphs lecture 8
2025-02-24 graphs lecture 10
2025-04-16 lecture 21
2025-02-25 equivariant lecture 4