node-level task

Data
Node-Level tasks

Node-level tasks (sometimes called transductive or semi-supervised learning) have the graph $G$ as the data support (i.e., it is fixed). Each node is treated as a sample: we assume the signal and observation at node $i$ satisfy $(x_i, y_i) \sim p(x, y)$.

We assume we only observe $y_i$ for a subset of the nodes $J \subset V$ and want to estimate $y_j$ for $j \in V \setminus J$.

Example

Consider the contextual SBM: $G = (V, E)$ undirected and $y \in \{-1, 1\}^n$ (say $y_i$ represents the community of node $i$). The edges are random:
$$P(A_{ij} = 1) = P\big((i,j) \in E\big) = \begin{cases} \frac{a}{n}, & y_i = y_j \\ \frac{b}{n}, & \text{otherwise,} \end{cases}$$
with node features/covariates
$$x_i = \sqrt{\tfrac{\mu}{n}}\, y_i\, u + z_i, \qquad u \sim N(0, 1),\ z \sim N(0, I_n).$$

If $y_i = 1$ then $x_i \sim N\!\big(0, \tfrac{\mu}{n}(1)^2 + 1\big)$ and if $y_i = -1$ then $x_i \sim N\!\big(0, \tfrac{\mu}{n}(-1)^2 + 1\big)$, so the marginal distribution of $x_i$ alone does not distinguish the two communities.

2025-02-03_graph-4.png
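A minimal NumPy sketch of sampling from this contextual SBM. The function name, the scalar-feature simplification, and the parameter values are illustrative assumptions, not the lecture's code:

```python
import numpy as np

def sample_csbm(n, a, b, mu, rng=None):
    """Sample a contextual SBM: labels y in {-1,+1}^n, adjacency A, and
    scalar features x_i = sqrt(mu/n) * y_i * u + z_i (assumed scaling)."""
    rng = np.random.default_rng(rng)
    y = rng.choice([-1, 1], size=n)                 # community labels
    # edge probability a/n within a community, b/n across communities
    P = np.where(np.equal.outer(y, y), a / n, b / n)
    A = np.triu((rng.random((n, n)) < P).astype(float), k=1)
    A = A + A.T                                     # undirected, no self-loops
    u = rng.standard_normal()                       # shared latent "spike"
    z = rng.standard_normal(n)                      # per-node noise
    x = np.sqrt(mu / n) * y * u + z                 # node features
    return A, x, y

A, x, y = sample_csbm(n=1000, a=20.0, b=5.0, mu=4.0, rng=0)
```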

Goal: predict $y_i$ for $i \in V \setminus J$ from $\{x_i\}_{i \in V}$.
Hypothesis class: the graph convolutions $\mathcal{F} = \left\{\, z = \sum_{k=0}^{K-1} h_k S^k x \;:\; h_k \in \mathbb{R} \,\right\}$, where $S$ is a graph shift operator (e.g., the adjacency matrix).
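Continuing from the sampler sketch above, a member of this hypothesis class is just a polynomial graph filter. Taking $S$ to be the adjacency matrix is an assumption here; any shift operator works:

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply z = sum_{k=0}^{K-1} h_k S^k x for filter taps h = (h_0, ..., h_{K-1})."""
    z = np.zeros_like(x, dtype=float)
    Skx = np.asarray(x, dtype=float)   # S^0 x
    for hk in h:
        z += hk * Skx
        Skx = S @ Skx                  # advance to the next power of S
    return z

# e.g. a 3-tap filter applied to the CSBM features sampled above
z = graph_filter(A, x, h=[0.0, 1.0, 0.5])
```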

Problem: Define a mask $M_f \in \{0,1\}^{|J| \times n}$ that selects the observed nodes, so $M_f \mathbf{1}_n = \mathbf{1}_{|J|}$ and $\mathbf{1}_{|J|}^\top M_f \le \mathbf{1}_n^\top$ (each row has exactly one 1 and each column at most one). Then we minimize:

$$\min_{h_0, \dots, h_{K-1}}\ \ell\!\left( M_f y,\ M_f \sum_{k=0}^{K-1} h_k S^k x \right),$$

where $\ell$ is a loss (e.g., squared loss) over the $|J|$ observed nodes.
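With a squared loss this masked objective is ordinary least squares in the taps $h_k$. A sketch under that assumption, continuing from the sketches above (the 50/50 choice of observed set $J$ and the adjacency matrix as $S$ are also assumptions):

```python
import numpy as np

def fit_filter_taps(S, x, y, J, K):
    """Least-squares fit of h in R^K minimizing ||M_f y - M_f sum_k h_k S^k x||^2,
    where M_f selects the rows indexed by the observed node set J."""
    n = len(y)
    Phi = np.empty((n, K))             # column k holds S^k x
    Skx = np.asarray(x, dtype=float)
    for k in range(K):
        Phi[:, k] = Skx
        Skx = S @ Skx
    h, *_ = np.linalg.lstsq(Phi[J], y[J].astype(float), rcond=None)
    return h, Phi @ h                  # taps and predictions on all nodes

rng = np.random.default_rng(1)
J = rng.choice(len(y), size=len(y) // 2, replace=False)    # observed nodes
h, z_hat = fit_filter_taps(A, x, y, J, K=3)
test = np.setdiff1d(np.arange(len(y)), J)                  # V \ J
accuracy = np.mean(np.sign(z_hat[test]) == y[test])
```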

Application: infer a node's class/community/identity locally, i.e., without needing global communication, unlike clustering techniques that require eigenvectors (global graph information).

Mentions

readout layer
we can use GNNs to solve feature-aware semi-supervised learning problems
2025-02-03 graphs lecture 4
2025-02-05 graphs lecture 5
2025-02-12 graphs lecture 7
2025-02-19 graphs lecture 9
Improved Image Classification with Manifold Neural Networks