Graph Convolution Network
Spectral Graph Theory · Laplacian Matrix · Random Walk · Symmetric Normalization · Graph Signal · Graph Fourier Transform · Inverse Graph Fourier Transform · Graph Filter · Graph Spectral Filtering · Message Passing · GCN
The Math Behind GNN

The core of a Graph Neural Network (GNN) lies in its message-passing mechanism, which lets nodes iteratively update their representations by aggregating information from their neighbors. A general update rule for a node $v$ at layer $l+1$ can be expressed as:

$$ h_v^{(l+1)} = \sigma \left( W^{(l)} \sum_{u \in \mathcal{N}(v) \cup \{v\}} \frac{h_u^{(l)}}{|\mathcal{N}(v)|} \right) $$

Let's break down this formula:

$h_v^{(l+1)}$: The new feature vector (or embedding) of our target node $v$ at the next layer, $l+1$. This is what we want to compute.
...
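The update rule above can be sketched in NumPy as a single layer: add self-loops so each node keeps its own features, average over neighbors, then apply the weight matrix and a nonlinearity. This is a minimal illustration, not any particular library's API; the function name `gnn_layer` and its arguments are assumptions for this sketch.

```python
import numpy as np

def gnn_layer(H, A, W, activation=np.tanh):
    """One message-passing layer (illustrative sketch).

    H: (N, d_in) node feature matrix, row i = h_i^{(l)}
    A: (N, N) binary adjacency matrix
    W: (d_in, d_out) learnable weight matrix W^{(l)}
    """
    N = A.shape[0]
    A_hat = A + np.eye(N)                     # self-loops: N(v) ∪ {v}
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1)  # |N(v)|, guard isolated nodes
    H_agg = (A_hat @ H) / deg                 # normalized neighborhood sum
    return activation(H_agg @ W)              # σ(W · aggregate)
```

Stacking several such layers lets information propagate multiple hops: after $k$ layers, a node's embedding reflects its $k$-hop neighborhood.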