Introducing the GIN
In the previous section, we saw that the GNNs introduced in the previous chapters are less expressive than the WL test. This is an issue because the ability to distinguish between more graph structures appears to be linked to the quality of the resulting embeddings. In this section, we will translate this theoretical framework into a new GNN architecture – the GIN.
Introduced in 2018 by Xu et al. in a paper called “How Powerful are Graph Neural Networks?” [2], the GIN is designed to be as expressive as the WL test. The authors generalized our observations on aggregation by dividing it into two functions:
- Aggregate: This function, $f$, selects the neighboring nodes that the GNN considers
- Combine: This function, $\phi$, combines the embeddings from the selected nodes to produce the new embedding of the target node
The embedding of node $i$ can then be written as follows:

$$h_i' = \phi\left(h_i, f\left(\{h_j : j \in \mathcal{N}_i\}\right)\right)$$
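To make the aggregate/combine decomposition concrete, here is a minimal sketch in plain Python. The function names `aggregate`, `combine`, and `update_node`, and the choice of a sum aggregator with an additive combine, are illustrative assumptions; real GNN layers implement these steps with learnable weights over tensors.

```python
# Hypothetical sketch of the aggregate/combine decomposition.

def aggregate(neighbor_embeddings):
    # f: here, a simple element-wise sum over the neighbors' embeddings
    return [sum(vals) for vals in zip(*neighbor_embeddings)]

def combine(h_i, aggregated):
    # phi: here, element-wise addition of the node's own embedding
    # and the aggregated neighbor message
    return [a + b for a, b in zip(h_i, aggregated)]

def update_node(h, neighbors, i):
    # New embedding of node i from its current embedding
    # and the embeddings of its neighbors
    return combine(h[i], aggregate([h[j] for j in neighbors[i]]))

# Tiny example: three nodes with 2-dimensional embeddings
h = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
neighbors = {0: [1, 2], 1: [0], 2: [0]}
print(update_node(h, neighbors, 0))  # [2.0, 2.0]
```

Different GNN architectures correspond to different choices of $f$ and $\phi$; swapping the sum for a mean, or the addition for a small neural network, changes the expressiveness of the layer.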
In the case of a GCN, the function aggregates...