Implementing a hierarchical self-attention network
In this section, we will implement a GNN model designed to handle heterogeneous graphs – the hierarchical self-attention network (HAN). This architecture was introduced by Liu et al. in 2021 [5]. HAN uses self-attention at two different levels:
- Node-level attention to understand the importance of neighboring nodes in a given meta-path (as a GAT does in a homogeneous setting).
- Semantic-level attention to learn the importance of each meta-path. This is the main feature of HAN, allowing us to automatically select the best meta-paths for a given task – for example, the meta-path game-user-game might be more relevant than game-dev-game in some tasks, such as predicting the number of players.
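To make the semantic-level idea concrete, here is a minimal NumPy sketch of how meta-path importance can be learned and applied. It assumes we already have one embedding matrix per meta-path (the output of node-level attention); the parameter names `W`, `b`, and `q` are illustrative, not taken from the original text:

```python
import numpy as np

def semantic_attention(Z, W, b, q):
    """Fuse per-meta-path embeddings Z of shape (P, N, d) into one (N, d) matrix.

    Z[p] holds the node embeddings produced by node-level attention on
    meta-path p. W (d, d), b (d,), and q (d,) stand in for the learned
    projection, bias, and semantic attention vector (illustrative names).
    """
    P = Z.shape[0]
    # Score each meta-path: average over nodes of q . tanh(W z + b)
    scores = np.array(
        [np.mean(np.tanh(Z[p] @ W.T + b) @ q) for p in range(P)]
    )
    # Softmax over meta-paths gives the semantic attention weights
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Weighted sum fuses the meta-path-specific embeddings
    return np.tensordot(weights, Z, axes=1), weights

# Toy example: 2 meta-paths (e.g., game-user-game and game-dev-game),
# 4 nodes, 8-dimensional embeddings
rng = np.random.default_rng(0)
Z = rng.normal(size=(2, 4, 8))
W, b, q = rng.normal(size=(8, 8)), np.zeros(8), rng.normal(size=8)
fused, beta = semantic_attention(Z, W, b, q)
print(fused.shape)  # (4, 8): one fused embedding per node
```

The two entries of `beta` sum to 1 and tell us how much each meta-path contributes to the final embeddings, which is exactly the signal that lets HAN rank meta-paths by task relevance.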
In the following section, we will detail the three main components – node-level attention, semantic-level attention, and the prediction module. This architecture is illustrated in Figure 12.5.
...