Abstract:
Graph neural networks (GNNs) have attracted extensive attention in recent years due to their powerful representation capabilities for graph-structured data. Existing GNNs mainly focus on static homogeneous graphs. However, complex real-world systems often contain multiple types of dynamically evolving entities and relationships, which are better modeled as heterogeneous temporal graphs (HTGs). Current HTG representation learning methods mostly follow the semi-supervised learning paradigm, which suffers from expensive supervision and poor generalization. To address these problems, we propose a globally enhanced GNN for HTGs based on contrastive learning. Specifically, we use a heterogeneous hierarchical attention mechanism to generate proximity-preserving node representations based on historical information. Furthermore, contrastive learning is used to maximize the mutual information between temporal local and global graph representations, enriching the global semantic information of node representations. Experimental results show that the proposed self-supervised HTG representation learning method improves the AUC on the link prediction task across multiple real-world datasets by an average of 3.95%.
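To make the local-global contrastive objective mentioned above concrete, the following is a minimal, hypothetical PyTorch sketch of one common way to maximize mutual information between node-level (local) embeddings and a readout (global) summary, in the style of Deep Graph Infomax. It is not the authors' implementation: the encoder, the corruption scheme, and all names here (LocalGlobalContrast, h_corrupt, the mean readout) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalGlobalContrast(nn.Module):
    """Sketch of a local-global contrastive objective (DGI-style), assumed
    rather than taken from the paper: a bilinear discriminator scores each
    node embedding against a global summary, and is trained to separate
    embeddings of the real graph from those of a corrupted view."""

    def __init__(self, dim: int):
        super().__init__()
        # Bilinear scoring between local node embeddings and the global summary.
        self.discriminator = nn.Bilinear(dim, dim, 1)

    def forward(self, h: torch.Tensor, h_corrupt: torch.Tensor) -> torch.Tensor:
        # h:         [N, dim] node embeddings from the (hypothetical) hierarchical
        #            attention encoder at one temporal snapshot
        # h_corrupt: [N, dim] embeddings of a corrupted view (e.g. shuffled rows)
        s = torch.sigmoid(h.mean(dim=0))          # global summary via mean readout
        s = s.expand_as(h)                        # broadcast summary to every node
        pos = self.discriminator(h, s).squeeze(-1)          # real (positive) pairs
        neg = self.discriminator(h_corrupt, s).squeeze(-1)  # corrupted (negative) pairs
        logits = torch.cat([pos, neg])
        labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
        # Binary cross-entropy over real vs. corrupted pairs is a standard
        # surrogate for maximizing local-global mutual information.
        return F.binary_cross_entropy_with_logits(logits, labels)


if __name__ == "__main__":
    # Toy usage: random embeddings stand in for encoder outputs.
    h = torch.randn(32, 64)
    h_corrupt = h[torch.randperm(32)]  # corruption by row shuffling
    loss = LocalGlobalContrast(64)(h, h_corrupt)
    print(loss.item())
```

In an HTG setting, this loss would typically be computed per snapshot and summed over time, so that each snapshot's node representations are pulled toward a summary of that snapshot's global structure.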