    Lin Jingjing, Ye Zhonglin, Zhao Haixing, Li Zhuoran. Survey on Hypergraph Neural Networks[J]. Journal of Computer Research and Development, 2024, 61(2): 362-384. DOI: 10.7544/issn1000-1239.202220483

    Survey on Hypergraph Neural Networks

    In recent years, graph neural networks have achieved remarkable results in application fields such as recommendation systems and natural language processing, aided by large amounts of data and powerful computing resources, but they mainly handle graph data with pairwise relationships. However, in many real-world networks, such as scientific collaboration networks and protein networks, the relationships between objects are more complex than pairwise ones. Forcing such complex relationships into pairwise relations on an ordinary graph leads to a loss of information. A hypergraph is a flexible modeling tool that expresses higher-order relationships a graph cannot fully describe, making up for this shortcoming. In light of this, researchers have begun to study how to build neural networks on hypergraphs and have successively proposed many hypergraph neural network models. Therefore, we survey the existing hypergraph neural network models. Firstly, we comprehensively review the development of hypergraph neural networks over the past three years. Secondly, we propose a new classification scheme based on how hypergraph neural networks are designed, and elaborate on representative models; a minimal sketch of one widely used hypergraph convolution is given below. Then, we introduce the application areas of hypergraph neural networks. Finally, future research directions for hypergraph neural networks are summarized and discussed.
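    To make the notion of higher-order relations concrete, the following is a minimal sketch, not taken from any specific model covered by the survey, of the incidence-matrix view of a hypergraph and the spectral hypergraph convolution popularized by HGNN (Feng et al., AAAI 2019). The toy sizes, the random feature matrix X, and the weight matrix Theta are illustrative assumptions.

```python
import numpy as np

# Toy hypergraph: 4 authors (nodes) and 2 papers (hyperedges).
# Hyperedge e1 = {v0, v1, v2} (one paper with three co-authors),
# hyperedge e2 = {v2, v3}. A pairwise graph would have to split e1
# into three separate edges and lose the "same paper" grouping.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)          # incidence matrix, |V| x |E|

W = np.eye(H.shape[1])                        # hyperedge weights (identity here)
Dv = np.diag(H @ W @ np.ones(H.shape[1]))     # node degree matrix
De = np.diag(H.sum(axis=0))                   # hyperedge degree matrix

X = np.random.rand(4, 8)                      # node features, |V| x d (illustrative)
Theta = np.random.rand(8, 4)                  # learnable projection, d x d' (illustrative)

# One HGNN-style spectral convolution layer:
# X' = sigma( Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta )
Dv_inv_sqrt = np.linalg.inv(np.sqrt(Dv))
X_next = Dv_inv_sqrt @ H @ W @ np.linalg.inv(De) @ H.T @ Dv_inv_sqrt @ X @ Theta
X_next = np.maximum(X_next, 0)                # ReLU activation
print(X_next.shape)                           # (4, 4)
```

    The symmetric normalization by node and hyperedge degrees keeps feature magnitudes stable as information is passed from nodes to hyperedges and back, a design choice that many spectral-style hypergraph neural network variants build on.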
