Lin Jingjing, Ye Zhonglin, Zhao Haixing, Li Zhuoran. Survey on Hypergraph Neural Networks[J]. Journal of Computer Research and Development, 2024, 61(2): 362-384. DOI: 10.7544/issn1000-1239.202220483

Survey on Hypergraph Neural Networks

Funds: This work was supported by the National Key Research and Development Program of China (2020YFC1523300), the Youth Program of Natural Science Foundation of Qinghai Province (2021-ZJ-946Q), and the Middle-Youth Program of Natural Science Foundation of Qinghai Normal University (2020QZR007).
  • Author Bio:

    Lin Jingjing: born in 1986. PhD candidate, lecturer. Her main research interests include graph neural networks and hypergraph neural networks

    Ye Zhonglin: born in 1989. PhD, associate professor, PhD supervisor. Member of CCF. His main research interests include graph neural networks, knowledge extraction, and network representation learning

    Zhao Haixing: born in 1969. PhD, professor, PhD supervisor. Member of CCF. His main research interests include complex networks, graph neural networks, machine translation, hypergraph theory, and network reliability

    Li Zhuoran: born in 1996. Master. His main research interests include data mining and graph neural networks

  • Received Date: June 09, 2022
  • Revised Date: April 17, 2023
  • Available Online: November 13, 2023
  • Abstract: In recent years, graph neural networks have achieved remarkable results in application fields such as recommendation systems and natural language processing, aided by large amounts of data and powerful computing resources, and they mainly deal with graph data that encode pairwise relationships. However, in many real-world networks, such as scientific collaboration networks and protein networks, the relationships between objects are more complex and go beyond pairwise relations. Forcing such complex relationships into the pairwise relations of an ordinary graph leads to a loss of information. A hypergraph is a flexible modeling tool that can express the higher-order relationships a graph cannot fully describe, making up for this shortcoming. In light of this, scholars have begun to study how to develop neural networks on hypergraphs and have successively proposed many hypergraph neural network models. We therefore survey the existing hypergraph neural network models. Firstly, we comprehensively review the development of hypergraph neural networks over the past three years. Secondly, we propose a new taxonomy based on how hypergraph neural networks are designed, and elaborate on representative models. Then, we introduce the application areas of hypergraph neural networks. Finally, future research directions for hypergraph neural networks are summarized and discussed.
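  • To make the higher-order modeling concrete, the following is a minimal numpy sketch (not taken from the paper) of a hypergraph incidence matrix and one spectral hypergraph convolution layer in the style of HGNN (Feng et al., AAAI 2019). The toy hypergraph, the feature dimensions, and the random weight matrix `Theta` are illustrative assumptions, not values from the survey.

```python
import numpy as np

# Toy hypergraph: 4 nodes, 2 hyperedges.
# e0 = {0, 1, 2} (e.g. three co-authors of one paper), e1 = {2, 3}.
# The incidence matrix H has H[v, e] = 1 iff node v belongs to hyperedge e.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)

n_nodes, n_edges = H.shape
W = np.eye(n_edges)               # hyperedge weights (uniform here)
Dv = np.diag(H @ np.diag(W))      # node degree = sum of weights of incident hyperedges
De = np.diag(H.sum(axis=0))       # hyperedge degree = number of nodes in the hyperedge

# One hypergraph convolution layer in the style of HGNN:
#   X' = sigma( Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta )
rng = np.random.default_rng(0)
X = rng.standard_normal((n_nodes, 5))   # node features (illustrative)
Theta = rng.standard_normal((5, 3))     # learnable layer weights (random stand-in)

Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(Dv)))
De_inv = np.diag(1.0 / np.diag(De))
propagated = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt @ X @ Theta
X_out = np.maximum(propagated, 0)       # ReLU activation

print(X_out.shape)
```

  Note how node 2 sits in both hyperedges, so its messages mix information from all four nodes in a single layer; an ordinary graph built from pairwise projections of these hyperedges would blur which triple of nodes actually formed one collaboration.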

  • [1]
    Krizhevsky A, Sutskever I, Hinton G E. ImageNet classification with deep convolutional neural networks[J]. Communications of the ACM, 2012, 60(6): 84−90
    [2]
    Elman J L. Distributed representations, simple recurrent networks, and grammatical structure[J]. Machine Learning, 1991, 7(2): 195−225.
    [3]
    LeCun Y, Bengio Y, Hinton G. Deep learning[J]. Nature, 2015, 521(7553): 436−444 doi: 10.1038/nature14539
    [4]
    Hochreiter S, Schmidhuber J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735−1780 doi: 10.1162/neco.1997.9.8.1735
    [5]
    Chung J, Gülçehre Ç, Cho K, et al. Empirical evaluation of gated recurrent neural networks on sequence modeling[J]. arXiv preprint, arXiv: 1412. 3555, 2014
    [6]
    徐冰冰,岑科廷,黄俊杰,等. 图卷积神经网络综述[J]. 计算机学报,2020,43(5):755−780 doi: 10.11897/SP.J.1016.2020.00755

    Xu Bingbing, Cen Keting, Huang Junjie, et al. A survey on graph convolutional neural network[J]. Chinese Journal of Computers, 2020, 43(5): 755−780(in Chinese) doi: 10.11897/SP.J.1016.2020.00755
    [7]
    Bruna J, Zaremba W, Szlam A, et al. Spectral networks and locally connected networks on graphs[J]. arXiv preprint, arXiv: 1312. 6203, 2014
    [8]
    马帅,刘建伟,左信. 图神经网络综述[J]. 计算机研究与发展,2022,59(1):47−80 doi: 10.7544/issn1000-1239.20201055

    Ma Shuai, Liu Jianwei, Zuo Xin. Survey on graph neural network[J]. Journal of Computer Research and Development, 2022, 59(1): 47−80(in Chinese) doi: 10.7544/issn1000-1239.20201055
    [9]
    李涵,严明玉,吕征阳,等. 图神经网络加速结构综述[J]. 计算机研究与发展,2021,58(6):1204−1229 doi: 10.7544/issn1000-1239.2021.20210166

    Li Han, Yan Mingyu, Lü Zhengyang, et al. Survey on graph neural network acceleration architectures[J]. Journal of Computer Research and Development, 2021, 58(6): 1204−1229(in Chinese) doi: 10.7544/issn1000-1239.2021.20210166
    [10]
    Velickovic P, Cucurull G, Casanova A, et al. Graph attention networks[J]. arXiv preprint, arXiv: 1710. 10903, 2017
    [11]
    Ye Zhonglin, Zhao Haixing, Zhu Yu, et al. HSNR: A network representation learning algorithm using hierarchical structure embedding[J]. Chinese Journal of Electronics, 2019, 29(6): 1141−1152
    [12]
    Kipf T N, Welling M. Variational graph auto-encoders[J]. arXiv preprint, arXiv: 1611. 07308, 2016
    [13]
    Zhang Muhan, Chen Yixin. Link prediction based on graph neural networks [C] //Proc of the 32nd Int Conf on Neural Information Processing Systems. Red Hook: Curran Associates Inc, 2018: 5171–5181
    [14]
    Ye Zhonglin, Zhao Haixing, Zhang Kecheng, et al. Tri-party deep network representation learning using inductive matrix completion[J]. Journal of Central South University, 2019, 26(10): 2746−2758 doi: 10.1007/s11771-019-4210-8
    [15]
    Hamaguchi T, Oiwa H, Shimbo M, et al. Knowledge transfer for out-of-knowledge-base entities: A graph neural network approach[J]. arXiv preprint, arXiv: 1706. 05674, 2017
    [16]
    Marcheggiani D, Bastings J, Titov I. Exploiting semantics in neural machine translation with graph convolutional networks[C] //Proc of the 16th Annual Conf of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, PA: ACL, 2018: 486−492
    [17]
    Wu Zonghan, Pan Shirui, Chen Fengwen, et al. A comprehensive survey on graph neural networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2019, 32(1): 4−24
    [18]
    Han Yi, Zhou Bin, Pei Jian, et al. Understanding importance of collaborations in co-authorship networks: A supportiveness analysis approach [C] //Proc of the 9th SIAM Int Conf on Data Mining. Philadelphia, PA: SIAM, 2009: 1112−1123
    [19]
    Feng Yifan, You Haoxuan, Zhang Zizhao, et al. Hypergraph neural networks [C] //Proc of the 33rd AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2019: 3558−3565
    [20]
    Yadati N, Nimishakavi M, Yadav P, et al. HyperGCN: A new method of training graph convolutional networks on hypergraphs [C] // Proc of the 33rd Neural Information Processing Systems. New York: Curran Associates Inc, 2019: 1511−1522
    [21]
    Chen Chaofan, Cheng Zelei, Li Zuotian, et al. Hypergraph attention networks[C] //Proc of the 19th Int Conf on Trust, Security and Privacy in Computing and Communications (TrustCom). Piscataway, NJ: IEEE, 2020: 1560−1565
    [22]
    Nong Liping, Wang Junyi, Lin Jiming, et al. Hypergraph wavelet neural networks for 3D object classification[J]. Neurocomputing, 2021, 463: 580−595 doi: 10.1016/j.neucom.2021.08.006
    [23]
    Wang Jianling, Ding Kaize, Hong Liangjie, et al. Next-item recommendation with sequential hypergraphs [C] //Proc of the 43rd Int ACM SIGIR Conf on Research and Development in Information Retrieval. New York: ACM, 2020: 1101−1110
    [24]
    Chen Xu, Xiong Kun, Zhang Yongfeng, et al. Neural feature-aware recommendation with signed hypergraph convolutional network[J]. ACM Transactions on Information Systems, 2020, 39(1): 1−22
    [25]
    Zhang Ruochi, Zou Yuesong, Ma Jian. Hyper-SAGNN: A self-attention based graph neural network for hypergraphs[J]. arXiv preprint, arXiv: 1911. 02613, 2019
    [26]
    Yadati N, Nitin V, Nimishakavi M, et al. NHP: Neural hypergraph link prediction [C] //Proc of the 29th ACM Int Conf on Information & Knowledge Management. New York: ACM, 2020: 1705−1714
    [27]
    Zhou Jie, Cui Ganqu, Hu Shengding, et al. Graph neural networks: A review of methods and applications[J]. AI Open, 2020, 1: 57−81 doi: 10.1016/j.aiopen.2021.01.001
    [28]
    周飞燕,金林鹏,董军. 卷积神经网络研究综述[J]. 计算机学报,2017,40(6):1229−1251 doi: 10.11897/SP.J.1016.2017.01229

    Zhou Feiyan, Jin Linpeng, Dong Jun. Review of convolutional neural network[J]. Chinese Journal of Computers, 2017, 40(6): 1229−1251(in Chinese) doi: 10.11897/SP.J.1016.2017.01229
    [29]
    Asif N A, Sarker Y, Chakrabortty P K, et al. Graph neural network: A comprehensive review on non-Euclidean space[J]. IEEE Access, 2021, 9: 60588−60606 doi: 10.1109/ACCESS.2021.3071274
    [30]
    Malekzadeh M, Hajibabaee P, Heidari M, et al. Review of graph neural network in text classification [C] //Proc of the 12th Annual Ubiquitous Computing, Electronics & Mobile Communication Conf. Piscataway, NJ: IEEE, 2021: 0084−0091
    [31]
    Wu Shiwen, Zhang Wentao, Sun Fei, et al. Graph neural networks in recommender systems: A survey[J]. arXiv preprint, arXiv: 2011. 02260, 2020
    [32]
    Gao Yue, Zhang Zizhao, Lin Haojie, et al. Hypergraph learning: Methods and practices[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, 44(5): 2548−2566
    [33]
    Li Yikang, Ouyang Wanli, Zhou Bolei, et al. Factorizable net: An efficient subgraph-based framework for scene graph generation[J]. arXiv preprint, arXiv: 1806. 11538, 2018
    [34]
    Yan Sijie, Xiong Yuanjun, Lin Dahua. Spatial temporal graph convolutional networks for skeleton-based action recognition[C] //Proc of the 32nd AAAI Conf on Artificial Intelligence. Palo Alto, CA : AAAI, 2018: 7444–7452
    [35]
    Wang Yue, Sun Yongbin, Liu Ziwei, et al. Dynamic graph cnn for learning on point clouds[J]. ACM Transactions on Graphics, 2019, 38(5): 1−12
    [36]
    Bastings J, Titov I, Aziz W, et al. Graph convolutional encoders for syntax-aware neural machine translation [C]// Proc of the 2017 Conf on Empirical Methods in Natural Language Processing. Stroudsburg, PA : ACL, 2017: 1957−1967
    [37]
    Beck D, Haffari G, Cohn T. Graph-to-sequence learning using gated graph neural networks[C] //Proc of the 56th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2018: 273−283
    [38]
    Yu Ting, Yin Haoteng, Zhu Zhanxing. Spatiotemporal graph convolutional networks: A deep learning framework for traffic forecasting [C] //Proc of the 27th Int Joint Conf on Artificial Intelligence. San Francisco, CA: Morgan Kaufmann, 2018: 3634−3640
    [39]
    Guo Shengnan, Lin Youfang, Feng Ning. Attention based spatial-temporal graph convolutional networks for traffic flow forecasting [C] //Proc of the 33rd AAAI Conf on Artificial Intelligence. Palo Alto, CA : AAAI , 2019: 922−929
    [40]
    Zheng Chuanpan, Fan Xiaoliang, Wang Cheng, et al. GMAN: A graph multi-attention network for traffic prediction [C] //Proc of the 34th AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2020: 1234−1241
    [41]
    Ying Rex, He Ruining, Chen Kaifeng, et al. Graph convolutional neural networks for web-scale recommender systems[C] //Proc of the 24th ACM SIGKDD Int Conf on Knowledge Discovery & Data Mining. New York: ACM, 2018: 974−983
    [42]
    Fan Wenqi, Ma Yao, Li Qing, et al. Graph neural networks for social recommendation[C] //Proc of the 19th World Wide Web Conf. New York: ACM, 2019: 417−426
    [43]
    Wu Qitian, Zhang Hengrui, Gao Xiaofeng, et al. Dual graph attention networks for deep latent representation of multifaceted social effects in recommender systems[C] //Proc of the 19th World Wide Web. New York: ACM, 2019: 2091−2102
    [44]
    Zitnik M, Agrawal M, Leskovec J. Modeling polypharmacy side effects with graph convolutional networks[J]. Bioinformatics, 2018, 34(13): i457−i466 doi: 10.1093/bioinformatics/bty294
    [45]
    Xu Nuo, Wang Pinghui, Chen Long, et al. MR-GNN: Multi- resolution and dual graph neural network for predicting structured entity interactions[C] //Proc of the 28th Int Joint Conf on Artificial Intelligence. San Francisco, CA: Morgan Kaufmann, 2019: 3968−3974
    [46]
    Do K, Tran T, Venkatesh S. Graph transformation policy network for chemical reaction prediction [C] //Proc of the 25th ACM SIGKDD Inter Conf on Knowledge Discovery & Data Mining. New York: ACM, 2019: 750−760
    [47]
    Schlichtkrull M, Kipf T, Bloem P, et al. Modeling relational data with graph convolutional networks [C] //Proc of the European Semantic Web Conf 2018. Berlin: Springer, 2018: 593−607
    [48]
    Chao Shang, Tang Yun, Huang Jing, et al. End-to-end structure-aware convolutional networks for knowledge base completion [C] //Proc of the 33rd AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2019: 3060−3067
    [49]
    Zhang Fanjin, Liu Xinyu, Tang Jie, et al, OAG: Toward linking large-scale heterogeneous entity graphs [C]//Proc of the 25th ACM SIGKDD Int Conf on Knowledge Discovery & Data Mining. New York: ACM, 2019: 2585−2595
    [50]
    Bretto A. Hypergraph Theory [M]. Berlin: Springer, 2013: 1−42
    [51]
    Zhu Junjie, Zhao Xibin, Hu Han, et al. Emotion recognition from physiological signals using multi-hypergraph neural Networks[C] //Proc of the 2019 IEEE Int Conf on Multimedia and Expo. Piscataway, NJ: IEEE, 2019: 610−615
    [52]
    Jiang Jianwen, Wei Yuxuan, Feng Yifan, et al. Dynamic hypergraph neural networks [C] //Proc of the 28th Int Joint Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2019: 2635−2641
    [53]
    Shao Jingzhi, Zhu Junjie, Wei Yuxuan, et al. Emotion recognition by edge-weighted hypergraph neural network [C] //Proc of the 2019 IEEE Int Conf on Image Processing. Piscataway, NJ: IEEE, 2019: 2144−2148
    [54]
    Fu Sichao, Liu Weifeng, Zhou Yicong et al. HpLapGCN: Hypergraph p-Laplacian graph convolutional networks[J]. Neurocomputing, 2019, 362: 166−174 doi: 10.1016/j.neucom.2019.06.068
    [55]
    Wang Kesu, Chen Jing, Liao Shijie, et al. Geographic-semantic- temporal hypergraph convolutional network for traffic flow prediction [C] //Proc of the 25th Intl Conf on Pattern Recognition. Piscataway, NJ: IEEE, 2021: 5444−5450
    [56]
    Bandyopadhyay S, Das K, Murty M N, Line hypergraph convolution network: Applying graph convolution for hypergraphs [J]. arXiv preprint, arXiv: 2002.03392, 2020
    [57]
    Tran L H, Tran L H. Directed hypergraph neural network[J]. arXiv preprint, arXiv: 2008. 03626, 2020
    [58]
    Zhang Yubo, Wang Nan, Chen Yufeng, et al. Hypergraph label propagation network[C] // Proc of the 34th AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2020: 6885−6892
    [59]
    Yang Chaoqi, Wang Ruijie, Yao Shuochao, et al. Hypergraph learning with line expansion[J]. arXiv preprint, arXiv: 2005. 04843, 2020
    [60]
    Kim E S, Kang W Y, On K W, et al. Hypergraph attention networks for multimodal learning [C] // Proc of 2020 IEEE/CVF Conf on Computer Vision and Pattern Recognition. Piscataway, NJ: IEEE, 2020: 14569−14578
    [61]
    Dong Yihe, Sawin W, Bengio Y. HNHN: Hypergraph networks with hyperedge neurons[J]. arXiv preprint, arXiv: 2006. 12278, 2020
    [62]
    Han Jiale, Cheng Bo, Wang Xu. Two-phase hypergraph based reasoning with dynamic relations for multi-hop KBQA [C] //Proc of the 29th Int Joint Conf on Artificial Intelligence. San Francisco, CA: Morgan Kaufmann, 2021: 3615–3621
    [63]
    Te Gusi, Hu Wei, Guo Zongming, et al. Exploring hypergraph representation on face anti-spoofing beyond 2D attacks[C] //Proc of the 2020 IEEE Int Conf on Multimedia and Expo. Piscataway, NJ: IEEE , 2020[2021-02-01].https://ieeexplore.ieee.org/document/9102720/
    [64]
    Liu Shengyuan, Lv Pei, Zhang Yuzhen, et al. Semi-dynamic hypergraph neural network for 3D pose estimation [C] //Proc of the 29th Int Joint Conf on Artificial Intelligence. San Francisco, CA: Morgan Kaufmann, 2020: 782−788
    [65]
    Fatemi B, Taslakian P, Vázquez D, et al. Knowledge hypergraphs: Prediction beyond binary relations [C] //Proc of the 29th Int Joint Conf on Artificial Intelligence. San Francisco, CA: Morgan Kaufmann, 2020: 2191−2197
    [66]
    Yi J, Park J. Hypergraph convolutional recurrent neural network [C] //Proc of the 26th ACM SIGKDD Int Conf on Knowledge Discovery & Data Mining. New York: ACM, 2020: 3366−3376
    [67]
    Ji Shuyi, Feng Yifan, Ji Rongrong, et al. Dual channel hypergraph collaborative filtering [C] //Proc of the 26th ACM SIGKDD Int Conf on Knowledge Discovery & Data Mining. New York: ACM, 2020: 2020−2029
    [68]
    Lostar M, Rekik I. Deep hypergraph U-Net for brain graph embedding and classification[J]. arXiv preprint, arXiv: 2008. 13118, 2020
    [69]
    Banka A, Buzi I, Rekik I. Multi-view brain hyperconnectome autoencoder for brain state classification [C] // Proc of the 3rd Int Workshop of Predictive Intelligence in Medicine. Berlin: Springer, 2020: 101−110
    [70]
    Yadati N. Neural message passing for multi-relational ordered and recursive hypergraphs [C]//Proc of the 34th Int Conf on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc, 2020: 3275–3289
    [71]
    Wu Xiangping, Chen Qingcai, Li Wei, et al. AdaHGNN: Adaptive hypergraph neural networks for multi-Label image classification [C] //Proc of the 28th ACM Inter Conf on Multimedia. New York: ACM, 2020: 284−293
    [72]
    Sun Xiangguo, Yin Hongzhi, Liu Bo, et al. Heterogeneous hypergraph embedding for graph classification [C] //Proc of the 14th ACM Int Conf on Web Search and Data Mining. New York: ACM, 2021: 725−733
    [73]
    Shao Shuai, Xu Rui, Wang Yanjiang, et al. SAHDL: Sparse attention hypergraph regularized dictionary learning[J]. arXiv preprint, arXiv: 2010.12416, 2020
    [74]
    Arya D, Gupta D K, Rudinac S, et al. HyperSAGE: Generalizing inductive representation learning on hypergraphs[J]. arXiv preprint, arXiv: 2010. 04558, 2020
    [75]
    Ding Kaize, Wang Jianling, Li Jundong, et al. Be more with less: Hypergraph attention networks for inductive text classification [C] //Proc of the 2020 Conf on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2020: 4927−4936
    [76]
    Sawhney R, Agarwal S, Wadhwa A, et al. Spatiotemporal hypergraph convolution network for stock movement forecasting [C] //Proc of the 2020 IEEE Int Conf on Data Mining. Piscataway, NJ: IEEE, 2020: 482−491
    [77]
    Tan Yaqi, Ma Zhongchen, Zhan Yongzhao, et al. Hypergraph induced graph convolutional network for multi-label image recognition[C/OL] //Proc of the 2020 Int Conf on Internet of Things and Intelligent Applications. Piscataway, NJ: IEEE, 2020[2021-03-01].https://ieeexplore.ieee.org/document/9312371
    [78]
    Hou Rui, Small M, Forrest A R R. Community detection in a weighted directed hypergraph representation of cell-to-cell communication networks [J/OL]. bioRxiv, 2020[2021-03-01].https://doi.org/10.1101/ 2020.11.16.381566
    [79]
    Bandyopadhyay S, Das K, Murty M N, Hypergraph attention isomorphism network by learning line graph expansion [C] //Proc of the 2020 IEEE Int Conf on Big Data. Piscataway, NJ: IEEE, 2020: 669−678
    [80]
    Garasuie M M, Shabankhah M, Kamandi A. Improving hypergraph attention and hypergraph convolution networks [C] //Proc of the 11th Int Conf on Information and Knowledge Technology. Piscataway, NJ: IEEE, 2020: 67−72
    [81]
    Xia Xin, Yin Hongzhi, Yu Junliang, et al. Self-supervised hypergraph convolutional networks for session-based recommendation [C]// Proc of the 35th AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2021: 4503−4511
    [82]
    Sawhney R, Agarwal S, Wadhwa A, et al. Stock selection via spatiotemporal hypergraph attention network: A learning to rank approach [C] //Proc of the 35th AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2021: 497−504
    [83]
    Yu Junliang, Yin Hongzhi, Li Jundong, et al. Self-supervised multi-channel hypergraph convolutional network for social recommendation [C] //Proc of the 21st World Wide Web. New York: ACM, 2021: 413−424
    [84]
    Hao Xiaoke, Li Jie, Yingchun, et al. Hypergraph neural network for skeleton-based action recognition[J]. IEEE Transactions on Image Processing, 2021, 30: 2263−2275 doi: 10.1109/TIP.2021.3051495
    [85]
    Xue Hansheng, Yang Luwei, Rajan V, et al. Multiplex bipartite network embedding using dual hypergraph convolutional networks [C] //Proc of the 21st World Wide Web. New York: ACM, 2021: 1649−1660
    [86]
    Maleki S, Wall D P, Pingali K. NetVec: A dcalable hypergraph embedding system[J]. arXiv preprint, arXiv: 2103. 09660v1, 2021
    [87]
    Tudisco F, Prokopchik K, Benson A R. A nonlinear diffusion method for semi-supervised learning on hypergraphs[J]. arXiv preprint, arXiv: 2103. 14867, 2021
    [88]
    Wang Jingcheng, Zhang Yong, Wei Yun, et al. Metro passenger flow prediction via dynamic hypergraph convolution networks[J]. IEEE Transactions on Intelligent Transportation Systems, 2021, 22(12): 7891−7903 doi: 10.1109/TITS.2021.3072743
    [89]
    Liu Binghao, Zhao Pengpeng, Zhuang Fuzhen, et al. Knowledge-aware hypergraph neural network for recommender systems[C] //Proc of the 26th Int Conf on Database Systems for Advanced Applications. Berlin: Springer, 2021: 132−147
    [90]
    Wang Jianling, Ding Kaize, Zhu Ziwei, et al. Session-based recommendation with hypergraph attention networks[J]. arXiv preprint, arXiv: 2112. 14266, 2021
    [91]
    Bai Junjie, Gong Biao, Zhao Yining, et al. Multi-scale representation learning on hypergraph for 3D shape retrieval and recognition[J]. IEEE Transactions on Image Processing, 2021, 30: 5327−5338 doi: 10.1109/TIP.2021.3082765
    [92]
    Du Boxin, Yuan Changhe, Robert B, et al. Hypergraph pre-training with graph neural networks[J]. arXiv preprint, arXiv: 2105. 10862, 2021
    [93]
    Huang Jing, Huang Xiaolin, Yang Jie. Residual enhanced multi- hypergraph neural network [C] // Proc of the 2021 IEEE Int Conf on Image Processing. Piscataway, NJ: IEEE, 2021: 3657−3661
    [94]
    Fu Jun, Hou Chengbin, Zhou Wei, et al. Adaptive hypergraph convolutional network for no-reference 360-degree image quality assessment [C] //Proc of the 30th ACM Inte Conf on Multimedia. New York: ACM, 2021: 961–969
    [95]
    Huang Jing, Yang Jie. UniGNN: A unified framework for graph and hypergraph neural networks [C] //Proc of the 30th Int Joint Conf on Artificial Intelligence. San Francisco, CA: Morgan Kaufmann, 2021: 2563−2569
    [96]
    Jo J, Baek Ji, Lee S, et al. Edge representation learning with hypergraphs[J]. arXiv preprint, arXiv: 2106. 15845, 2021
    [97]
    Zhang Jiying, Chen Yuzhao, Xiao Xiong, et al. Learnable hypergraph Laplacian for hypergraph learning [C] //Proc of the 2022 IEEE Int Conf on Acoustics, Speech and Signal Processing. Piscataway, NJ: IEEE, 2022: 4503−4507
    [98]
    Yan Yichao, Qin Jie, Chen Jiaxin, et al. Learning multi-granular hypergraphs for video-based person re-identification [C] //Proc of the 2020 IEEE/CVF Conf on Computer Vision and Pattern Recognition. Piscataway, NJ: IEEE, 2020: 2896−2905
    [99]
    Chien E, Pan Chao, Peng Jianhao, et al. You are AllSet: A multiset function framework for hypergraph neural networks[J]. arXiv preprint, arXiv: 2106. 13264, 2021
    [100]
    Cui Chaoran, Li Xiaojie, Du Juan, et al. Temporal-relational hypergraph tri-attention networks for stock trend prediction[J]. arXiv preprint, arXiv: 2107. 14033, 2021
    [101]
    Pan Junren, Lei Baiying, Shen Yanyan, et al. Characterization multimodal connectivity of brain network by hypergraph GAN for Alzheimer’s disease analysis [C] //Proc of the 4th Chinese Conf on Pattern Recognition and Computer Vision. Berlin: Springer, 2021: 467−478
    [102]
    Zuo Qinkun, Lei Baiying, Shen Yanyan, et al. Multimodal representations learning and adversarial hypergraph fusion for early Alzheimer’s disease prediction [C] //Proc of the 4th Chinese Conf on Pattern Recognition and Computer Vision. Berlin: Springer, 2021: 479− 490
    [103]
    Li Yichao, Chen Hongxu, Sun Xiangguo, et al. Hyperbolic hypergraphs for sequential recommendation [C] //Proc of the 30th ACM Int Conf on Information & Knowledge Management. New York: ACM, 2021: 988−997
    [104]
    Yadati N, Gao T, Asoodeh S, et al. Graph neural networks for soft semi-supervised learning on hypergraphs [C] //Proc of the 25th Pacific-Asia Conf on Knowledge Discovery and Data Mining. Berlin: Springer, 2021: 447−458
    [105]
    Ma Zhongtian, Jiang Zhiguo, Zhang Haopeng. Hyperspectral image classification using spectral-spatial hypergraph convolution neural network[J]. SPIE Remote Sensing, 2021, 11862: 118620I
    [106]
    Wu Longcan, Wang Daling, Song Kaisong, et al. Dual-view hypergraph neural networks for attributed graph learning[J]. Knowledge-Based Systems, 2021, 227: 107185 doi: 10.1016/j.knosys.2021.107185
    [107]
    Zhang Junwei, Gao Min, Yu Junliang, et al. Double-scale self- supervised hypergraph learning for group recommendation[C] //Proc of the 30th ACM Int Conf on Information & Knowledge Management. New York: ACM, 2021: 2557−2567
    [108]
    Vijaikumar M, Hada D V, Shevade S K. HyperTeNet: Hypergraph and transformer-based neural network for personalized list continuation [C] //Proc of the 2021 IEEE Int Conf on Data Mining. Piscataway, NJ: IEEE 2021: 1210−1215
    [109]
    Xia Zhongxiu, Zhang Weiyu, Weng Ziqiang. Social recommendation system based on hypergraph attention network[J]. Computational Intelligence and Neuroscience, 2021, 2021: 7716214
    [110]
    Ding Meirong, Lin Xiaokang, Zeng Biqing, et al. Hypergraph neural networks with attention mechanism for session-based recommendation[J]. Journal of Physics: Conference Series, 2021, 2082(1): 012007 doi: 10.1088/1742-6596/2082/1/012007
    [111]
    Zhu Zirui, Gao Chen, Chen Xu, et al. Inhomogeneous social recommendation with hypergraph convolutional networks[J]. arXiv preprint, arXiv: 2111. 03344, 2021
    [112]
    Luo Xiaoyi, Peng Jiaheng, Liang Jun. Directed hypergraph attention network for traffic forecasting[J]. IET Intelligent Transport Systems, 2022, 16(1): 85−98 doi: 10.1049/itr2.12130
    [113]
    Lin Jingjing, Ye Zhonglin, Zhao Haixing, et al. DeepHGNN: A novel deep hypergraph neural network[J]. Chinese Journal of Electronics, 2022, 31(5): 958−968 doi: 10.1049/cje.2021.00.108
    [114]
    Kipf T N, Welling M. Semi-supervised classification with graph convolutional networks [J]. arXiv preprint, arXiv:1609. 02907, 2017
    [115]
    Bruna J, Zaremba W, Szlam A, et al. Spectral networks and locally connected networks on graphs[J]. arXiv preprint, arXiv: 1312. 6203, 2014
    [116]
    Defferrard M, Bresson X, Vandergheynst P. Convolutional neural networks on graphs with fast localized spectral filtering[C] //Proc of the 30th Int Conf on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc, 2016: 3844−3852
    [117]
    Zhou Dengyong, Huang Jiayuan, Schölkopf B. Learning with hypergraphs: Clustering, classification, and embedding [C] // Proc of the 19th Int Conf on Neural Information Processing Systems. Cambridge, MA: MIT, 2006: 1601–1608
    [118]
    Song Bai, Zhang Feihu, Torr P H S. Hypergraph convolution and hypergraph attention[J]. Pattern Recognition, 2021, 110: 107637 doi: 10.1016/j.patcog.2020.107637
    [119]
    Ma Xueqi, Liu Weifeng, Li Shuying, et al. Hypergraph p-Laplacian regularization for remotely sensed image recognition[J]. IEEE Transactions on Geoscience and Remote Sensing, 2019, 57(3): 1585−1595 doi: 10.1109/TGRS.2018.2867570
    [120]
    Donnat C, Zitnik M, Hallac D, et al. Learning structural node embeddings via diffusion wavelets[J]. arXiv preprint, arXiv: 1710. 10321, 2018
    [121]
    Ioffe S, Szegedy C. Batch normalization: Accelerating deep network training by reducing internal covariate shift [C] //Proc of the 32nd Inte Conf on Machine Learning. New York: ACM, 2015: 448−456
    [122]
    Louis A. Hypergraph Markov operators, eigenvalues and approximation algorithms [C] //Proc of the 47th Annual ACM Symp on Theory of Computing. New York: ACM, 2015: 713−722
    [123]
    Chan T H H, Louis A, Tang Z G, et al. Spectral properties of hypergraph Laplacian and approximation algorithms[J]. Journal of the ACM, 2018, 65(3): 1−48
    [124]
    Chan T H H, Liang Zhibin. Generalizing the hypergraph Laplacian via a diffusion process with mediators [C] // Proc of the 24th Int Computing and Combinatorics Computing and Combinatorics. Berlin: Springer, 2020: 441–453
    [125]
    Xu Keyulu, Hu Weihua, Leskovec J, et al. How powerful are graph neural networks?[J]. arXiv preprint, arXiv: 1810. 00826, 2019
    [126]
    Sun Xiangguo, Yin Hongzhi, Liu Bo, et al. Multi-level hyperedge distillation for social linking prediction on sparsely observed networks[C] //Proc of the 21st World Wide Web Conf. New York: ACM, 2021: 2934−2945
    [127]
    Xia Liqiao, Zheng Pai, Huang Xiao, et al. A novel hypergraph convolution network-based approach for predicting the material removal rate in chemical mechanical planarization[J]. Journal of Intelligent Manufacturing, 2021, 33(8): 2295−2306
    [128]
    Alfke D, Stoll M. Semi-supervised classification on non-sparse graphs using low-rank graph convolutional networks[J]. arXiv preprint, arXiv: 1905. 10224v1, 2019
    [129]
    Chami I, Rex Y, Ré C, et al. Hyperbolic graph convolutional neural networks [J]. arXiv preprint, arXiv : 1910.12933, 2019
    [130]
    Srinivasan B, Zheng Da, Karypis G. Learning over families of sets-hypergraph representation learning for higher order tasks[J]. arXiv preprint, arXiv: 2101. 07773v1, 2021
    [131]
    Chen Ming, Wei Zhewei, Huang Zengfeng, et al. Simple and deep graph convolutional networks[C] //Proc of the 37th Int Conf on Machine Learning. New York: ACM, 2020: 1725−1735
    [132]
    Payne J. Deep hyperedges: A framework for transductive and inductive learning on hypergraphs[J]. arXiv preprint, arXiv: 1910. 02633, 2019
    [133]
    Gilmer J, Schoenholz S S, Riley P F, et al. Neural message passing for quantum chemistry [C] //Proc of the 34th Int Conf on Machine Learning. New York: ACM, 2017: 1263−1272
    [134]
    Hamilton W L, Ying R, Jure L. Inductive representation learning on large graphs[C] //Proc of the 31st Int Conf on Neural Information Processing Systems. Red Hook: Curran Associates Inc, 2017: 1025−1035
    [135]
    Liu Qiao, Zeng Yifu, Mokhosi R, et al. STAMP: Short-term attention/memory priority model for session-based recommendation [C] //Proc of the 24th ACM SIGKDD Int Conf on Knowledge Discovery & Data Mining. New York: ACM, 2018: 1831−1839
    [136]
    Wu Shu, Tang Yuyuan, Zhu Yanqiao, et al. Session-based recommendation with graph neural networks[C] //Proc of the 33rd AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2019: 346–353
    [137]
    He Xiangnan, Deng Kuan, Wang Xiang, et al. LightGCN: Simplifying and powering graph convolution network for recommendation [C] //Proc of the 43rd Int ACM SIGIR Conf on Research and Development in Information Retrieval. New York: ACM, 2020: 639−648
    [138]
    Wadhwa G, Dhall A, Murala S, et al. Hyperrealistic image inpainting with hypergraphs[C] //Proc of the 2021 IEEE Winter Conf on Applications of Computer Vision. Piscataway, NJ: IEEE, 2021: 3911−3920
    [139]
    Dang N T V, Tran L H, Tran L H. Noise-robust classification with hypergraph neural network[J]. arXiv preprint, arXiv: 2102. 01934, 2021
    [140] Zheng Wenbo, Yan Lan, Gou Chao, et al. Two heads are better than one: Hypergraph-enhanced graph reasoning for visual event ratiocination[C] //Proc of the 38th Int Conf on Machine Learning. New York: ACM, 2021: 12747−12760
    [141] Zhao Weizhong, Zhang Jinyong, Yang Jincai, et al. A novel joint biomedical event extraction framework via two-level modeling of documents[J]. Information Sciences, 2021, 550: 27−40 doi: 10.1016/j.ins.2020.10.047
    [142] Vashishth S, Sanyal S, Nitin V, et al. Composition-based multi-relational graph convolutional networks[J]. arXiv preprint, arXiv:1911.03082, 2020
    [143] Yadati N, Dayanidhi R, Vaishnavi S, et al. Knowledge base question answering through recursive hypergraphs[C] //Proc of the 16th Conf of the European Chapter of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2021: 448−454
    [144] Han Jiale, Cheng Bo, Wang Xu. Open domain question answering based on text enhanced knowledge graph with hyperedge infusion[C] //Proc of the 2020 Findings of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2020: 1475−1481
    [145] Miller A, Fisch A, Dodge J, et al. Key-value memory networks for directly reading documents[C] //Proc of the 2016 Conf on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2016: 1400−1409
    [146] Sun Haitian, Dhingra B, Zaheer M, et al. Open domain question answering using early fusion of knowledge bases and text[J]. arXiv preprint, arXiv:1809.00782, 2018
    [147] Dongen S V. Graph clustering by flow simulation[D]. Amsterdam: Center for Math and Computer Science, 2000
    [148] Blondel V D, Guillaume J L, Lambiotte R, et al. Fast unfolding of communities in large networks[J]. Journal of Statistical Mechanics: Theory and Experiment, 2008, 2008(10): P10008 doi: 10.1088/1742-5468/2008/10/P10008
    [149] Madine M M, Rekik I, Werghi N. Diagnosing autism using T1-W MRI with multi-kernel learning and hypergraph neural network[C] //Proc of the 2020 IEEE Int Conf on Image Processing. Piscataway, NJ: IEEE, 2020: 438−442
    [150] Xu Keyulu, Li Chengtao, Tian Yonglong, et al. Representation learning on graphs with jumping knowledge networks[C] //Proc of the 35th Int Conf on Machine Learning. New York: ACM, 2018: 5453−5462
    [151] Rong Yu, Huang Wenbing, Xu Tingyang, et al. DropEdge: Towards deep graph convolutional networks on node classification[J]. arXiv preprint, arXiv:1907.10903, 2020