Citation: Xu Lixiang, Ge Wei, Chen Enhong, Luo Bin. Graph classification method based on graph kernel isomorphism network[J]. Journal of Computer Research and Development, 2024, 61(4): 903−915. DOI: 10.7544/issn1000-1239.202221004
Graph representation learning has become a research hotspot in graph deep learning. Most graph neural networks suffer from over-smoothing, and they focus on node features while paying little attention to the structural features of graphs. To improve the representation of graph structure, we propose a graph classification method based on a graph kernel isomorphism network, named KerGIN. The method first encodes the node features of a graph with a graph isomorphism network (GIN) and then encodes the graph structure with a graph kernel; the Nyström method is further used to reduce the dimension of the graph kernel matrix. The reduced kernel matrix is aligned with the graph feature matrix by an MLP, and an attention mechanism adaptively weights and fuses the feature encoding and the structure encoding to obtain the final representation of the graph, which strengthens the model's ability to express structural information. Finally, the model is evaluated on seven publicly available graph classification datasets: compared with existing graph representation models, KerGIN substantially improves graph classification accuracy and enhances the ability of GIN to represent graph structural features.
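The Nyström step in the pipeline above — compressing an n × n graph kernel matrix into an n × m feature matrix — can be sketched as follows. This is an illustrative NumPy implementation under stated assumptions, not the authors' code: the landmark count `m`, the toy RBF kernel, and the function name `nystrom_features` are chosen here only for demonstration.

```python
import numpy as np

def nystrom_features(K, m, seed=None):
    """Nystrom approximation: compress an n x n kernel matrix K
    into an n x m feature matrix Phi such that K is approximated
    by Phi @ Phi.T (a rank-m factorization)."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # sample m landmark points
    C = K[:, idx]                               # n x m sampled columns
    W = C[idx, :]                               # m x m landmark block
    # Symmetric inverse square root of W via eigendecomposition,
    # with small eigenvalues clipped for numerical stability.
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, 1e-12)
    W_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return C @ W_inv_sqrt                       # n x m feature map

# Toy example: a smooth RBF kernel on random 2-D points is
# effectively low-rank, so few landmarks recover it well.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.1 * sq)

Phi = nystrom_features(K, m=10, seed=0)         # 50 x 10 representation
err = np.linalg.norm(K - Phi @ Phi.T) / np.linalg.norm(K)
```

In the paper's setting the aligned n × m matrix `Phi`, rather than the full kernel matrix, is what gets passed to the MLP for fusion with the GIN node-feature encoding.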
[1] Peng Hao, Li Jianxin, He Yu, et al. Large-scale hierarchical text classification with recursively regularized deep graph-CNN[C] //Proc of the 27th World Wide Web Conf. New York: ACM, 2018: 1063−1072
[2] Noble C, Cook D. Graph-based anomaly detection[C] //Proc of the 9th ACM SIGKDD Int Conf on Knowledge Discovery and Data Mining. New York: ACM, 2003: 631−636
[3] Akoglu L, Tong H, Koutra D. Graph based anomaly detection and description: A survey[J]. Data Mining and Knowledge Discovery, 2015, 29(3): 626−688 doi: 10.1007/s10618-014-0365-y
[4] Xie Xiaojie, Liang Ying, Wang Zisen, et al. Heterogeneous network node classification method based on graph convolution[J]. Journal of Computer Research and Development, 2022, 59(7): 1470−1485 (in Chinese) doi: 10.7544/issn1000-1239.20210124
[5] Meng Xuying, Zhang Qijia, Zhang Hanwen, et al. Personalized privacy preserving link prediction in social networks[J]. Journal of Computer Research and Development, 2019, 56(6): 1244−1251 (in Chinese) doi: 10.7544/issn1000-1239.2019.20180306
[6] Xu Lixiang, Bai Lu, Xiao Jin, et al. Multiple graph kernel learning based on GMDH-type neural network[J]. Information Fusion, 2021, 66: 100−110 doi: 10.1016/j.inffus.2020.08.025
[7] Kipf T N, Welling M. Semi-supervised classification with graph convolutional networks[C/OL] //Proc of the 5th Int Conf on Learning Representations. 2017
[8] Veličković P, Cucurull G, Casanova A, et al. Graph attention networks[J]. arXiv preprint, arXiv:1710.10903, 2017
[9] Hamilton W, Ying Zhitao, Leskovec J. Inductive representation learning on large graphs[J/OL]. Advances in Neural Information Processing Systems, 2017[2022-11-25].https://proceedings.neurips.cc/paper/6703
[10] Gao Hongyang, Ji Shuiwang. Graph U-Nets[C] //Proc of the 36th Int Conf on Machine Learning. Cambridge, MA: MIT, 2019: 2083−2092
[11] Chen Dexiong, Jacob L, Mairal J. Convolutional kernel networks for graph-structured data[C] //Proc of the 37th Int Conf on Machine Learning. Cambridge, MA: MIT, 2020: 1576−1586
[12] Long Qingqing, Jin Yilun, Wu Yi, et al. Theoretically improving graph neural networks via anonymous walk graph kernels[C] //Proc of the 30th World Wide Web Conf. New York: ACM, 2021: 1204−1214
[13] Feng Aosong, You Chenyu, Wang Shiqiang, et al. KerGNNs: Interpretable graph neural networks with graph kernels[C] //Proc of the 36th AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2022: 6614−6622
[14] Shervashidze N, Schweitzer P, van Leeuwen E J, et al. Weisfeiler-Lehman graph kernels[J]. The Journal of Machine Learning Research, 2011, 12(3): 2539−2561
[15] Xu Keyulu, Hu Weihua, Leskovec J, et al. How powerful are graph neural networks?[C/OL] //Proc of the 7th Int Conf on Learning Representations. 2019[2022-11-25].https://openreview.net/forum?id=ryGs6iA5Km
[16] Sugiyama M, Borgwardt K. Halting in random walk kernels[J/OL]. Advances in Neural Information Processing Systems, 2015, 28[2022-11-25].https://openreview.net/pdf?id=HyZEFubd-H
[17] Borgwardt K M, Kriegel H P. Shortest-path kernels on graphs[C] //Proc of the 5th Int Conf on Data Mining. Piscataway, NJ: IEEE, 2005: 8
[18] Xu Lixiang, Bai Lu, Jiang Xiaoyi, et al. Deep Rényi entropy graph kernel[J]. Pattern Recognition, 2021, 111: 107668 doi: 10.1016/j.patcog.2020.107668
[19] Prenter P M. The numerical treatment of integral equations[J]. SIAM Review, 1981, 23(2): 266−267 doi: 10.1137/1023054
[20] Williams C, Seeger M. Using the Nyström method to speed up kernel machines[J/OL]. Advances in Neural Information Processing Systems, 2000[2022-11-25].https://proceedings.neurips.cc/paper/2000/file/19de10adbaa1b2ee13f77f679fa1483a-Paper.pdf
[21] Debnath A K, Lopez R L, Debnath G, et al. Structure-activity relationship of mutagenic aromatic and heteroaromatic nitro compounds. Correlation with molecular orbital energies and hydrophobicity[J]. Journal of Medicinal Chemistry, 1991, 34(2): 786−797 doi: 10.1021/jm00106a046
[22] Toivonen H, Srinivasan A, King R D, et al. Statistical evaluation of the predictive toxicology challenge 2000–2001[J]. Bioinformatics, 2003, 19(10): 1183−1193 doi: 10.1093/bioinformatics/btg130
[23] Borgwardt K M, Ong C S, Schönauer S, et al. Protein function prediction via graph kernels[J]. Bioinformatics, 2005, 21(1): 47−56
[24] Wale N, Watson I A, Karypis G. Comparison of descriptor spaces for chemical compound retrieval and classification[J]. Knowledge and Information Systems, 2008, 14(3): 347−375
[25] Yanardag P, Vishwanathan S V N. Deep graph kernels[C] //Proc of the 21st ACM SIGKDD Int Conf on Knowledge Discovery and Data Mining. New York: ACM, 2015: 1365−1374
[26] Atwood J, Towsley D. Diffusion-convolutional neural networks[J/OL]. Advances in Neural Information Processing Systems, 2016[2022-11-25].https://proceedings.neurips.cc/paper/2016/file/390e982518a50e280d8e2b535462ec1f-Paper.pdf
[27] Niepert M, Ahmed M, Kutzkov K. Learning convolutional neural networks for graphs[C] //Proc of the 33rd Int Conf on Machine Learning. Cambridge, MA: MIT, 2016: 2014−2023
[28] Sun Qingyun, Li Jianxin, Peng Hao, et al. SUGAR: Subgraph neural network with reinforcement pooling and self-supervised mutual information mechanism[C] //Proc of the 30th World Wide Web Conf. New York: ACM, 2021: 2081−2091
[29] Cui Lixin, Bai Lu, Bai Xiao, et al. Learning aligned vertex convolutional networks for graph classification[J/OL]. IEEE Transactions on Neural Networks and Learning Systems, 2021[2022-11-25].https://ieeexplore.ieee.org/abstract/document/9646437
[30] Zhu Yaokang, Zhang Kai, Wang Jun, et al. Structural landmarking and interaction modelling: A “SLIM” network for graph classification[C] //Proc of the 36th AAAI Conf on Artificial Intelligence. Palo Alto, CA: AAAI, 2022: 9251−9259
[31] Wijesinghe A, Wang Qing. A new perspective on "How graph neural networks go beyond Weisfeiler-Lehman?"[C/OL] //Proc of the 10th Int Conf on Learning Representations. 2022[2022-11-25].https://openreview.net/pdf?id=uxgg9o7bI_3