Citation: Zhang Jinyu, Ma Chenxi, Li Chao, Zhao Zhongying. Towards Lightweight Cross-Domain Sequential Recommendation via Tri-Branches Graph External Attention Network[J]. Journal of Computer Research and Development. DOI: 10.7544/issn1000-1239.202440197

    Towards Lightweight Cross-Domain Sequential Recommendation via Tri-Branches Graph External Attention Network

Abstract: Cross-domain sequential recommendation (CSR) aims to capture users' behavioral preferences by modeling their historical interaction sequences across multiple domains, thereby providing personalized cross-domain recommendations. Recently, researchers have started integrating graph convolution networks (GCNs) into CSR to model the complicated associations among users and items. However, owing to their complex structure, most graph-based CSR methods incur high computational complexity or memory overhead, which makes them difficult to deploy on resource-constrained edge devices. Moreover, existing lightweight graph-based CSR methods tend to employ the single layer aggregating protocol (SLAP) to propagate embeddings on cross-domain sequential graphs (CSG). Such a strategy indeed helps GCNs avoid the cross-domain noise introduced by high-order neighborhood aggregation, but it also prevents them from mining high-order sequential relationships within individual domains. To this end, we introduce a lightweight tri-branches graph external attention network (TEA-Net). Specifically, we separate the original CSG into three parts, including two inner-domain sequential graphs and an inter-domain sequential graph, and devise a parallel tri-branches graph convolution network to learn the node representations. This structure simultaneously considers the first-order inter-domain correlations and the high-order inner-domain connectivity without introducing additional cross-domain noise. Besides, we propose an improved external attention (EA) component without the nonlinear channel, which captures the sequential dependencies among items at a lower cost and shares attention weights across multiple branches. We conduct extensive experiments on two large-scale real-world datasets to verify the performance of TEA-Net. The experimental results demonstrate the superiority of TEA-Net over several state-of-the-art methods in terms of both lightweight performance and prediction accuracy.
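To make the shared external-attention idea concrete, below is a minimal sketch (not the authors' implementation) of an external-attention layer whose two learnable memory units are reused by all three graph branches, so the attention weights are shared and no extra nonlinear channel is applied. It assumes PyTorch; the class name SharedExternalAttention, the memory size mem_size, and the double-normalization details are illustrative assumptions rather than specifics from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedExternalAttention(nn.Module):
    """External attention with two learnable memory units M_k and M_v.

    Omitting any extra nonlinear transformation keeps the cost linear in the
    number of nodes, and a single instance can be reused by several branches
    so that the attention weights are shared across them.
    """

    def __init__(self, dim: int, mem_size: int = 64):
        super().__init__()
        self.mem_k = nn.Linear(dim, mem_size, bias=False)  # M_k: d -> S
        self.mem_v = nn.Linear(mem_size, dim, bias=False)  # M_v: S -> d

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim) node embeddings produced by one graph branch
        attn = self.mem_k(x)                                    # (num_nodes, S)
        attn = F.softmax(attn, dim=0)                           # normalize over nodes
        attn = attn / (attn.sum(dim=1, keepdim=True) + 1e-9)    # l1-normalize over memory slots
        return self.mem_v(attn)                                 # (num_nodes, dim)


if __name__ == "__main__":
    ea = SharedExternalAttention(dim=32)
    # The same module (hence the same attention weights) serves all three branches:
    # two inner-domain sequential graphs and one inter-domain sequential graph.
    branch_a = torch.randn(100, 32)   # inner-domain graph A embeddings
    branch_b = torch.randn(120, 32)   # inner-domain graph B embeddings
    branch_x = torch.randn(220, 32)   # inter-domain graph embeddings
    outputs = [ea(h) for h in (branch_a, branch_b, branch_x)]
    print([o.shape for o in outputs])

Sharing one module across the branches is one plausible way to realize the paper's weight sharing; the actual TEA-Net component may differ in normalization and in how branch outputs are fused.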
