Revisiting Self-attention for Cross-domain Sequential Recommendation

27 May 2025
Clark Mingxuan Ju
Leonardo Neves
Bhuvesh Kumar
Liam Collins
Tong Zhao
Yuwei Qiu
Qing Dou
Sohail Nizam
Sen Yang
Neil Shah
Main: 8 pages · Bibliography: 3 pages · Appendix: 1 page · 7 figures · 7 tables
Abstract

Sequential recommendation is a popular paradigm in modern recommender systems. In particular, one challenging problem in this space is cross-domain sequential recommendation (CDSR), which aims to predict future behaviors given user interactions across multiple domains. Existing CDSR frameworks are mostly built on the self-attention transformer and seek to improve performance by explicitly injecting additional domain-specific components (e.g., domain-aware module blocks). While these additional components help, we argue they overlook the core self-attention module already present in the transformer, which is a naturally powerful tool for learning correlations among behaviors. In this work, we aim to improve CDSR performance for simple models from the novel perspective of enhancing self-attention. Specifically, we introduce a Pareto-optimal self-attention and formulate cross-domain learning as a multi-objective problem, where we optimize the recommendation task while dynamically minimizing cross-domain attention scores. Our approach automates knowledge transfer in CDSR (dubbed AutoCDSR): it not only mitigates negative transfer but also encourages complementary knowledge exchange among auxiliary domains. Building on this idea, we further introduce AutoCDSR+, a more performant variant with slight additional cost. Our proposal is easy to implement and works as a plug-and-play module that can be incorporated into existing transformer-based recommenders. Beyond its flexibility, it is practical to deploy because it adds little computational overhead and requires no heavy hyper-parameter tuning. AutoCDSR on average improves Recall@10 for SASRec and Bert4Rec by 9.8% and 16.0%, and NDCG@10 by 12.0% and 16.7%, respectively. Code is available at this https URL.
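
The abstract frames cross-domain learning as two objectives: the usual recommendation loss plus a term measuring how much attention mass flows across domain boundaries, which AutoCDSR drives down dynamically. Below is a minimal illustrative sketch of that second term, assuming a PyTorch-style single-head attention and a per-item domain_ids tensor; the names and the fixed weight are assumptions for illustration, not the authors' code.

# Hypothetical sketch (not the authors' released code): measuring the
# cross-domain attention mass inside a standard self-attention block.
# `domain_ids` and `lambda_xd` are illustrative names, not from the paper.
import torch

def attention_with_cross_domain_score(q, k, v, domain_ids):
    """q, k, v: (batch, seq_len, dim); domain_ids: (batch, seq_len) integer tensor."""
    dim = q.size(-1)
    scores = q @ k.transpose(-2, -1) / dim ** 0.5  # (B, L, L)
    attn = torch.softmax(scores, dim=-1)

    # 1 where query and key items come from different domains, 0 otherwise.
    cross = (domain_ids.unsqueeze(2) != domain_ids.unsqueeze(1)).float()

    # Auxiliary objective: average attention mass spent across domains.
    cross_domain_score = (attn * cross).sum(dim=-1).mean()

    return attn @ v, cross_domain_score

# Usage sketch: total_loss = rec_loss + lambda_xd * cross_domain_score,
# where AutoCDSR would replace the fixed lambda_xd with a dynamic,
# Pareto-optimal weighting of the two objectives.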

@article{ju2025_2505.21811,
  title={Revisiting Self-attention for Cross-domain Sequential Recommendation},
  author={Clark Mingxuan Ju and Leonardo Neves and Bhuvesh Kumar and Liam Collins and Tong Zhao and Yuwei Qiu and Qing Dou and Sohail Nizam and Sen Yang and Neil Shah},
  journal={arXiv preprint arXiv:2505.21811},
  year={2025}
}