
Advancing Loss Functions in Recommender Systems: A Comparative Study with a Rényi Divergence-Based Solution

Shengjia Zhang
Jiawei Chen
Changdong Li
Sheng Zhou
Qihao Shi
Yan Feng
Chun Chen
Can Wang
Main: 7 pages · 5 figures · 9 tables · Bibliography: 3 pages · Appendix: 6 pages
Abstract

Loss functions play a pivotal role in optimizing recommendation models. Among them, Softmax Loss (SL) and Cosine Contrastive Loss (CCL) are particularly effective, yet their theoretical connections and differences warrant in-depth exploration. This work conducts a comprehensive analysis of these losses, yielding significant insights: 1) Common strengths -- both can be viewed as augmentations of traditional losses with Distributional Robust Optimization (DRO), enhancing robustness to distributional shifts; 2) Respective limitations -- because they employ different distribution distance metrics in the DRO objective, SL exhibits high sensitivity to false negative instances, whereas CCL suffers from low data utilization. To address these limitations, this work proposes a new loss function, DrRL, which generalizes SL and CCL by leveraging Rényi divergence in DRO optimization. DrRL incorporates the advantageous structures of both SL and CCL, and provably mitigates their respective limitations. Extensive experiments validate the superiority of DrRL in both recommendation accuracy and robustness.
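To make the two baseline losses concrete, below is a minimal sketch of SL and CCL as they commonly appear in the recommendation literature: SL is a temperature-scaled softmax cross-entropy over one positive and sampled negatives, and CCL pulls the positive cosine similarity toward 1 while penalizing only negatives above a margin. The hyperparameter values (`tau`, `margin`, `w`) are illustrative defaults, not the paper's settings, and DrRL itself is not reproduced here since its exact form is defined in the paper.

```python
import numpy as np

def softmax_loss(pos_sim, neg_sims, tau=0.1):
    """Softmax Loss (SL): negative log-softmax of the positive score
    against sampled negatives, with temperature tau."""
    logits = np.concatenate(([pos_sim], np.asarray(neg_sims, dtype=float))) / tau
    logits -= logits.max()  # subtract max for numerical stability
    return -np.log(np.exp(logits[0]) / np.exp(logits).sum())

def cosine_contrastive_loss(pos_sim, neg_sims, margin=0.4, w=1.0):
    """Cosine Contrastive Loss (CCL): push the positive similarity
    toward 1; only negatives whose similarity exceeds the margin
    contribute, which is the source of CCL's low data utilization."""
    neg_part = np.maximum(0.0, np.asarray(neg_sims, dtype=float) - margin).mean()
    return (1.0 - pos_sim) + w * neg_part
```

Note how a single hard (possibly false) negative dominates SL's softmax denominator at small `tau`, while CCL simply discards all negatives below the margin; these are the two failure modes the abstract attributes to each loss.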

@article{zhang2025_2506.15120,
  title={Advancing Loss Functions in Recommender Systems: A Comparative Study with a Rényi Divergence-Based Solution},
  author={Shengjia Zhang and Jiawei Chen and Changdong Li and Sheng Zhou and Qihao Shi and Yan Feng and Chun Chen and Can Wang},
  journal={arXiv preprint arXiv:2506.15120},
  year={2025}
}