Exploring $\ell_0$ Sparsification for Inference-free Sparse Retrievers

21 April 2025
Xinjie Shen
Zhichao Geng
Yang Yang
Abstract

With increasing demands for efficiency, information retrieval has developed a branch of sparse retrieval, advancing further toward inference-free retrieval, where documents are encoded at indexing time and no model inference is performed for queries. Existing sparse retrieval models rely on FLOPS regularization for sparsification, but this mechanism was originally designed for Siamese encoders and is considered suboptimal in the asymmetric inference-free setting. Previous attempts to adapt FLOPS to inference-free scenarios have been limited to rule-based methods, leaving the potential of sparsification approaches for inference-free retrieval models largely unexplored. In this paper, we explore an $\ell_0$-inspired sparsification approach for inference-free retrievers. Through comprehensive out-of-domain evaluation on the BEIR benchmark, our method achieves state-of-the-art performance among inference-free sparse retrieval models and is comparable to leading Siamese sparse retrieval models. Furthermore, we provide insights into the trade-off between retrieval effectiveness and computational efficiency, demonstrating practical value for real-world applications.
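To make the contrast in the abstract concrete, the following is a minimal illustrative sketch (not the paper's actual method) of the two sparsification penalties it discusses: the FLOPS regularizer, which penalizes the squared mean activation of each vocabulary dimension across a batch of document vectors, and a smoothed surrogate for the $\ell_0$ norm, which approximately counts nonzero activations. The function names, the toy vectors, and the particular smooth surrogate `x^2 / (x^2 + eps)` are all assumptions chosen for illustration.

```python
def flops_loss(batch):
    """FLOPS regularizer: for each vocabulary dimension, square the mean
    absolute activation over the batch, then sum over dimensions.
    Dimensions that fire for many documents are penalized most."""
    n = len(batch)
    dim = len(batch[0])
    loss = 0.0
    for j in range(dim):
        mean_act = sum(abs(vec[j]) for vec in batch) / n
        loss += mean_act ** 2
    return loss


def smoothed_l0_loss(batch, eps=1e-3):
    """A smooth l0 surrogate (one common choice, assumed here for
    illustration): x^2 / (x^2 + eps) approaches 1 for any nonzero x
    and 0 at x = 0, so the sum approximates the nonzero count."""
    loss = 0.0
    for vec in batch:
        for x in vec:
            loss += x * x / (x * x + eps)
    return loss / len(batch)


# Two toy 4-dimensional sparse document vectors.
docs = [
    [0.0, 2.0, 0.0, 0.5],
    [1.0, 0.0, 0.0, 0.5],
]
print(flops_loss(docs))       # grows when the same dimension fires across docs
print(smoothed_l0_loss(docs)) # roughly the average number of nonzeros per doc
```

Note the qualitative difference: FLOPS couples documents through the batch mean, whereas the $\ell_0$-style penalty scores each document's nonzero count independently, which is the property that makes it a natural fit for the asymmetric, document-side-only encoding described above.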

@article{shen2025_2504.14839,
  title={Exploring $\ell_0$ Sparsification for Inference-free Sparse Retrievers},
  author={Xinjie Shen and Zhichao Geng and Yang Yang},
  journal={arXiv preprint arXiv:2504.14839},
  year={2025}
}