Learning Mixtures of Plackett-Luce Models from Structured Partial Orders

25 October 2019
Zhibing Zhao
Lirong Xia
arXiv:1910.11721
Abstract

Mixtures of ranking models have been widely used to model heterogeneous preferences. However, learning a mixture model is highly nontrivial, especially when the dataset consists of partial orders; in such cases, the parameters of the model may not even be identifiable. In this paper, we focus on three popular structures of partial orders: ranked top-$l_1$ orders, $l_2$-way orders, and choice data over a subset of alternatives. We prove that when the dataset consists of combinations of ranked top-$l_1$ and $l_2$-way orders (or choice data over up to $l_2$ alternatives), a mixture of $k$ Plackett-Luce models is not identifiable when $l_1 + l_2 \le 2k - 1$ ($l_2$ is set to $1$ when there are no $l_2$-way orders). We also prove that under some combinations, including ranked top-$3$ orders, ranked top-$2$ plus $2$-way orders, and choice data over up to $4$ alternatives, mixtures of two Plackett-Luce models are identifiable. Guided by our theoretical results, we propose efficient generalized method of moments (GMM) algorithms to learn mixtures of two Plackett-Luce models, which are proven consistent. Our experiments demonstrate the efficacy of our algorithms. Moreover, we show that when full rankings are available, learning from different marginal events (partial orders) provides a tradeoff between statistical efficiency and computational efficiency.
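To make the setup concrete, below is a minimal Python sketch (not the authors' implementation) of a two-component Plackett-Luce mixture. It samples full rankings and compares one empirical marginal event, the top-1 choice frequencies, against the mixture's closed-form top-1 probabilities; matching such marginals against their model-implied values is the flavor of moment matching behind a GMM-style estimator. The component utilities, mixing weights, and helper names are illustrative assumptions.

```python
import numpy as np

def sample_pl_ranking(theta, rng):
    """Sample one full ranking from a single Plackett-Luce model.

    At each stage an alternative is drawn with probability proportional
    to its utility among the alternatives still remaining.
    """
    remaining = list(range(len(theta)))
    ranking = []
    while remaining:
        w = np.array([theta[i] for i in remaining])
        pick = rng.choice(len(remaining), p=w / w.sum())
        ranking.append(remaining.pop(pick))
    return ranking

def sample_mixture(thetas, weights, n, rng):
    """Sample n full rankings from a mixture of Plackett-Luce models."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return [sample_pl_ranking(thetas[z], rng) for z in comps]

def top1_frequencies(rankings, m):
    """Empirical probability that each alternative is ranked first --
    one example of a marginal event a method-of-moments estimator
    could match against its model-implied counterpart."""
    counts = np.zeros(m)
    for r in rankings:
        counts[r[0]] += 1
    return counts / len(rankings)

rng = np.random.default_rng(0)
# Two hypothetical PL components over m = 4 alternatives, mixed 60/40.
thetas = [np.array([4.0, 2.0, 1.0, 0.5]), np.array([0.5, 1.0, 2.0, 4.0])]
weights = [0.6, 0.4]
data = sample_mixture(thetas, weights, n=20000, rng=rng)

empirical = top1_frequencies(data, m=4)
# Mixture top-1 probabilities: sum_r pi_r * theta_r / sum_j theta_{r,j}.
model = sum(w * t / t.sum() for w, t in zip(weights, thetas))
print("empirical top-1:", np.round(empirical, 3))
print("model top-1:   ", np.round(model, 3))
```

A GMM estimator would treat statistics like these (and richer marginals such as pairwise comparisons or top-$l_1$ events) as moment conditions and solve for the mixture parameters that match them; the identifiability results above determine which combinations of such partial-order events pin the parameters down uniquely.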
