
Learning Mixtures of Plackett-Luce Models

Abstract

In this paper we address the identifiability and efficient learning problems of finite mixtures of Plackett-Luce models for rank data. We prove that for any $k \geq 2$, the mixture of $k$ Plackett-Luce models for no more than $2k-1$ alternatives is non-identifiable, and that this bound is tight for $k=2$. For generic identifiability, we prove that the mixture of $k$ Plackett-Luce models over $m$ alternatives is generically identifiable if $k \leq \lfloor\frac{m-2}{2}\rfloor!$. We also propose an efficient generalized method of moments (GMM) algorithm to learn the mixture of two Plackett-Luce models and show that the algorithm is consistent. Our experiments show that our GMM algorithm is significantly faster than the EMM algorithm by Gormley and Murphy (2008), while achieving competitive statistical efficiency.
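For readers unfamiliar with the model: a Plackett-Luce model assigns each alternative a positive weight and generates a ranking by repeatedly selecting the next item with probability proportional to its weight among the remaining items; a mixture averages the resulting ranking probabilities over components. A minimal sketch of these likelihoods (the function names and structure below are illustrative, not taken from the paper):

```python
def pl_ranking_prob(weights, ranking):
    """Probability of a full ranking under a single Plackett-Luce model.

    weights[i] is the positive utility of alternative i; `ranking` lists
    alternative indices from most preferred to least preferred.
    """
    prob = 1.0
    remaining = sum(weights[i] for i in ranking)
    for i in ranking:
        # Pick alternative i with probability proportional to its weight
        # among the alternatives not yet ranked.
        prob *= weights[i] / remaining
        remaining -= weights[i]
    return prob

def mixture_ranking_prob(components, ranking):
    """Probability under a finite mixture of Plackett-Luce models.

    `components` is a list of (mixing_weight, pl_weights) pairs whose
    mixing weights sum to 1.
    """
    return sum(alpha * pl_ranking_prob(w, ranking) for alpha, w in components)
```

For example, with uniform weights every ranking of three alternatives has probability 1/6, and the probabilities over all permutations sum to 1 for any weights.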
