We introduce the problem of simultaneously learning all powers of a Poisson Binomial Distribution (PBD). A PBD of order $n$ is the distribution of a sum $X_1 + \dots + X_n$ of $n$ mutually independent Bernoulli random variables, where $\mathbb{E}[X_i] = p_i$. The $k$-th power of this distribution, for $k$ in a given range, is the distribution of $X_1^{(k)} + \dots + X_n^{(k)}$, where each Bernoulli random variable $X_i^{(k)}$ has $\mathbb{E}[X_i^{(k)}] = p_i^k$. The learning algorithm can query any power several times and succeeds in learning all powers in the range if, with probability at least $2/3$: given any power $k$ in the range, it returns a probability distribution whose total variation distance from that power is at most $\varepsilon$. We provide almost matching lower and upper bounds on the query complexity of this problem. We first show a lower bound on the query complexity for PBD-powers instances with many distinct, well-separated parameters, and we nearly match this lower bound by analyzing the query complexity of simultaneously learning all powers of a special class of PBDs resembling those of our lower-bound construction. We also study the fundamental setting of a Binomial distribution and provide a sample-optimal algorithm for it. Diakonikolas, Kane and Stewart [COLT'16] showed that exponentially many samples are needed to learn the $p_i$'s within error $\varepsilon$. The question of whether sampling from powers of PBDs can reduce this sample complexity has a negative answer: we show that an exponential number of samples remains unavoidable. Given sampling access to the powers of a PBD, we then give a nearly optimal algorithm that learns its $p_i$'s. To prove our last two lower bounds, we extend the classical minimax risk definition from statistics to estimating functions of sequences of distributions.
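To make the query model concrete, here is a minimal sketch of a sampling oracle for the $k$-th power of a PBD. The function name, the explicit parameter list `p`, and the seeded generator are illustrative assumptions, not part of the paper; the only fact taken from the abstract is that the $k$-th power replaces each Bernoulli mean $p_i$ with $p_i^k$.

```python
import random


def sample_pbd_power(p, k, rng=None):
    """Draw one sample from the k-th power of a PBD with parameters p.

    The k-th power is the law of X_1 + ... + X_n, where X_i is an
    independent Bernoulli with mean p_i ** k (as defined in the abstract).
    The helper's interface is hypothetical; only the p_i ** k rule is
    taken from the text.
    """
    rng = rng or random.Random(0)
    # Each indicator rng.random() < p_i ** k is Bernoulli(p_i ** k).
    return sum(rng.random() < pi ** k for pi in p)
```

A learner in this model would call such an oracle repeatedly, choosing which power $k$ to query at each step; raising $k$ drives every $p_i < 1$ toward $0$, which is what makes large powers informative about the separation between parameters.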