We prove that every online learnable class of functions of finite Littlestone dimension d admits an online learning algorithm with finite information complexity. Towards this end, we use the notion of a globally stable algorithm. Generally, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in d. We also show there is room for improvement: for a canonical online learnable class, indicator functions of affine subspaces of dimension d, the information complexity can be upper bounded logarithmically in d.