Quantum-assisted learning of hardware-embedded probabilistic graphical models

Mainstream machine learning techniques such as deep learning and probabilistic programming rely heavily on sampling from generally intractable probability distributions. There is increasing interest in the potential advantages of using quantum computing technologies as sampling engines to speed up these tasks or to make them more effective. However, some pressing challenges in state-of-the-art quantum annealers have to be overcome before we can assess their actual performance. The sparse connectivity, resulting from the local interaction between quantum bits in physical hardware implementations, is considered the most severe limitation on constructing powerful generative unsupervised machine learning models. Here we use embedding techniques to add redundancy to datasets, allowing us to increase the modeling capacity of quantum annealers. We illustrate our findings by training hardware-embedded graphical models on a binarized dataset of handwritten digits and two synthetic datasets, in experiments with up to 940 quantum bits. Our model can be trained in quantum hardware without full knowledge of the effective parameters specifying the corresponding quantum Gibbs-like distribution; therefore, this approach avoids the need to infer the effective temperature at each iteration, speeding up learning, and it mitigates the effect of noise in the control parameters, making it robust to deviations from the reference Gibbs distribution. Our approach demonstrates the feasibility of using quantum annealers for implementing generative models and provides a suitable framework for benchmarking these quantum technologies on machine-learning-related tasks.
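The temperature-free training idea can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' implementation: a classical Gibbs sampler stands in for samples drawn from the annealer, the model is a fully visible Boltzmann machine rather than a hardware-embedded one, and the function names and hyperparameters (`gibbs_sample`, `train`, learning rate, sample counts) are assumptions. The point it demonstrates is that the gradient update depends only on sample averages, so any unknown effective inverse temperature of the sampler is absorbed into the learned couplings and never needs to be estimated explicitly.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sample(J, h, n_samples=500, n_sweeps=50):
    """Classical Gibbs sampler over +/-1 spins, standing in for annealer samples."""
    n = len(h)
    s = rng.choice([-1.0, 1.0], size=(n_samples, n))
    for _ in range(n_sweeps):
        for i in range(n):
            field = s @ J[i] + h[i]  # local effective field on spin i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
            s[:, i] = np.where(rng.random(n_samples) < p_up, 1.0, -1.0)
    return s

def train(data, steps=100, lr=0.05):
    """Maximum-likelihood updates using only empirical vs. sampled statistics."""
    n = data.shape[1]
    J = np.zeros((n, n))
    h = np.zeros(n)
    data_corr = data.T @ data / len(data)       # clamped (data) correlations
    for _ in range(steps):
        s = gibbs_sample(J, h)
        model_corr = s.T @ s / len(s)           # unclamped (model) correlations
        grad_J = data_corr - model_corr
        np.fill_diagonal(grad_J, 0.0)           # no self-couplings
        J += lr * grad_J
        h += lr * (data.mean(axis=0) - s.mean(axis=0))
        J = (J + J.T) / 2.0                     # keep couplings symmetric
    return J, h
```

If the sampler actually operated at some unknown inverse temperature, the learned `J` and `h` would simply converge to rescaled values, which is the robustness property the abstract highlights.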