Non-negative Tensor Mixture Learning for Discrete Density Estimation

28 May 2024
Kazu Ghalamkari
Jesper L. Hinrich
Morten Mørup
Abstract

We present an expectation-maximization (EM) based unified framework for non-negative tensor decomposition that optimizes the Kullback-Leibler divergence. To avoid inner iterations in each M-step and learning-rate tuning, we establish a general relationship between low-rank decompositions and many-body approximations. Using this connection, we exploit the fact that the closed-form solution of the many-body approximation updates all parameters simultaneously in the M-step. Our framework offers not only a unified methodology for a variety of low-rank structures, including CP, Tucker, and Tensor Train decompositions, but also their mixtures. Notably, the weight of each low-rank tensor in the mixture can be learned from the data, which enables us to leverage the advantages of different low-rank structures without carefully selecting the structure in advance. We empirically demonstrate that our framework overall provides superior generalization in terms of discrete density estimation and classification when compared to conventional tensor-based approaches.
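To make the abstract's core idea concrete, here is a minimal sketch of EM for a rank-R non-negative CP model of a discrete density under the KL divergence. This is an illustrative, well-known special case (essentially a latent-class / PLSA-style mixture), not the paper's general framework: the function name `em_cp_kl` and all implementation details are this sketch's own, and the paper's contribution extends such closed-form M-steps to Tucker, Tensor Train, and mixtures of structures via many-body approximations.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_cp_kl(X, rank, n_iter=200):
    """EM for a rank-R non-negative CP model of a 3-way discrete density.

    Model: p(i,j,k) = sum_r w[r] * A[i,r] * B[j,r] * C[k,r],
    where w is a distribution over components and each factor column
    sums to one. Illustrative sketch only (hypothetical helper).
    """
    X = X / X.sum()                        # normalize counts to a density
    I, J, K = X.shape
    A = rng.dirichlet(np.ones(I), rank).T  # (I, R), columns sum to 1
    B = rng.dirichlet(np.ones(J), rank).T
    C = rng.dirichlet(np.ones(K), rank).T
    w = np.full(rank, 1.0 / rank)
    for _ in range(n_iter):
        # E-step: X-weighted posterior responsibility of each component r
        P = np.einsum('r,ir,jr,kr->ijkr', w, A, B, C)  # joint with r
        p = P.sum(axis=-1) + 1e-12                     # model density
        Q = X[..., None] * P / p[..., None]
        # M-step: closed-form updates -- no inner iterations, no step size
        w = Q.sum(axis=(0, 1, 2))
        A = Q.sum(axis=(1, 2)) / (w + 1e-12)
        B = Q.sum(axis=(0, 2)) / (w + 1e-12)
        C = Q.sum(axis=(0, 1)) / (w + 1e-12)
        w = w / w.sum()
    return w, A, B, C
```

Each M-step here is exact and simultaneous for all parameters, which is the property the paper generalizes: the many-body approximation admits a closed-form solution, so no learning rate or inner optimization loop is needed.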

@article{ghalamkari2025_2405.18220,
  title={Non-negative Tensor Mixture Learning for Discrete Density Estimation},
  author={Kazu Ghalamkari and Jesper Løve Hinrich and Morten Mørup},
  journal={arXiv preprint arXiv:2405.18220},
  year={2025}
}