Entropy Minimizing Matrix Factorization

24 March 2021
Mulin Chen
Xuelong Li
Abstract

Nonnegative Matrix Factorization (NMF) is a widely used data analysis technique that has yielded impressive results in many real-world tasks. Generally, existing NMF methods represent each sample with several centroids and find the optimal centroids by minimizing the sum of the approximation errors. However, outliers that deviate from the normal data distribution may have large residues and thus dominate the objective value. In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle this problem. Considering that outliers are usually far fewer than normal samples, a new entropy loss function is established for matrix factorization; it minimizes the entropy of the residue distribution while allowing a few samples to have large approximation errors. In this way, the outliers do not affect the approximation of the normal samples. Multiplicative updating rules for EMMF are also designed, and their convergence is proved both theoretically and experimentally. In addition, a graph-regularized version of EMMF (G-EMMF) is presented to deal with complex data structures. Clustering results on various synthetic and real-world datasets demonstrate the reasonableness of the proposed models, and their effectiveness is also verified through comparison with state-of-the-art methods.
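For reference, the minimal sketch below shows plain NMF with the classic multiplicative updates that minimize the squared Frobenius residue, i.e. the baseline objective that EMMF modifies. The abstract does not spell out the entropy loss or the EMMF-specific update rules, so they are not reproduced here; the function name `nmf_multiplicative` and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nmf_multiplicative(X, k, n_iter=200, eps=1e-10, seed=0):
    """Plain NMF via multiplicative updates minimizing ||X - WH||_F^2.

    X : (m, n) nonnegative data matrix (columns are samples)
    k : number of centroids / latent factors
    Returns nonnegative factors W (m, k) and H (k, n).
    Note: this is the standard baseline, not the entropy loss of EMMF.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Update H: H <- H * (W^T X) / (W^T W H)
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        # Update W: W <- W * (X H^T) / (W H H^T)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

if __name__ == "__main__":
    # Toy example: factorize a random nonnegative matrix
    X = np.abs(np.random.default_rng(1).normal(size=(50, 30)))
    W, H = nmf_multiplicative(X, k=5)
    print("reconstruction error:", np.linalg.norm(X - W @ H))
```

Because the squared-residue objective sums errors over all samples, a single outlier with a large residue can dominate it; EMMF replaces this sum with an entropy-based loss over the residue distribution so that a few large errors are tolerated without distorting the factorization of normal samples.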
