ResearchTrend.AI

arXiv:1902.08380

Unique Sharp Local Minimum in $\ell_1$-Minimization Complete Dictionary Learning

22 February 2019
Yu Wang
Siqi Wu
Bin Yu
Abstract

We study the problem of globally recovering a dictionary from a set of signals via $\ell_1$-minimization. We assume that the signals are generated as i.i.d. random linear combinations of the $K$ atoms of a complete reference dictionary $D^* \in \mathbb{R}^{K \times K}$, where the linear combination coefficients are drawn from either a Bernoulli-type model or an exact sparse model. First, we obtain a necessary and sufficient norm condition for the reference dictionary $D^*$ to be a sharp local minimum of the expected $\ell_1$ objective function. Our result substantially extends that of Wu and Yu (2015) and allows the combination coefficients to be non-negative. Second, we obtain an explicit bound on the region within which the objective value of the reference dictionary is minimal. Third, we show that the reference dictionary is the unique sharp local minimum, thus establishing the first known global property of $\ell_1$-minimization dictionary learning. Motivated by these theoretical results, we introduce a perturbation-based test to determine whether a dictionary is a sharp local minimum of the objective function. We also propose a new dictionary learning algorithm based on block coordinate descent, called DL-BCD, which is guaranteed to converge monotonically. Simulation studies show that DL-BCD achieves competitive recovery rates compared with many state-of-the-art dictionary learning algorithms.
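The expected-$\ell_1$ objective and the perturbation-based sharpness test can be sketched numerically. The sketch below assumes the standard complete-dictionary formulation, $f(D) = \mathbb{E}\,\|D^{-1} Y\|_1$ approximated by a sample average, with the dictionary constrained to unit-norm columns (so that trivial rescaling cannot shrink the objective); the dimensions, sparsity level, perturbation size, and Bernoulli-Gaussian coefficient model are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and sample size (not from the paper).
K, n = 8, 2000

# Reference complete dictionary D* in R^{KxK} with unit-norm columns.
D_star = rng.standard_normal((K, K))
D_star /= np.linalg.norm(D_star, axis=0)

# Bernoulli-Gaussian ("Bernoulli-type") sparse coefficients A; signals X = D* A.
support = rng.random((K, n)) < 0.3
A = support * rng.standard_normal((K, n))
X = D_star @ A

def l1_objective(D, X):
    """Sample average of ||D^{-1} x_i||_1: for a complete (square,
    invertible) dictionary, the coefficients of x_i under D are D^{-1} x_i."""
    coeffs = np.linalg.solve(D, X)
    return np.abs(coeffs).sum(axis=0).mean()

f_star = l1_objective(D_star, X)

# Perturbation-based sharpness check: at a sharp local minimum on the
# unit-column-norm constraint set, small perturbations of D* (followed by
# column renormalization) should increase the objective.
eps, trials = 1e-2, 50
increases = 0
for _ in range(trials):
    Delta = rng.standard_normal((K, K))
    Delta /= np.linalg.norm(Delta)
    D_pert = D_star + eps * Delta
    D_pert /= np.linalg.norm(D_pert, axis=0)  # stay on the constraint set
    if l1_objective(D_pert, X) > f_star:
        increases += 1

print(f"f(D*) = {f_star:.4f}")
print(f"{increases}/{trials} random perturbations increased the objective")
```

Because $D^{*-1} X = A$ exactly, `f_star` equals the average $\ell_1$ norm of the coefficient columns, which gives a quick sanity check on the objective; the fraction of objective-increasing perturbations is a finite-sample proxy for sharpness, not a certificate.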
