ResearchTrend.AI
Triply Laplacian Scale Mixture Modeling for Seismic Data Noise Suppression

21 February 2025
Sirui Pan
Zhiyuan Zha
Shigang Wang
Yue Li
Zipei Fan
Gang Yan
Binh T. Nguyen
Bihan Wen
Ce Zhu
arXiv · PDF · HTML
Abstract

Sparsity-based tensor recovery methods have shown great potential for suppressing noise in seismic data. These methods exploit tensor sparsity measures that capture the low-dimensional structures inherent in seismic data tensors, removing noise by imposing sparsity constraints through soft-thresholding or hard-thresholding operators. However, because real seismic data are non-stationary and corrupted by noise, the variances of the tensor coefficients are unknown and difficult to estimate accurately from the degraded data, which degrades noise suppression performance. In this paper, we propose a novel triply Laplacian scale mixture (TLSM) approach for seismic data noise suppression that significantly improves the estimation accuracy of both the sparse tensor coefficients and the hidden scalar parameters. To keep the optimization problem tractable, an alternating direction method of multipliers (ADMM) algorithm is employed to solve the proposed TLSM-based noise suppression problem. Extensive experiments on synthetic and field seismic data demonstrate that the proposed TLSM algorithm outperforms many state-of-the-art seismic noise suppression methods in both quantitative and qualitative evaluations while offering exceptional computational efficiency.
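The abstract couples two ideas: per-coefficient adaptive shrinkage driven by hidden scale parameters (the Laplacian scale mixture view), and ADMM variable splitting to make the resulting problem tractable. No code accompanies this listing, so the following is a minimal NumPy sketch of that combination for a generic 1-D sparse denoising problem. The names soft_threshold and lsm_admm_denoise are hypothetical, and the scale update theta = |x + u| + eps is a crude reweighting stand-in for the paper's derived estimator, which operates on tensor coefficients with three coupled scale variables.

import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of a (possibly weighted) L1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def lsm_admm_denoise(y, sigma=0.1, rho=1.0, n_iters=50, eps=1e-8):
    # Sketch: min_x 0.5*||x - y||^2 + sum_i w_i*|z_i|  s.t.  x = z,
    # where each weight w_i is re-estimated from a hidden
    # per-coefficient scale, so the soft-threshold adapts to the
    # data instead of being one global constant.
    x = y.copy()
    z = np.zeros_like(y)
    u = np.zeros_like(y)                  # scaled dual variable
    for _ in range(n_iters):
        # x-update: the quadratic data-fidelity term has a closed form.
        x = (y + rho * (z - u)) / (1.0 + rho)
        # Hidden-scale estimate (crude stand-in): large coefficients
        # get a small threshold, small noise-like ones a large one.
        theta = np.abs(x + u) + eps
        tau = (sigma ** 2) / (rho * theta)
        # z-update: proximal step with per-coefficient thresholds.
        z = soft_threshold(x + u, tau)
        # Dual ascent on the consensus constraint x = z.
        u = u + x - z
    return x

In the actual TLSM model the analogous updates act on sparse tensor coefficients with three coupled Laplacian scale variables, and the thresholds follow from the derived MAP estimator rather than this heuristic reweighting; the sketch only illustrates why jointly estimating scales and coefficients can outperform a fixed global threshold.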

View on arXiv
@article{pan2025_2502.14355,
  title={Triply Laplacian Scale Mixture Modeling for Seismic Data Noise Suppression},
  author={Sirui Pan and Zhiyuan Zha and Shigang Wang and Yue Li and Zipei Fan and Gang Yan and Binh T. Nguyen and Bihan Wen and Ce Zhu},
  journal={arXiv preprint arXiv:2502.14355},
  year={2025}
}