PC-SRGAN: Physically Consistent Super-Resolution Generative Adversarial Network for General Transient Simulations

10 May 2025
Md Rakibul Hasan
Pouria Behnoudfar
Dan MacKinlay
Thomas Poulet
Tags: GAN
Abstract

Machine Learning, particularly Generative Adversarial Networks (GANs), has revolutionised Super Resolution (SR). However, generated images often lack physical meaningfulness, which is essential for scientific applications. Our approach, PC-SRGAN, enhances image resolution while ensuring physical consistency for interpretable simulations. PC-SRGAN significantly improves both the Peak Signal-to-Noise Ratio and the Structural Similarity Index Measure compared to conventional methods, even with limited training data (e.g., only 13% of the training data required for SRGAN). Beyond SR, PC-SRGAN augments physically meaningful machine learning by incorporating numerically justified time integrators and advanced quality metrics. These advancements promise reliable and causal machine-learning models in scientific domains. A significant advantage of PC-SRGAN over conventional SR techniques is its physical consistency, which makes it a viable surrogate model for time-dependent problems. PC-SRGAN advances scientific machine learning, offering improved accuracy and efficiency for image processing, enhanced process understanding, and broader applications to scientific research. The source codes and data will be made publicly available at this https URL upon acceptance of this paper.
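
The abstract credits PC-SRGAN's physical consistency to numerically justified time integrators but does not spell out the formulation. As a purely conceptual sketch (not the authors' implementation), the PyTorch snippet below shows one way such a term could enter a generator loss: assume the simulated field obeys a heat-equation-like PDE u_t = nu * Laplacian(u), and penalise the backward-Euler residual between consecutive super-resolved frames. The choice of PDE and integrator, and all names (physics_residual, generator_loss, lam), are hypothetical.

import torch
import torch.nn.functional as F

# 5-point finite-difference Laplacian as a fixed convolution kernel.
LAPLACIAN = torch.tensor([[0., 1., 0.],
                          [1., -4., 1.],
                          [0., 1., 0.]]).view(1, 1, 3, 3)

def physics_residual(u_prev, u_next, dt=0.01, nu=0.1, h=1.0):
    # Backward-Euler residual of u_t = nu * Lap(u) between two
    # consecutive super-resolved frames of shape [B, 1, H, W].
    lap = F.conv2d(u_next, LAPLACIAN, padding=1) / h**2
    return u_next - u_prev - dt * nu * lap

def generator_loss(sr_prev, sr_next, adv_loss, pixel_loss, lam=1e-2):
    # Standard SRGAN generator terms plus a squared-residual penalty;
    # the penalty is what would enforce physical consistency here.
    pc = physics_residual(sr_prev, sr_next).pow(2).mean()
    return adv_loss + pixel_loss + lam * pc

# Hypothetical usage with random frames and dummy SRGAN loss values:
frames_t0 = torch.rand(4, 1, 64, 64)
frames_t1 = torch.rand(4, 1, 64, 64)
loss = generator_loss(frames_t0, frames_t1,
                      adv_loss=torch.tensor(0.5),
                      pixel_loss=torch.tensor(0.1))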

@article{hasan2025_2505.06502,
  title={PC-SRGAN: Physically Consistent Super-Resolution Generative Adversarial Network for General Transient Simulations},
  author={Md Rakibul Hasan and Pouria Behnoudfar and Dan MacKinlay and Thomas Poulet},
  journal={arXiv preprint arXiv:2505.06502},
  year={2025}
}