
Remote Sensing Image Classification with Decoupled Knowledge Distillation

Main: 8 pages, 2 figures, 2 tables
Abstract

To address the challenge posed by the large number of parameters in existing remote sensing image classification models, which hinders deployment on resource-constrained devices, this paper proposes a lightweight classification method based on knowledge distillation. Specifically, G-GhostNet is adopted as the backbone network, exploiting feature reuse to reduce redundant parameters and significantly improve inference efficiency. In addition, a decoupled knowledge distillation strategy is employed, which separates the distillation signals for target and non-target classes to effectively improve classification accuracy. Experimental results on the RSOD and AID datasets demonstrate that, compared with the high-parameter VGG-16 model, the proposed method achieves nearly equivalent Top-1 accuracy while reducing the parameter count by a factor of 6.24. The approach strikes a favorable balance between model size and classification performance, offering an efficient solution for deployment on resource-limited devices.
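The decoupling referred to in the abstract splits the distillation objective into a target-class term and a non-target-class term, each with its own weight. The PyTorch sketch below illustrates one common way to write such a loss; the function name dkd_loss and the values of alpha, beta, and the temperature T are illustrative assumptions, not the paper's exact formulation or hyperparameters.

```python
import torch
import torch.nn.functional as F


def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    """Decoupled KD sketch: target-class term (TCKD) + non-target-class term (NCKD).

    alpha, beta and T are assumed, illustrative values.
    """
    num_classes = student_logits.size(1)
    gt_mask = F.one_hot(target, num_classes).float()   # (B, C), 1 at the ground-truth class
    other_mask = 1.0 - gt_mask

    # Softened class probabilities for student and teacher.
    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # TCKD: binary distribution [p(target), p(all non-target)].
    p_s_bin = torch.stack([(p_s * gt_mask).sum(1), (p_s * other_mask).sum(1)], dim=1)
    p_t_bin = torch.stack([(p_t * gt_mask).sum(1), (p_t * other_mask).sum(1)], dim=1)
    tckd = F.kl_div(torch.log(p_s_bin + 1e-8), p_t_bin, reduction="batchmean")

    # NCKD: distribution over the non-target classes only.
    # Suppress the target logit so the softmax is effectively over the remaining classes.
    logit_s_nt = student_logits / T - 1000.0 * gt_mask
    logit_t_nt = teacher_logits / T - 1000.0 * gt_mask
    nckd = F.kl_div(F.log_softmax(logit_s_nt, dim=1),
                    F.softmax(logit_t_nt, dim=1),
                    reduction="batchmean")

    # Weighting the two terms separately is what makes the distillation "decoupled".
    return (alpha * tckd + beta * nckd) * (T ** 2)
```

In a training loop, this loss would typically be added to the student's standard cross-entropy loss, with the teacher's logits computed under torch.no_grad().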

@article{he2025_2505.19111,
  title={Remote Sensing Image Classification with Decoupled Knowledge Distillation},
  author={Yaping He and Jianfeng Cai and Qicong Hu and Peiqing Wang},
  journal={arXiv preprint arXiv:2505.19111},
  year={2025}
}