Expanding-and-Shrinking Binary Neural Networks

31 March 2025
Xulong Shi
Caiyi Sun
Zhi Qi
Liu Hao
Xiaodong Yang
Abstract

While binary neural networks (BNNs) offer significant benefits in terms of speed, memory, and energy, they suffer substantial accuracy degradation on challenging tasks compared to their real-valued counterparts. Because weights and activations are binarized, the possible values of each entry in the feature maps generated by BNNs are strongly constrained. To tackle this limitation, we propose the expanding-and-shrinking operation, which enhances binary feature maps with a negligible increase in computational complexity, thereby strengthening their representation capacity. Extensive experiments on multiple benchmarks show that our approach generalizes well across diverse applications, ranging from image classification and object detection to generative diffusion models, while also achieving remarkable improvements over various leading binarization algorithms built on different architectures, including both CNNs and Transformers.
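The abstract does not spell out the exact form of the expanding-and-shrinking operation, so the sketch below is only an illustration of the constraint it targets and of one plausible expand-then-shrink pattern: standard sign binarization with a straight-through estimator confines each feature-map entry to {-1, +1}, and a hypothetical `ExpandShrinkSketch` module (the class name, the `expand_factor` parameter, and the per-branch scales are all assumptions, not the authors' design) expands the binary map into a few scaled copies and averages them back, so each output entry can take more distinct values at almost no extra cost.

```python
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a clipped straight-through estimator (standard BNN practice)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients only where |x| <= 1 (clipped straight-through estimator).
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)


class ExpandShrinkSketch(nn.Module):
    """Hypothetical illustration, NOT the paper's operation: expand a binary
    feature map into `expand_factor` differently scaled copies, then shrink
    them back by averaging. The output entries are no longer restricted to
    {-1, +1}, at the cost of only a few per-channel scalars."""

    def __init__(self, channels: int, expand_factor: int = 4):
        super().__init__()
        # One learnable scale per expanded branch and channel (assumed parameterization).
        self.scales = nn.Parameter(torch.ones(expand_factor, channels, 1, 1))

    def forward(self, x_real):
        x_bin = BinarizeSTE.apply(x_real)            # entries constrained to {-1, +1}
        expanded = x_bin.unsqueeze(1) * self.scales  # expand: (N, E, C, H, W)
        return expanded.mean(dim=1)                  # shrink back to (N, C, H, W)


if __name__ == "__main__":
    x = torch.randn(2, 8, 4, 4, requires_grad=True)
    y = ExpandShrinkSketch(channels=8)(x)
    print(torch.unique(y).numel(), "distinct output values vs. 2 for a plain sign map")
    y.sum().backward()  # gradients flow through the straight-through estimator
```

The point of the sketch is only that a cheap expand-then-shrink step enriches the value set of a binarized feature map; the paper's actual operation, its placement in the network, and its interaction with the binarization scheme are described in the full text.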

@article{shi2025_2503.23709,
  title={Expanding-and-Shrinking Binary Neural Networks},
  author={Xulong Shi and Caiyi Sun and Zhi Qi and Liu Hao and Xiaodong Yang},
  journal={arXiv preprint arXiv:2503.23709},
  year={2025}
}