
DistrEE: Distributed Early Exit of Deep Neural Network Inference on Edge Devices

Abstract

Distributed DNN inference is becoming increasingly important as demand for intelligent services at the network edge grows. By leveraging distributed computing, edge devices can perform complicated, resource-hungry inference tasks that were previously possible only on powerful servers, enabling new applications in areas such as autonomous vehicles, industrial automation, and smart homes. However, achieving accurate and efficient distributed edge inference is challenging because device resources fluctuate and input data vary in processing difficulty. In this work, we propose DistrEE, a distributed DNN inference framework that can exit model inference early to meet specific quality-of-service requirements. The framework first integrates model early exit with distributed inference for multi-node collaborative inference scenarios. It further designs an early exit policy that controls when model inference terminates. Extensive simulation results demonstrate that DistrEE realizes efficient collaborative inference, achieving an effective trade-off between inference latency and accuracy.
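To make the early-exit idea concrete, here is a minimal sketch of a common confidence-threshold exit policy over a sequence of model stages. The stage functions, exit heads, and threshold value below are hypothetical toy stand-ins for illustration only; the paper's actual partitioning and exit policy for DistrEE may differ.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(logits - logits.max())
    return e / e.sum()

def early_exit_inference(stages, exit_heads, x, threshold=0.95):
    """Run model stages in order. After each stage, a lightweight exit
    head produces class logits; if the top softmax probability reaches
    the threshold, inference terminates early at that stage.
    Returns (predicted class, exit stage index, confidence)."""
    for i, (stage, head) in enumerate(zip(stages, exit_heads)):
        x = stage(x)
        probs = softmax(head(x))
        conf = float(probs.max())
        if conf >= threshold:
            return int(probs.argmax()), i, conf  # early exit here
    # No intermediate head was confident enough: use the final head.
    return int(probs.argmax()), len(stages) - 1, conf

# Toy example: three stages; the second exit head is already confident,
# so inference stops at stage index 1 instead of running all stages.
stages = [lambda x: x + 1.0, lambda x: x + 1.0, lambda x: x + 1.0]
heads = [
    lambda x: np.array([0.1, 0.0]),   # low-confidence logits
    lambda x: np.array([5.0, 0.0]),   # high-confidence logits
    lambda x: np.array([10.0, 0.0]),
]
pred, exit_stage, conf = early_exit_inference(stages, heads, np.zeros(2))
print(pred, exit_stage)
```

In a distributed setting, each stage would live on a different edge device, so an early exit also saves the communication cost of forwarding activations to the remaining devices.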

@article{peng2025_2502.15735,
  title={DistrEE: Distributed Early Exit of Deep Neural Network Inference on Edge Devices},
  author={Xian Peng and Xin Wu and Lianming Xu and Li Wang and Aiguo Fei},
  journal={arXiv preprint arXiv:2502.15735},
  year={2025}
}