EDCFlow: Exploring Temporally Dense Difference Maps for Event-based Optical Flow Estimation

Main: 8 pages
8 figures
Bibliography: 2 pages
12 tables
Appendix: 4 pages
Abstract

Recent learning-based methods for event-based optical flow estimation rely on cost volumes for pixel matching, but they suffer from redundant computation and scale poorly to higher resolutions during flow refinement. In this work, we exploit the complementarity between temporally dense feature differences of adjacent event frames and the cost volume, and present a lightweight event-based optical flow network (EDCFlow) that achieves high-quality flow estimation at higher resolution. Specifically, an attention-based multi-scale temporal feature difference layer captures diverse motion patterns at high resolution in a computationally efficient manner. High-resolution difference motion features and low-resolution correlation motion features are then adaptively fused to enhance motion representation and model generalization. Notably, EDCFlow can serve as a plug-and-play refinement module for RAFT-like event-based methods to enhance flow details. Extensive experiments demonstrate that EDCFlow achieves better performance and superior generalization at lower complexity than existing methods.
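
As a rough illustration of the temporal feature-difference idea in the abstract, the following PyTorch sketch takes differences of adjacent event-frame features and fuses them with a lightweight attention over the temporal axis. The module name, tensor shapes, and attention design are our own assumptions for exposition, not the paper's actual implementation.

import torch
import torch.nn as nn

class TemporalDiffLayer(nn.Module):
    # Hypothetical sketch: weight differences of adjacent event-frame
    # features by a lightweight temporal attention (assumed design).
    def __init__(self, channels: int):
        super().__init__()
        # one attention score per difference map from channel statistics
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # (B*T', C, 1, 1)
            nn.Conv2d(channels, 1, kernel_size=1), # (B*T', 1, 1, 1)
        )
        self.proj = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, T, C, H, W) features of T consecutive event frames
        diffs = feats[:, 1:] - feats[:, :-1]       # (B, T-1, C, H, W)
        b, t, c, h, w = diffs.shape
        flat = diffs.reshape(b * t, c, h, w)
        scores = self.attn(flat).reshape(b, t, 1, 1, 1)
        weights = torch.softmax(scores, dim=1)     # attention over T-1 diffs
        fused = (weights * diffs).sum(dim=1)       # (B, C, H, W)
        return self.proj(fused)

# usage: produce a high-resolution difference motion feature, which the
# paper's adaptive fusion would then combine with low-resolution
# correlation (cost-volume) features
layer = TemporalDiffLayer(channels=64)
event_feats = torch.randn(2, 5, 64, 128, 128)      # B=2, T=5 event frames
motion_feat = layer(event_feats)                   # (2, 64, 128, 128)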

@article{liu2025_2506.03512,
  title={EDCFlow: Exploring Temporally Dense Difference Maps for Event-based Optical Flow Estimation},
  author={Daikun Liu and Lei Cheng and Teng Wang and Changyin Sun},
  journal={arXiv preprint arXiv:2506.03512},
  year={2025}
}