DR-TANet: Dynamic Receptive Temporal Attention Network for Street Scene Change Detection

1 March 2021
Shuo Chen
Kailun Yang
Rainer Stiefelhagen
Abstract

Street scene change detection continues to attract interest in the computer vision community. It aims to identify the changed regions between a pair of street-view images captured at different times. State-of-the-art networks based on the encoder-decoder architecture leverage the feature maps at corresponding levels of the two input branches to gather sufficient information about the changes. Still, the efficiency of feature extraction, of feature correlation calculation, and even of the whole network requires further improvement. This paper proposes temporal attention and explores the impact of the dependency-scope size of temporal attention on change detection performance. In addition, building on the Temporal Attention Module (TAM), we introduce a more efficient and lightweight version, the Dynamic Receptive Temporal Attention Module (DRTAM), and propose Concurrent Horizontal and Vertical Attention (CHVA) to improve the accuracy of the network on specific challenging entities. On the street scene datasets `GSV', `TSUNAMI' and `VL-CMU-CD', our approach achieves excellent performance, establishing new state-of-the-art scores without bells and whistles, while maintaining an efficiency suitable for autonomous vehicles.
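To make the core idea of temporal attention concrete, the sketch below shows a minimal attention block that correlates the feature maps of the "before" (t0) and "after" (t1) images: queries come from one time step, keys and values from the other. This is an illustrative assumption-laden sketch, not the paper's implementation; the class name `TemporalAttention` and the 1x1-convolution projections are hypothetical, and the paper's DRTAM additionally restricts the dependency scope (receptive field) of the attention, which this global-attention example does not do.

```python
# Minimal sketch of temporal attention between paired feature maps (hypothetical,
# not the authors' DRTAM): queries from the t1 features, keys/values from t0.
import torch
import torch.nn as nn


class TemporalAttention(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.q = nn.Conv2d(channels, channels, kernel_size=1)  # queries from time t1
        self.k = nn.Conv2d(channels, channels, kernel_size=1)  # keys from time t0
        self.v = nn.Conv2d(channels, channels, kernel_size=1)  # values from time t0
        self.scale = channels ** -0.5

    def forward(self, f_t0: torch.Tensor, f_t1: torch.Tensor) -> torch.Tensor:
        b, c, h, w = f_t1.shape
        q = self.q(f_t1).flatten(2).transpose(1, 2)   # (B, HW, C)
        k = self.k(f_t0).flatten(2)                   # (B, C, HW)
        v = self.v(f_t0).flatten(2).transpose(1, 2)   # (B, HW, C)
        attn = torch.softmax(q @ k * self.scale, dim=-1)      # (B, HW, HW)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return out  # temporally attended features for a change-detection decoder


# Usage: correlate encoder features of the two time steps.
f_t0 = torch.randn(1, 64, 32, 32)
f_t1 = torch.randn(1, 64, 32, 32)
print(TemporalAttention(64)(f_t0, f_t1).shape)  # torch.Size([1, 64, 32, 32])
```

The dependency-scope question studied in the paper corresponds to limiting which t0 positions each t1 query may attend to, trading accuracy against the cost of the full (HW x HW) attention map computed above.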
