Introduce the Result Into Self-Attention

21 September 2021
Chengcheng Ye
arXiv:2109.13860
Abstract

Traditional self-attention mechanisms in convolutional networks, such as SENet and CBAM, use only the output of the previous layer as input to the attention network. In this paper, we propose a new attention modification that obtains the output of the classification network in advance and uses it as part of the input to the attention network. Specifically, we use the auxiliary classifier proposed in GoogLeNet to obtain an early prediction and pass it into the attention networks. We added this mechanism to SE-ResNet in our experiments and achieved a classification accuracy improvement of up to 1.94% on CIFAR-100.
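
To make the idea concrete, here is a minimal PyTorch sketch of an SE-style block whose excitation MLP also sees auxiliary-classifier logits. The abstract does not specify how the early prediction is fused with the channel statistics, so the concatenation before the bottleneck MLP, and the names ResultSE, aux_logits, and num_classes, are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn as nn

class ResultSE(nn.Module):
    """Sketch of an SE block conditioned on an early class prediction.

    The fusion scheme (concatenating the squeezed channel vector with
    auxiliary-classifier logits) is an assumption for illustration.
    """

    def __init__(self, channels, num_classes, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Standard SE bottleneck MLP, widened to accept the logits
        # of the auxiliary classifier alongside the channel statistics.
        self.fc = nn.Sequential(
            nn.Linear(channels + num_classes, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x, aux_logits):
        b, c, _, _ = x.shape
        squeeze = self.pool(x).view(b, c)           # global average pool
        fused = torch.cat([squeeze, aux_logits], dim=1)
        scale = self.fc(fused).view(b, c, 1, 1)     # per-channel weights
        return x * scale

# Hypothetical usage: a GoogLeNet-style auxiliary head attached to an
# intermediate feature map produces logits that are then fed to the block.
x = torch.randn(8, 256, 16, 16)
aux_logits = torch.randn(8, 100)    # e.g. 100 CIFAR-100 classes
block = ResultSE(channels=256, num_classes=100)
out = block(x, aux_logits)          # same shape as x

Under these assumptions, the attention weights depend not only on the current feature map but also on what the network already believes the class to be, which is the mechanism the abstract describes.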
