
MHSAN: Multi-Head Self-Attention Network for Visual Semantic Embedding

11 January 2020
Geondo Park, Chihye Han, Wonjun Yoon, Dae-Shik Kim
arXiv (abs) · PDF · HTML

Papers citing "MHSAN: Multi-Head Self-Attention Network for Visual Semantic Embedding"

1 / 1 papers shown
Title: Paying More Attention to Self-attention: Improving Pre-trained Language Models via Attention Guiding
Authors: Shanshan Wang, Zhumin Chen, Zhaochun Ren, Huasheng Liang, Qiang Yan, Pengjie Ren
Published: 06 Apr 2022