HATA: Trainable and Hardware-Efficient Hash-Aware Top-k Attention for Scalable Large Model Inference

3 June 2025
Ping Gong, Jiawei Yi, Shengnan Wang, Juncheng Zhang, Zewen Jin, Ouxiang Zhou, Ruibo Liu, Guanbin Xu, Youhui Bai, Bowen Ye, Kun Yuan, Tong Yang, Gong Zhang, Renhai Chen, Feng Wu, Cheng Li
arXiv (abs) · PDF · HTML
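The title refers to hash-aware top-k attention, i.e. using cheap hash codes to pick a small set of relevant keys before running exact attention. The sketch below is only a rough illustration of that general idea under assumed details, not the paper's actual algorithm: it scores keys by Hamming similarity of binary codes from a single learned projection (the hypothetical `W_hash`) and attends over the top-k matches.

```python
import numpy as np

def hash_topk_attention(q, K, V, W_hash, k=8):
    """Illustrative top-k attention with hash-based candidate selection.

    q: (d,) query; K, V: (n, d) keys/values.
    W_hash: (d, b) projection producing b-bit binary codes; a stand-in for
    the trainable hash functions named in the title, not the paper's method.
    """
    # Binary hash codes for the query and all keys.
    q_code = (q @ W_hash) > 0             # (b,)
    K_code = (K @ W_hash) > 0             # (n, b)

    # Hamming similarity (number of matching bits) as a cheap relevance score.
    sim = (K_code == q_code).sum(axis=1)   # (n,)

    # Keep only the k most similar keys.
    idx = np.argpartition(-sim, k)[:k]

    # Exact softmax attention restricted to the selected keys.
    scores = K[idx] @ q / np.sqrt(q.shape[0])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ V[idx]

# Toy usage with random data.
rng = np.random.default_rng(0)
d, n, b = 64, 1024, 128
q = rng.standard_normal(d)
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
W_hash = rng.standard_normal((d, b))
out = hash_topk_attention(q, K, V, W_hash, k=32)
print(out.shape)  # (64,)
```

In this toy version the hash projection is random; the paper's contribution, per its title, is making such hashing trainable and hardware-efficient, which this sketch does not attempt to reproduce.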

Papers citing "HATA: Trainable and Hardware-Efficient Hash-Aware Top-k Attention for Scalable Large Model Inference"

No citing papers found.