ResearchTrend.AI
Attend First, Consolidate Later: On the Importance of Attention in Different LLM Layers

5 September 2024
Amit Ben Artzy, Roy Schwartz
arXiv: 2409.03621

Papers citing "Attend First, Consolidate Later: On the Importance of Attention in Different LLM Layers"

6 papers
Spotlight Your Instructions: Instruction-following with Dynamic Attention Steering
Praveen Venkateswaran, Danish Contractor
17 May 2025
Empowering GraphRAG with Knowledge Filtering and Integration
Kai Guo, Harry Shomer, Shenglai Zeng, Haoyu Han, Yu Wang, Jiliang Tang
18 Mar 2025
AttentionRAG: Attention-Guided Context Pruning in Retrieval-Augmented Generation
Yixiong Fang, Tianran Sun, Yuling Shi, Xiaodong Gu
13 Mar 2025
Looking Beyond The Top-1: Transformers Determine Top Tokens In Order
Daria Lioubashevski, Tomer Schlank, Gabriel Stanovsky, Ariel Goldstein
26 Oct 2024
From Tokens to Words: On the Inner Lexicon of LLMs
Guy Kaplan, Matanel Oren, Yuval Reif, Roy Schwartz
08 Oct 2024
Teaching Machines to Read and Comprehend
Karl Moritz Hermann, Tomás Kociský, Edward Grefenstette, L. Espeholt, W. Kay, Mustafa Suleyman, Phil Blunsom
10 Jun 2015