ResearchTrend.AI

CoCA: Fusing Position Embedding with Collinear Constrained Attention in Transformers for Long Context Window Extending (arXiv:2309.08646)
15 September 2023
Shiyi Zhu
Jingting Ye
Wei Jiang
Siqiao Xue
Qi Zhang
Yifan Wu
Jianguo Li

Papers citing "CoCA: Fusing Position Embedding with Collinear Constrained Attention in Transformers for Long Context Window Extending"

4 / 4 papers shown
Shifting Long-Context LLMs Research from Input to Output
Yuhao Wu, Yushi Bai, Zhiqing Hu, Shangqing Tu, Ming Shan Hee, Juanzi Li, Roy Ka-Wei Lee
06 Mar 2025
PEER: Expertizing Domain-Specific Tasks with a Multi-Agent Framework and Tuning Methods
Yiying Wang, Xiaojing Li, Binzhu Wang, Yueyang Zhou, Yingru Lin, ..., Fei Yu, Zewei Zhao, Song Jin, Renji Gong, Wanqing Xu
09 Jul 2024
Train Short, Test Long: Attention with Linear Biases Enables Input Length Extrapolation
Ofir Press, Noah A. Smith, M. Lewis
27 Aug 2021
The Pile: An 800GB Dataset of Diverse Text for Language Modeling
Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, ..., Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy
31 Dec 2020