Tensor Train Low-rank Approximation (TT-LoRA): Democratizing AI with Accelerated LLMs
arXiv:2408.01008 · 2 August 2024
Afia Anjum, Maksim E. Eren, V. Setlur, Boian Alexandrov, Manish Bhattarai

Papers citing "Tensor Train Low-rank Approximation (TT-LoRA): Democratizing AI with Accelerated LLMs" (5 papers shown)
TT-LoRA MoE: Unifying Parameter-Efficient Fine-Tuning and Sparse Mixture-of-Experts
Pradip Kunwar, Minh Vu, Maanak Gupta, Mahmoud Abdelsalam, Manish Bhattarai
MoE · MoMe · 29 Apr 2025

Tensor Networks Meet Neural Networks: A Survey and Future Perspectives
Maolin Wang, Yu Pan, Zenglin Xu, Xiangli Yang, Guangxi Li, Andrzej Cichocki
22 Jan 2023

P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks
Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang
VLM · 14 Oct 2021

The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
VPVLM · 18 Apr 2021

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM · 20 Apr 2018