Stay Tuned: An Empirical Study of the Impact of Hyperparameters on LLM Tuning in Real-World Applications
arXiv:2407.18990 · 25 July 2024
Alon Halfon, Shai Gretz, Ofir Arviv, Artem Spector, Orith Toledo-Ronen, Yoav Katz, L. Ein-Dor, Michal Shmueli-Scheuer, Noam Slonim
Papers citing "Stay Tuned: An Empirical Study of the Impact of Hyperparameters on LLM Tuning in Real-World Applications" (3 of 3 papers shown):

1. LoRA Land: 310 Fine-tuned LLMs that Rival GPT-4, A Technical Report (29 Apr 2024) [ALM]
   Justin Zhao, Timothy Wang, Wael Abid, Geoffrey Angus, Arnav Garg, Jeffery Kinnison, Alex Sherstinsky, Piero Molino, Travis Addair, Devvret Rishi

2. LexGLUE: A Benchmark Dataset for Legal Language Understanding in English (03 Oct 2021) [AILaw, ELM]
   Ilias Chalkidis, Abhik Jana, D. Hartung, M. Bommarito, Ion Androutsopoulos, Daniel Martin Katz, Nikolaos Aletras

3. Efficient Intent Detection with Dual Sentence Encoders (10 Mar 2020) [VLM]
   I. Casanueva, Tadas Temčinas, D. Gerz, Matthew Henderson, Ivan Vulić