ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Is the Number of Trainable Parameters All That Actually Matters?

24 September 2021
A. Chatelain
Amine Djeghri
Daniel Hesslow
Julien Launay
Iacopo Poli

Papers citing "Is the Number of Trainable Parameters All That Actually Matters?"

5 / 5 papers shown
A Moonshot for AI Oracles in the Sciences
Bryan Kaiser
Tailin Wu
Maike Sonnewald
Colin Thackray
Skylar Callis
AI4CE
25 Jun 2024
A Comprehensive Evaluation of Parameter-Efficient Fine-Tuning on Software Engineering Tasks
Wentao Zou
Qi Li
Jidong Ge
Chuanyi Li
Xiaoyu Shen
LiGuo Huang
Bin Luo
25 Dec 2023
Scaling Laws Beyond Backpropagation
Matthew J. Filipovich
Alessandro Cappelli
Daniel Hesslow
Julien Launay
26 Oct 2022
What Changes Can Large-scale Language Models Bring? Intensive Study on HyperCLOVA: Billions-scale Korean Generative Pretrained Transformers
Boseop Kim
Hyoungseok Kim
Sang-Woo Lee
Gichang Lee
Donghyun Kwak
...
Jaewook Kang
Inho Kang
Jung-Woo Ha
W. Park
Nako Sung
VLM
10 Sep 2021
Scaling Laws for Neural Language Models
Jared Kaplan
Sam McCandlish
T. Henighan
Tom B. Brown
B. Chess
R. Child
Scott Gray
Alec Radford
Jeff Wu
Dario Amodei
23 Jan 2020