The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink

11 April 2022 · arXiv:2204.05149

David A. Patterson, Joseph E. Gonzalez, Urs Hölzle, Quoc V. Le, Chen Liang, Lluís-Miquel Munguía, D. Rothchild, David R. So, Maud Texier, J. Dean

AI4CE

Papers citing "The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink"

14 / 114 papers shown

Fast-FNet: Accelerating Transformer Encoder Models via Efficient Fourier Layers
Nurullah Sevim, Ege Ozan Özyedek, Furkan Şahinuç, Aykut Koç
85 · 12 · 0 · 26 Sep 2022

Algorithmic decision making methods for fair credit scoring
Darie Moldovan
FaML
77 · 7 · 0 · 16 Sep 2022

Learning sparse auto-encoders for green AI image coding
Cyprien Gille, F. Guyard, Marc Antonini, Michel Barlaud
46 · 3 · 0 · 09 Sep 2022

What Do NLP Researchers Believe? Results of the NLP Community Metasurvey
Julian Michael, Ari Holtzman, Alicia Parrish, Aaron Mueller, Alex Jinpeng Wang, ..., Divyam Madaan, Nikita Nangia, Richard Yuanzhe Pang, Jason Phang, Sam Bowman
71 · 39 · 0 · 26 Aug 2022

POET: Training Neural Networks on Tiny Devices with Integrated Rematerialization and Paging
Shishir G. Patil, Paras Jain, P. Dutta, Ion Stoica, Joseph E. Gonzalez
79 · 37 · 0 · 15 Jul 2022

On the Principles of Parsimony and Self-Consistency for the Emergence of Intelligence
Yi Ma, Doris Y. Tsao, H. Shum
142 · 78 · 0 · 11 Jul 2022

Sustainable Computing -- Without the Hot Air
Noman Bashir, David Irwin, Prashant J. Shenoy, Abel Souza
110 · 16 · 0 · 30 Jun 2022

Measuring the Carbon Intensity of AI in Cloud Instances
Jesse Dodge, Taylor Prewitt, Rémi Tachet des Combes, Erika Odmark, Roy Schwartz, Emma Strubell, A. Luccioni, Noah A. Smith, Nicole DeCario, Will Buchanan
90 · 199 · 0 · 10 Jun 2022

Towards Climate Awareness in NLP Research
Daniel Hershcovich, Nicolas Webersinke, Mathias Kraus, J. Bingler, Markus Leippold
123 · 34 · 0 · 10 May 2022

Resource-efficient domain adaptive pre-training for medical images
Y. Mehmood, U. I. Bajwa, Xianfang Sun
81 · 1 · 0 · 28 Apr 2022

Towards Green Automated Machine Learning: Status Quo and Future Directions
Tanja Tornede, Alexander Tornede, Jonas Hanselle, Marcel Wever, F. Mohr, Eyke Hüllermeier
124 · 38 · 0 · 10 Nov 2021

Recent Advances in Natural Language Processing via Large Pre-Trained Language Models: A Survey
Bonan Min, Hayley L Ross, Elior Sulem, Amir Pouran Ben Veyseh, Thien Huu Nguyen, Oscar Sainz, Eneko Agirre, Ilana Heinz, Dan Roth
LM&MA, VLM, AI4CE
195 · 1,094 · 0 · 01 Nov 2021

Can Federated Learning Save The Planet?
Xinchi Qiu, Titouan Parcollet, Daniel J. Beutel, Taner Topal, Akhil Mathur, Nicholas D. Lane
87 · 81 · 0 · 13 Oct 2020

The Computational Limits of Deep Learning
Neil C. Thompson, Kristjan Greenewald, Keeheon Lee, Gabriel F. Manso
VLM
91 · 532 · 0 · 10 Jul 2020