
Low-resource domain adaptation while minimizing energy and hardware resource consumption

Abstract

Training Large Language Models (LLMs) is costly in terms of energy, hardware, and annotated data, often resulting in a positionality rooted in predominant cultures and values (Santy et al., 2023). Domain adaptation has emerged as a promising strategy to better align models with diverse cultural and value contexts (Hershcovich et al., 2022), but its computational cost remains a significant barrier, particularly for research groups lacking access to large-scale infrastructure. In this paper, we evaluate how different numerical precision formats and data parallelization strategies impact both training speed (as a proxy for energy and hardware consumption) and model accuracy, with the goal of facilitating domain adaptation in low-resource environments. Our findings are relevant to any setting where energy efficiency, accessibility, or limited hardware availability is a key concern.
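As an illustration of the knobs the abstract refers to, the sketch below shows a single mixed-precision, data-parallel training step in PyTorch. It is a minimal sketch, not the authors' implementation: the HuggingFace-style model(**batch).loss interface, the fp16/bf16 dtype choice, and the DistributedDataParallel wrapping are illustrative assumptions.

    # Minimal sketch (assumed setup, not the paper's code): one mixed-precision
    # training step; precision format and parallelization are the levers studied.
    import torch
    from torch.nn.parallel import DistributedDataParallel as DDP

    def train_step(model, batch, optimizer, scaler, dtype=torch.float16):
        optimizer.zero_grad(set_to_none=True)
        # Forward pass runs in the chosen low-precision format (fp16 or bf16),
        # trading numerical accuracy for speed and memory.
        with torch.autocast(device_type="cuda", dtype=dtype):
            loss = model(**batch).loss  # assumes a HF-style model returning .loss
        # GradScaler guards fp16 against gradient underflow; with bf16 the
        # scaling is effectively unnecessary but harmless.
        scaler.scale(loss).backward()
        scaler.step(optimizer)
        scaler.update()
        return loss.item()

    # Data parallelism (illustrative): after torch.distributed initialization,
    # wrapping the model averages gradients across devices at every step, e.g.
    #   model = DDP(model.cuda(), device_ids=[local_rank])
    #   scaler = torch.cuda.amp.GradScaler()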

@article{maina2025_2506.08433,
  title={Low-resource domain adaptation while minimizing energy and hardware resource consumption},
  author={Hernán Maina and Nicolás Wolovick and Luciana Benotti},
  journal={arXiv preprint arXiv:2506.08433},
  year={2025}
}