
HOFT: Householder Orthogonal Fine-tuning

Abstract

Adaptation of foundation models using low-rank methods is a widespread approach. Another way to adapt these models is to employ orthogonal fine-tuning methods, which offer good generalization properties but are less time and memory efficient. In this work, we propose Householder Orthogonal Fine-tuning (HOFT), a novel orthogonal fine-tuning method that aims to reduce time and space complexity. Moreover, some theoretical properties of the orthogonal fine-tuning paradigm are explored. From this exploration, Scaled Householder Orthogonal Fine-tuning (SHOFT) is proposed. Both HOFT and SHOFT are evaluated on downstream tasks, namely commonsense reasoning, machine translation, subject-driven generation, and mathematical reasoning. Compared with state-of-the-art adaptation methods, HOFT and SHOFT show comparable or better results.
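
The sketch below is a minimal, illustrative take on the core idea behind Householder-based orthogonal fine-tuning, not the authors' implementation: the frozen pretrained weight is rotated by an orthogonal matrix built as a product of a few Householder reflections, so only the reflection vectors are trained. The module name, the number of reflections, and the initialization are assumptions; the paper's exact parameterization (and the SHOFT scaling variant) may differ.

```python
import torch
import torch.nn as nn

class HouseholderOrthogonalLinear(nn.Module):
    """Illustrative sketch: rotate a frozen linear layer's weight with an
    orthogonal matrix Q = H_r ... H_1, where each H_k = I - 2 v_k v_k^T is a
    Householder reflection and only the vectors v_k are trainable."""

    def __init__(self, base_linear: nn.Linear, num_reflections: int = 8):
        super().__init__()
        # Frozen pretrained weight (out_features x in_features) and bias.
        self.register_buffer("weight", base_linear.weight.detach().clone())
        self.bias = base_linear.bias
        d = self.weight.shape[0]
        # Trainable Householder vectors (random init here; the paper's
        # initialization scheme may differ).
        self.v = nn.Parameter(torch.randn(num_reflections, d) * 1e-3)

    def orthogonal_matrix(self) -> torch.Tensor:
        d = self.v.shape[1]
        Q = torch.eye(d, device=self.v.device, dtype=self.v.dtype)
        for vk in self.v:
            vk = vk / (vk.norm() + 1e-8)                # unit reflection vector
            Q = Q - 2.0 * torch.outer(vk, vk @ Q)       # left-apply H_k = I - 2 v v^T
        return Q                                        # orthogonal by construction

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        W = self.orthogonal_matrix() @ self.weight      # rotate the frozen weight
        return nn.functional.linear(x, W, self.bias)
```

Because a product of Householder reflections is always orthogonal, the rotated weight preserves the pairwise angles of the pretrained neurons, while training only `num_reflections` vectors keeps the parameter and memory cost far below a full orthogonal matrix.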

@article{arcas2025_2505.16531,
  title={HOFT: Householder Orthogonal Fine-tuning},
  author={Alejandro Moreno Arcas and Albert Sanchis and Jorge Civera and Alfons Juan},
  journal={arXiv preprint arXiv:2505.16531},
  year={2025}
}