Cross-Model Transfer of Task Vectors via Few-Shot Orthogonal Alignment

Abstract

Task arithmetic enables efficient model editing by representing task-specific changes as vectors in parameter space. It typically assumes, however, that the source and target models are initialized from the same pre-trained parameters. This assumption limits its applicability in cross-model transfer settings, where models are independently pre-trained on different datasets. To address this challenge, we propose a method based on few-shot orthogonal alignment, which aligns task vectors to the parameter space of a differently pre-trained target model. These transformations preserve key properties of task vectors, such as norm and rank, and are learned using only a small number of labeled examples. We evaluate the method using two Vision Transformers pre-trained on YFCC100M and LAION400M, and test on eight classification datasets. Experimental results show that our method improves transfer accuracy over direct task vector application and achieves performance comparable to few-shot fine-tuning, while maintaining the modularity and reusability of task vectors. Our code is available at this https URL.
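The norm- and rank-preservation property claimed for the alignment can be illustrated with a minimal sketch. Here a random orthogonal matrix stands in for the learned per-layer transformation (the paper's actual learning procedure is not reproduced; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
delta = rng.standard_normal((8, 8))  # task vector (weight delta) for one layer

# Random orthogonal matrix via QR decomposition, as a stand-in
# for the orthogonal transformation learned from few-shot examples.
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))

aligned = Q @ delta  # task vector mapped into the target model's space

# Orthogonal maps preserve Frobenius norm and rank.
assert np.isclose(np.linalg.norm(aligned), np.linalg.norm(delta))
assert np.linalg.matrix_rank(aligned) == np.linalg.matrix_rank(delta)
```

Because the transformation is orthogonal, the aligned vector can still be added to, negated, or composed with other task vectors, which is what keeps the approach modular.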

@article{kawamoto2025_2505.12021,
  title={Cross-Model Transfer of Task Vectors via Few-Shot Orthogonal Alignment},
  author={Kazuhiko Kawamoto and Atsuhiro Endo and Hiroshi Kera},
  journal={arXiv preprint arXiv:2505.12021},
  year={2025}
}