Training-Free Tokenizer Transplantation via Orthogonal Matching Pursuit

7 June 2025
Charles Goddard
Fernando Fernandes Neto
Main: 11 pages · 1 figure · 7 tables · Bibliography: 4 pages · Appendix: 2 pages
Abstract

We present a training-free method to transplant tokenizers in pretrained large language models (LLMs) by reconstructing unseen token embeddings via Orthogonal Matching Pursuit (OMP). Specifically, we approximate each out-of-vocabulary token as a sparse linear combination of shared tokens in two phases: first, we compute each new token's representation in the donor embedding space using a small dictionary of shared anchor tokens; then, we transfer these same sparse coefficients back into the base model's embedding space. On two challenging cross-tokenizer tasks, Llama→Mistral NeMo (12B) and Qwen→Llama (1B), we show that OMP achieves the best zero-shot preservation of the base model's performance across multiple benchmarks, while other zero-shot approaches degrade significantly. Compared to baselines (zero-init, mean-init, and existing approaches such as WECHSEL, FOCUS, and ZETT), OMP consistently achieves the best overall performance, effectively bridging large tokenizer discrepancies without gradient updates. Our analysis further identifies mismatched numerical tokenization schemes as a critical challenge for preserving mathematical reasoning capabilities. This technique enables direct reuse of pretrained model weights with new tokenizers, facilitating cross-tokenizer knowledge distillation, speculative decoding, ensembling, merging, and domain-specific vocabulary adaptation. We integrate our method into the open-source mergekit-tokensurgeon tool for post hoc vocabulary realignment.
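The two-phase procedure described in the abstract can be sketched with a minimal greedy OMP implementation. This is an illustrative toy, not the authors' mergekit-tokensurgeon code: the dictionary sizes, embedding dimensions, and the `omp` helper are all assumptions for the example, and the anchor matrices are random stand-ins for real shared-token embeddings.

```python
import numpy as np

def omp(target, dictionary, k):
    """Greedy Orthogonal Matching Pursuit: approximate `target` as a
    sparse combination of at most k dictionary columns (atoms).
    Assumes atoms have roughly equal norms, so raw correlations are
    comparable across atoms."""
    residual = target.copy()
    support = []
    coeffs = np.zeros(0)
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        corr = dictionary.T @ residual
        idx = int(np.argmax(np.abs(corr)))
        if idx not in support:
            support.append(idx)
        # Re-fit coefficients by least squares on the chosen support.
        sub = dictionary[:, support]
        coeffs, *_ = np.linalg.lstsq(sub, target, rcond=None)
        residual = target - sub @ coeffs
    return support, coeffs

# Hypothetical setup: shared "anchor" tokens have embeddings in both
# the donor model (d_donor dims) and the base model (d_base dims).
rng = np.random.default_rng(0)
d_donor, d_base, n_shared = 64, 48, 200
donor_anchors = rng.normal(size=(d_donor, n_shared))
base_anchors = rng.normal(size=(d_base, n_shared))

# A new (out-of-vocabulary) token's embedding in the donor space;
# here constructed to be exactly 3-sparse over the anchors.
new_token_donor = donor_anchors[:, [3, 17, 42]] @ np.array([0.5, -1.0, 0.25])

# Phase 1: sparse-code the new token against the donor-space anchors.
support, coeffs = omp(new_token_donor, donor_anchors, k=3)

# Phase 2: reuse the same sparse coefficients over the base model's
# anchor embeddings to synthesize the new token's base-space embedding.
new_token_base = base_anchors[:, support] @ coeffs
```

The design point illustrated here is that only the coefficients and the support (which shared tokens were selected) cross the model boundary; the donor and base embedding spaces can even have different dimensionalities, since no direct linear map between them is learned.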

@article{goddard2025_2506.06607,
  title={Training-Free Tokenizer Transplantation via Orthogonal Matching Pursuit},
  author={Charles Goddard and Fernando Fernandes Neto},
  journal={arXiv preprint arXiv:2506.06607},
  year={2025}
}