Distinct Computations Emerge From Compositional Curricula in In-Context Learning

Main: 9 pages
Figures: 20
Bibliography: 4 pages
Appendix: 8 pages
Abstract

In-context learning (ICL) research often considers learning a function in context from a uniform sample of input-output pairs. Here, we investigate how presenting a compositional subtask curriculum in context alters the computations a transformer learns. We design a compositional algorithmic task based on modular exponentiation: a double exponential task composed of two single exponential subtasks, and we train transformer models to learn the task in context. We compare (a) models trained with an in-context curriculum consisting of single exponential subtasks and (b) models trained directly on the double exponential task without such a curriculum. We show that models trained with a subtask curriculum can perform zero-shot inference on unseen compositional tasks and are more robust given the same context length. We study how the task and subtasks are represented across the two training regimes, and we find that the models employ diverse strategies modulated by the specific curriculum design.
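The abstract does not spell out the task construction, but a minimal sketch helps fix ideas: a double exponential map x -> a^(b^x) mod p decomposes into two single exponential subtasks via Fermat's little theorem, since a^(b^x) mod p = a^(b^x mod (p-1)) mod p when p is prime and gcd(a, p) = 1. Everything below (the specific moduli, bases, and the sampled input-output pair format) is an illustrative assumption, not the paper's actual setup.

import random

def make_single_exp_task(base: int, mod: int):
    """Single exponential subtask: x -> base^x mod `mod`."""
    return lambda x: pow(base, x, mod)

def make_double_exp_task(a: int, b: int, p: int):
    """Double exponential task x -> a^(b^x) mod p, composed of two
    single exponential subtasks (valid for prime p with gcd(a, p) = 1):
        a^(b^x) mod p == a^(b^x mod (p-1)) mod p.
    """
    inner = make_single_exp_task(b, p - 1)   # subtask 1: x -> b^x mod (p-1)
    outer = make_single_exp_task(a, p)       # subtask 2: y -> a^y mod p
    return lambda x: outer(inner(x))

def sample_context(task, n_pairs: int, x_max: int, rng: random.Random):
    """Sample input-output pairs of the kind presented in context."""
    return [(x, task(x)) for x in (rng.randrange(x_max) for _ in range(n_pairs))]

if __name__ == "__main__":
    rng = random.Random(0)
    p, a, b = 11, 2, 3                       # small illustrative values
    # A subtask curriculum: examples of each subtask, then the composed task.
    print("subtask 1:", sample_context(make_single_exp_task(b, p - 1), 4, 16, rng))
    print("subtask 2:", sample_context(make_single_exp_task(a, p), 4, p - 1, rng))
    print("composed: ", sample_context(make_double_exp_task(a, b, p), 4, 16, rng))

In this sketch, the "curriculum" condition would present subtask pairs before composed-task pairs in the same context, while the "no-curriculum" condition would present composed-task pairs alone.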

@article{lee2025_2506.13253,
  title={Distinct Computations Emerge From Compositional Curricula in In-Context Learning},
  author={Jin Hwa Lee and Andrew K. Lampinen and Aaditya K. Singh and Andrew M. Saxe},
  journal={arXiv preprint arXiv:2506.13253},
  year={2025}
}