Prompting LLMs: Length Control for Isometric Machine Translation

5 June 2025
Dávid Javorský
Ondřej Bojar
François Yvon
Abstract

In this study, we explore the effectiveness of isometric machine translation across multiple language pairs (En→De, En→Fr, and En→Es) under the conditions of the IWSLT Isometric Shared Task 2022. Using eight open-source large language models (LLMs) of varying sizes, we investigate how different prompting strategies, varying numbers of few-shot examples, and demonstration selection influence translation quality and length control. We find that the phrasing of instructions, when aligned with the properties of the provided demonstrations, plays a crucial role in controlling the output length. Our experiments show that LLMs tend to produce shorter translations only when presented with extreme examples, while isometric demonstrations often lead the models to disregard length constraints. While few-shot prompting generally enhances translation quality, further improvements across 5-, 10-, and 20-shot settings are marginal. Finally, considering multiple outputs allows us to notably improve the overall tradeoff between length and quality, yielding state-of-the-art performance for some language pairs.
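The general recipe the abstract describes — a few-shot prompt with a length-control instruction, followed by picking among several candidate outputs the one closest in length to the source — can be sketched as below. The prompt wording, the example data, and the closest-character-count selection rule are illustrative assumptions for this sketch, not the paper's exact setup.

```python
# Hypothetical sketch of length-controlled few-shot translation prompting.
# The instruction text, demonstrations, and selection criterion are
# assumptions for illustration; they are not the authors' exact choices.

def build_prompt(source, demos, instruction):
    """Assemble a few-shot prompt from (source, target) demonstration pairs."""
    lines = [instruction]
    for src, tgt in demos:
        lines.append(f"English: {src}\nGerman: {tgt}")
    lines.append(f"English: {source}\nGerman:")
    return "\n\n".join(lines)

def pick_most_isometric(source, candidates):
    """Among several model outputs, keep the one whose character count
    is closest to the source's (a simple proxy for isometry)."""
    return min(candidates, key=lambda c: abs(len(c) - len(source)))

demos = [("Good morning.", "Guten Morgen.")]
instruction = ("Translate from English to German. Keep the translation "
               "roughly the same length as the source.")
prompt = build_prompt("Thank you very much.", demos, instruction)

# Candidates would normally come from sampling an LLM several times.
candidates = ["Vielen herzlichen Dank dafür.", "Danke vielmals.", "Vielen Dank."]
best = pick_most_isometric("Thank you very much.", candidates)
```

In practice the candidates would be multiple sampled generations from one of the evaluated LLMs, and quality would be weighed alongside length rather than length alone.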

@article{javorský2025_2506.04855,
  title={Prompting LLMs: Length Control for Isometric Machine Translation},
  author={Dávid Javorský and Ondřej Bojar and François Yvon},
  journal={arXiv preprint arXiv:2506.04855},
  year={2025}
}