Large Language Model Meets Constraint Propagation

29 May 2025
Alexandre Bonlarron
Florian Régin
Elisabetta De Maria
Jean-Charles Régin
arXiv: https://arxiv.org/abs/2505.24012
Main: 10 pages · 2 figures · 3 tables · Bibliography: 2 pages
Abstract

Large Language Models (LLMs) excel at generating fluent text but struggle to enforce external constraints because they generate tokens sequentially without explicit control mechanisms. GenCP addresses this limitation by combining LLM predictions with Constraint Programming (CP) reasoning, formulating text generation as a Constraint Satisfaction Problem (CSP). In this paper, we improve GenCP by integrating Masked Language Models (MLMs) for domain generation, which allows bidirectional constraint propagation that leverages both past and future tokens. This integration bridges the gap between token-level prediction and structured constraint enforcement, leading to more reliable and constraint-aware text generation. Our evaluation on COLLIE benchmarks demonstrates that incorporating domain preview via MLM calls significantly improves GenCP's performance. Although this approach incurs additional MLM calls and, in some cases, increased backtracking, the overall effect is a more efficient use of LLM inferences and an enhanced ability to generate feasible and meaningful solutions, particularly in tasks with strict content constraints.
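To make the idea concrete, below is a minimal sketch of the mechanism the abstract describes: each word position is treated as a CSP variable, a masked language model previews that variable's domain while conditioning on both past and future context, the domain is pruned by the constraint, and the search backtracks when a domain empties. This is an illustration, not the authors' GenCP implementation; a HuggingFace fill-mask pipeline stands in for the MLM domain generator, and the helper names (domain_preview, generate_with_cp) and the toy length constraint are assumptions made for the example.

```python
# Sketch of MLM-assisted constraint propagation for text generation,
# loosely following the GenCP idea summarized in the abstract.
from transformers import pipeline

# A masked LM serves as the domain generator (illustrative model choice).
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def domain_preview(prefix, suffix, top_k=20):
    """Preview candidate tokens at the next position, conditioning on
    both the committed prefix and the fixed future context (suffix)."""
    text = f"{prefix} {fill_mask.tokenizer.mask_token} {suffix}".strip()
    return [c["token_str"] for c in fill_mask(text, top_k=top_k)]

def generate_with_cp(n_words, constraint, suffix=""):
    """Depth-first search over word variables; each domain is the MLM
    preview filtered by the constraint (the propagation step)."""
    def search(assigned):
        if len(assigned) == n_words:
            return assigned
        prefix = " ".join(assigned)
        # Propagation: discard candidates that violate the constraint.
        domain = [w for w in domain_preview(prefix, suffix)
                  if constraint(w)]
        for word in domain:            # try candidates in MLM score order
            result = search(assigned + [word])
            if result is not None:
                return result
        return None                    # empty domain -> backtrack

    return search([])

# Toy content constraint: every word must have at least four letters.
sentence = generate_with_cp(5, lambda w: w.isalpha() and len(w) >= 4,
                            suffix="is important .")
print(sentence)
```

The suffix argument is what makes the propagation bidirectional in this sketch: the MLM scores each candidate against both the tokens already placed and the future context, which is the "domain preview" role the abstract assigns to MLM calls. The backtracking on an empty domain mirrors the extra search cost the authors note the approach can incur.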

@article{bonlarron2025_2505.24012,
  title={Large Language Model Meets Constraint Propagation},
  author={Alexandre Bonlarron and Florian Régin and Elisabetta De Maria and Jean-Charles Régin},
  journal={arXiv preprint arXiv:2505.24012},
  year={2025}
}