
Teaching Large Language Models Number-Focused Headline Generation With Key Element Rationales

5 February 2025
Zhen Qian
Xiuzhen Zhang
Xiaofei Xu
Xiwei Xu
Abstract

Number-focused headline generation is a summarization task requiring both high textual quality and precise numerical accuracy, which poses a unique challenge for Large Language Models (LLMs). Existing studies focus on either textual quality or numerical reasoning alone and are therefore inadequate for this challenge. In this paper, we propose a novel chain-of-thought framework that uses rationales comprising the key elements of Topic, Entities, and Numerical reasoning (TEN) in news articles to enhance the capability of LLMs to generate topic-aligned, high-quality texts with precise numerical accuracy. Specifically, a teacher LLM is employed to generate TEN rationales as supervision data, which are then used to teach and fine-tune a student LLM. Our approach teaches the student LLM to automatically generate rationales, with enhanced capability for numerical reasoning and topic-aligned numerical headline generation. Experiments show that our approach achieves superior performance in both textual quality and numerical accuracy.
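The teacher–student pipeline described above can be sketched in a few lines: a TEN rationale (Topic, Entities, Numerical reasoning) is formatted as an intermediate reasoning string, then paired with the article and gold headline as a supervision example so the student learns to emit the rationale before the headline. This is a minimal illustrative sketch; the function names, prompt wording, and example fields are assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of assembling TEN-rationale supervision data.
# All names and prompt formats here are illustrative assumptions,
# not the paper's actual API or templates.

def build_ten_rationale(topic, entities, numerical_reasoning):
    """Format the three key elements into a single rationale string."""
    return (
        f"Topic: {topic}\n"
        f"Entities: {', '.join(entities)}\n"
        f"Numerical reasoning: {numerical_reasoning}"
    )

def make_supervision_example(article, rationale, headline):
    """Pair an article with its teacher-generated rationale and gold
    headline, so fine-tuning teaches the student to produce the
    rationale first and then the number-focused headline."""
    return {
        "input": f"Article:\n{article}\n\nGenerate a rationale, then a headline.",
        "target": f"{rationale}\nHeadline: {headline}",
    }

# Toy example with invented article content:
example = make_supervision_example(
    article="The city council approved a budget of $2.4 billion, up 6% from last year.",
    rationale=build_ten_rationale(
        topic="city budget approval",
        entities=["city council"],
        numerical_reasoning="$2.4 billion is a 6% increase over the prior year",
    ),
    headline="City council passes $2.4 billion budget, a 6% rise",
)
print(example["target"])
```

In a full system, the rationale field would come from prompting the teacher LLM rather than being hand-written, and the resulting (input, target) pairs would feed a standard supervised fine-tuning loop for the student.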

@article{qian2025_2502.03129,
  title={Teaching Large Language Models Number-Focused Headline Generation With Key Element Rationales},
  author={Zhen Qian and Xiuzhen Zhang and Xiaofei Xu and Feng Xia},
  journal={arXiv preprint arXiv:2502.03129},
  year={2025}
}