MR-Align: Meta-Reasoning Informed Factuality Alignment for Large Reasoning Models

27 October 2025
Xinming Wang, Jian Xu, Bin Yu, Sheng Lian, Hongzhu Yi, Y. Chen, Yingjian Zhu, Boran Wang, Hongming Yang, Han Hu, Xu-Yao Zhang, Cheng-Lin Liu
Communities: HILM, LRM
arXiv: 2510.24794 (abs | PDF | HTML) · HuggingFace (6 upvotes)

Papers citing "MR-Align: Meta-Reasoning Informed Factuality Alignment for Large Reasoning Models"

No citing papers found (0 papers shown).