
Towards Prompt Generalization: Grammar-aware Cross-Prompt Automated Essay Scoring

12 February 2025
Heejin Do
Taehee Park
Sangwon Ryu
Gary Geunbae Lee
Abstract

In automated essay scoring (AES), recent efforts have shifted toward cross-prompt settings, which score essays on prompts unseen during training for practical applicability. However, prior methods trained on essay-score pairs from specific prompts struggle to obtain prompt-generalized essay representations. In this work, we propose grammar-aware cross-prompt trait scoring (GAPS), which internally captures prompt-independent syntactic aspects to learn generic essay representations. We obtain grammatically corrected versions of essays via grammar error correction (GEC) and design the AES model to seamlessly integrate this information. By internally referring to both the corrected and the original essays, the model can focus on generic features during training. Empirical experiments validate our method's generalizability, showing remarkable improvements on prompt-independent and grammar-related traits. Furthermore, GAPS achieves notable QWK gains in the most challenging cross-prompt scenario, highlighting its strength in evaluating unseen prompts.
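The abstract's core idea can be illustrated with a minimal sketch, not the authors' implementation: represent an essay from both its original text and a grammar-corrected version, so that the scorer can attend to prompt-independent features. The `toy_encode` features and the paired-input combination below are hypothetical stand-ins for the paper's learned encoder and integration mechanism.

```python
def toy_encode(text: str) -> list[float]:
    """Hypothetical encoder: crude surface features (word count, average
    word length, type/token ratio) standing in for a learned essay
    representation."""
    words = text.lower().split()
    if not words:
        return [0.0, 0.0, 0.0]
    return [
        float(len(words)),
        sum(len(w) for w in words) / len(words),
        len(set(words)) / len(words),
    ]

def dual_representation(original: str, corrected: str) -> list[float]:
    """Concatenate representations of the original essay and its
    GEC-corrected counterpart, mirroring the idea of internally
    referring to both texts when scoring."""
    return toy_encode(original) + toy_encode(corrected)

# The corrected string here is assumed to come from an upstream GEC system.
rep = dual_representation(
    "He go to school every day .",
    "He goes to school every day .",
)
```

In this sketch the downstream trait scorer would consume `rep`; differences between the two halves of the vector carry grammar-related signal, while their shared content is prompt-independent.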

@article{do2025_2502.08450,
  title={Towards Prompt Generalization: Grammar-aware Cross-Prompt Automated Essay Scoring},
  author={Heejin Do and Taehee Park and Sangwon Ryu and Gary Geunbae Lee},
  journal={arXiv preprint arXiv:2502.08450},
  year={2025}
}