MTPChat: A Multimodal Time-Aware Persona Dataset for Conversational Agents

9 February 2025
Wanqi Yang
Yanda Li
Meng Fang
Ling Chen
Abstract

Understanding temporal dynamics is critical for conversational agents, enabling effective content analysis and informed decision-making. However, time-aware datasets, particularly for persona-grounded conversations, remain scarce, which limits their scope and complexity. To address this gap, we introduce MTPChat, a multimodal, time-aware persona dialogue dataset that integrates linguistic, visual, and temporal elements within dialogue and persona memory. Leveraging MTPChat, we propose two time-sensitive tasks: Temporal Next Response Prediction (TNRP) and Temporal Grounding Memory Prediction (TGMP), both designed to assess a model's ability to understand implicit temporal cues and dynamic interactions. Additionally, we present an innovative framework featuring an adaptive temporal module to effectively integrate multimodal streams and capture temporal dependencies. Experimental results validate the challenges posed by MTPChat and demonstrate the effectiveness of our framework in multimodal time-sensitive scenarios.
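To make the Temporal Next Response Prediction task concrete, the sketch below shows one plausible way such an instance could be represented: a dialogue context with a timestamp, timestamped persona-memory entries (text plus an optional image reference), and a set of candidate responses. This is an illustration only — the class and field names are hypothetical, not the paper's actual schema — and the helper merely picks the persona memory most recent relative to the conversation time, the kind of implicit temporal cue the task is meant to probe.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional


@dataclass
class MemoryEntry:
    """One persona-memory record: text, a timestamp, and an optional image."""
    text: str
    timestamp: datetime
    image_path: Optional[str] = None  # multimodal memories may carry an image


@dataclass
class TNRPSample:
    """A hypothetical Temporal Next Response Prediction instance."""
    context: List[str]          # dialogue turns so far
    context_time: datetime      # when the conversation takes place
    memories: List[MemoryEntry] # the speaker's time-stamped persona memory
    candidates: List[str]       # candidate next responses to rank
    answer_idx: int             # index of the gold response


def most_recent_memory(sample: TNRPSample) -> MemoryEntry:
    """Return the memory closest in time to, but not after, the dialogue."""
    eligible = [m for m in sample.memories if m.timestamp <= sample.context_time]
    return max(eligible, key=lambda m: m.timestamp)
```

A model for TNRP would then score each candidate against both the context and the temporally relevant memories; the recency filter above is only the simplest possible temporal heuristic, not the paper's adaptive temporal module.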

@article{yang2025_2502.05887,
  title={MTPChat: A Multimodal Time-Aware Persona Dataset for Conversational Agents},
  author={Wanqi Yang and Yanda Li and Meng Fang and Ling Chen},
  journal={arXiv preprint arXiv:2502.05887},
  year={2025}
}