An Information-theoretic Multi-task Representation Learning Framework for Natural Language Understanding

6 March 2025
Dou Hu
Lingwei Wei
Wei Zhou
Songlin Hu
Abstract

This paper proposes a principled multi-task representation learning framework (InfoMTL) that extracts noise-invariant, sufficient representations for all tasks. It ensures that the shared representations are sufficient for every task and mitigates the negative effect of redundant features, thereby enhancing the language understanding of pre-trained language models (PLMs) under the multi-task paradigm. First, a shared information maximization principle is proposed to learn more sufficient shared representations for all target tasks, avoiding the insufficiency that arises from representation compression in the multi-task paradigm. Second, a task-specific information minimization principle is designed to mitigate the negative effect of potentially redundant input features for each task; it compresses task-irrelevant information while preserving the information needed to predict each target. Experiments on six classification benchmarks show that our method outperforms 12 comparison multi-task methods under the same multi-task settings, especially in data-constrained and noisy scenarios. Extensive experiments demonstrate that the learned representations are more sufficient, data-efficient, and robust.
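
This page does not include the paper's implementation, but the two principles read like standard information-theoretic surrogates. The sketch below is a hypothetical PyTorch rendering, not InfoMTL's confirmed method: shared information maximization is approximated with an InfoNCE-style contrastive lower bound over two dropout views of the shared representation, and task-specific information minimization with a variational-information-bottleneck-style KL term on stochastic task heads. All module names, dimensions, and loss weights are illustrative assumptions.

# Hypothetical sketch of the two InfoMTL principles as loss terms.
# The paper's actual estimators and architecture are not given on this page.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InfoMTLSketch(nn.Module):
    def __init__(self, hidden_dim=768, shared_dim=256, task_dims=(2, 3)):
        super().__init__()
        self.shared_enc = nn.Linear(hidden_dim, shared_dim)
        # Each task gets a stochastic head (mean and log-variance) so that
        # task-irrelevant information can be compressed via a KL term.
        self.task_heads = nn.ModuleList(
            [nn.Linear(shared_dim, 2 * shared_dim) for _ in task_dims]
        )
        self.classifiers = nn.ModuleList(
            [nn.Linear(shared_dim, n_classes) for n_classes in task_dims]
        )

    def shared_info_max(self, z1, z2, temperature=0.1):
        # InfoNCE-style lower bound on the mutual information between two
        # noisy views of the shared representation; maximizing it pushes the
        # shared representation toward sufficiency for all tasks.
        z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
        logits = z1 @ z2.t() / temperature
        targets = torch.arange(z1.size(0), device=z1.device)
        return F.cross_entropy(logits, targets)

    def forward(self, h, labels):
        # h: PLM sentence representations, shape (batch, hidden_dim)
        # labels: one label tensor per task, each of shape (batch,)
        z = self.shared_enc(h)
        z1 = F.dropout(z, 0.1, training=True)
        z2 = F.dropout(z, 0.1, training=True)
        loss_shared = self.shared_info_max(z1, z2)

        loss_task, loss_compress = 0.0, 0.0
        for head, clf, y in zip(self.task_heads, self.classifiers, labels):
            mu, logvar = head(z).chunk(2, dim=-1)
            # Reparameterized task-specific representation.
            z_t = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
            loss_task = loss_task + F.cross_entropy(clf(z_t), y)
            # VIB-style KL upper bound: compresses task-irrelevant information
            # while the task loss preserves label-relevant information.
            loss_compress = loss_compress + 0.5 * (
                mu.pow(2) + logvar.exp() - 1.0 - logvar
            ).sum(-1).mean()

        # Trade-off weights are placeholders; the page specifies none.
        return loss_task + 0.1 * loss_shared + 1e-3 * loss_compress

In practice the weights on the contrastive and KL terms would have to be tuned per benchmark; the values above are purely illustrative.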

@article{hu2025_2503.04667,
  title={An Information-theoretic Multi-task Representation Learning Framework for Natural Language Understanding},
  author={Dou Hu and Lingwei Wei and Wei Zhou and Songlin Hu},
  journal={arXiv preprint arXiv:2503.04667},
  year={2025}
}