ResearchTrend.AI
Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation

23 August 2022
Xiaohang Tang
Yi Zhou
Danushka Bollegala
Abstract

Dynamic contextualised word embeddings (DCWEs) represent the temporal semantic variations of words. We propose a method for learning DCWEs by time-adapting a pretrained Masked Language Model (MLM) using time-sensitive templates. Given two snapshots $C_1$ and $C_2$ of a corpus taken respectively at two distinct timestamps $T_1$ and $T_2$, we first propose an unsupervised method to select (a) *pivot* terms related to both $C_1$ and $C_2$, and (b) *anchor* terms that are associated with a specific pivot term in each individual snapshot. We then generate prompts by filling manually compiled templates using the extracted pivot and anchor terms. Moreover, we propose an automatic method to learn time-sensitive templates from $C_1$ and $C_2$, without requiring any human supervision. Next, we use the generated prompts to adapt a pretrained MLM to $T_2$ by fine-tuning on those prompts. Multiple experiments show that our proposed method reduces the perplexity of test sentences in $C_2$, outperforming the current state-of-the-art.
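The pipeline the abstract describes — pick pivot terms shared across both snapshots, pick anchor terms that co-occur with each pivot within one snapshot, then fill a template to produce fine-tuning prompts — can be sketched as follows. This is a minimal illustration, not the paper's method: the toy corpora, the frequency-product pivot score, the within-sentence co-occurrence anchor score, and the template string are all assumptions made for the example.

```python
from collections import Counter

# Hypothetical mini-snapshots standing in for corpora C1 (time T1)
# and C2 (time T2); each is a list of tokenised sentences.
C1 = [["the", "cloud", "rain", "storm"], ["cloud", "sky", "rain"]]
C2 = [["the", "cloud", "compute", "server"], ["cloud", "storage", "server"]]

def term_counts(corpus):
    """Count term occurrences over all sentences in a snapshot."""
    return Counter(t for sent in corpus for t in sent)

def select_pivots(c1, c2, k=1):
    """Pivots: terms frequent in BOTH snapshots.
    Simplified criterion (frequency product), not the paper's scoring."""
    f1, f2 = term_counts(c1), term_counts(c2)
    shared = set(f1) & set(f2)
    return sorted(shared, key=lambda t: f1[t] * f2[t], reverse=True)[:k]

def select_anchors(corpus, pivot, k=2):
    """Anchors: terms co-occurring with the pivot within a sentence,
    computed per snapshot so C1- and C2-anchors can differ."""
    co = Counter()
    for sent in corpus:
        if pivot in sent:
            co.update(t for t in sent if t != pivot)
    return [t for t, _ in co.most_common(k)]

# A hypothetical manually compiled template; the paper uses its own
# (and also learns templates automatically).
TEMPLATE = "{pivot} is associated with {a1} as well as {a2}"

def make_prompts(c1, c2):
    """Fill the template with each pivot and one anchor per snapshot."""
    prompts = []
    for pivot in select_pivots(c1, c2):
        a_old = select_anchors(c1, pivot)  # anchor from the T1 snapshot
        a_new = select_anchors(c2, pivot)  # anchor from the T2 snapshot
        prompts.append(TEMPLATE.format(pivot=pivot, a1=a_old[0], a2=a_new[0]))
    return prompts
```

On the toy data, "cloud" is the shared pivot, anchored by "rain" in the first snapshot and "server" in the second, yielding the prompt "cloud is associated with rain as well as server". In the paper, prompts like these are then used to fine-tune a pretrained MLM so it adapts to the $T_2$ snapshot.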
