Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision

Zhen Wan
Fei Cheng
Qianying Liu
Zhuoyuan Mao
Haiyue Song
Sadao Kurohashi
Abstract

Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method that leverages supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach over two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL
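The core idea of instance weighting can be illustrated with a small sketch. The snippet below is a hypothetical supervised contrastive (InfoNCE-style) loss in which each anchor's loss term is scaled by a per-instance reliability weight, so that instances with noisy distant-supervision labels contribute less to pre-training. The function name, the weighting scheme, and all tensor shapes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def weighted_contrastive_loss(embeddings, labels, weights, temperature=0.1):
    """Illustrative weighted supervised contrastive loss (assumption, not
    the paper's exact method): each anchor's InfoNCE term is scaled by a
    reliability weight in [0, 1] estimated for that pre-training instance."""
    # L2-normalize embeddings so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                      # pairwise similarity logits
    n = len(labels)
    np.fill_diagonal(sim, -np.inf)                   # exclude self-pairs
    # numerically stable row-wise log-softmax over candidates
    m = sim.max(axis=1, keepdims=True)
    log_prob = sim - m - np.log(np.exp(sim - m).sum(axis=1, keepdims=True))
    # positives: other instances sharing the same (distant) relation label
    pos = (labels[:, None] == labels[None, :]) & ~np.eye(n, dtype=bool)
    pos_counts = np.maximum(pos.sum(axis=1), 1)
    anchor_loss = -np.where(pos, log_prob, 0.0).sum(axis=1) / pos_counts
    # reliability-weighted average: noisy anchors are down-weighted
    return float((weights * anchor_loss).sum() / weights.sum())
```

With uniform weights this reduces to an ordinary supervised contrastive loss; lowering a weight shrinks that instance's influence on the gradient, which is the mechanism the abstract describes for suppressing distant-supervision noise.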
