
LightNER: A Lightweight Generative Framework with Prompt-guided Attention for Low-resource NER

International Conference on Computational Linguistics (COLING), 2022
Ningyu Zhang
Lei Li
Xin Xie
Chuanqi Tan
Luo Si
Huajun Chen
Main: 8 pages
5 figures
Bibliography: 4 pages
8 tables
Appendix: 2 pages
Abstract

NER in low-resource languages or domains suffers from inadequate training data. Existing transfer learning approaches for low-resource NER typically face the challenge that the target domain has a different label set from the resource-rich source domain, which can be characterized as class transfer and domain transfer problems. In this paper, we propose LightNER, a lightweight generative framework with prompt-guided attention for low-resource NER, to address these issues. Concretely, instead of training label-specific discriminative classifiers, we reformulate sequence labeling as generating an entity pointer index sequence together with entity categories, without any label-specific classifiers, which addresses the class transfer issue. We further propose prompt-guided attention, which incorporates continuous prompts into the self-attention layer to re-modulate the attention and adapt the pre-trained weights. Note that we tune only these continuous prompts while keeping all parameters of the pre-trained language model fixed, making our approach lightweight and flexible for low-resource scenarios and better able to transfer knowledge across domains. Experimental results show that, by tuning only 0.16% of the parameters, LightNER obtains comparable performance in the standard setting and outperforms standard sequence labeling and prototype-based methods in low-resource settings.
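
The core idea of prompt-guided attention is to inject a small set of learnable continuous prompt vectors into the self-attention layer as extra keys and values, so that attention over the input is re-modulated while the pre-trained projections stay frozen and only the prompts are updated. The PyTorch sketch below illustrates this idea; it is a minimal sketch, not the authors' released implementation, and the module name `PromptGuidedAttention` and hyperparameters `d_model`, `n_heads`, and `prompt_len` are illustrative assumptions.

```python
# Minimal sketch of prompt-guided attention (hypothetical names; not the official code).
import torch
import torch.nn as nn


class PromptGuidedAttention(nn.Module):
    """Self-attention whose keys/values are extended with learnable continuous
    prompts, re-modulating attention while the pre-trained projections stay frozen."""

    def __init__(self, d_model: int = 768, n_heads: int = 12, prompt_len: int = 10):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        # Pre-trained projections (loaded from the PLM in practice); kept frozen.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        for proj in (self.q_proj, self.k_proj, self.v_proj):
            proj.requires_grad_(False)
        # Continuous prompts are the only trainable parameters in this layer.
        self.prompt_k = nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)
        self.prompt_v = nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        bsz, seq_len, d_model = x.shape
        q = self.q_proj(x)
        # Prepend prompt keys/values to the token keys/values before attention.
        k = torch.cat([self.prompt_k.expand(bsz, -1, -1), self.k_proj(x)], dim=1)
        v = torch.cat([self.prompt_v.expand(bsz, -1, -1), self.v_proj(x)], dim=1)

        def split(t):  # (bsz, len, d_model) -> (bsz, n_heads, len, d_head)
            return t.view(bsz, -1, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v = split(q), split(k), split(v)
        scores = (q @ k.transpose(-2, -1)) / self.d_head ** 0.5
        out = torch.softmax(scores, dim=-1) @ v
        return out.transpose(1, 2).reshape(bsz, seq_len, d_model)
```

Because only `prompt_k` and `prompt_v` carry gradients, the trainable parameter count stays tiny relative to the frozen backbone, which matches the paper's lightweight-tuning claim in spirit.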
