One-shot to Weakly-Supervised Relation Classification using Language Models
Thy Thy Tran, Phong Le, Sophia Ananiadou.
TL;DR
We propose NoelA, an auto-encoder relation classifier using a noisy channel, which improves accuracy by learning from the matching predictions.

Abstract

Relation classification aims at detecting a particular relation type between two entities in text; most existing methods require annotated data. Data annotation is either a manual process, as in supervised learning, or an automated one that uses knowledge bases, as in distant learning. Unfortunately, both annotation methodologies are costly and time-consuming since they depend on intensive human labour, either for annotation or for knowledge base creation. Motivated by recent evidence that language models capture relational facts much like knowledge bases do, one-shot relation classification using language models has been proposed, in which a given instance is matched against relation examples. The only requirement is that each relation type is associated with an exemplar. However, this matching approach often yields incorrect predictions. In this work, we propose NoelA, an auto-encoder using a noisy channel, which improves accuracy by learning from the noisy matching predictions. NoelA outperforms BERT matching and a bootstrapping baseline on TACRED and reWiki80.
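To make the one-shot matching baseline concrete, below is a minimal sketch of exemplar matching with a frozen BERT encoder: each relation type has a single exemplar sentence, and a test instance receives the relation whose exemplar embedding is most similar. The function names, the mean-pooled sentence representation, and the cosine similarity choice are illustrative assumptions, not the paper's exact setup; the predictions such a matcher produces are the noisy labels from which NoelA would then learn.

```python
# Sketch of one-shot relation classification by exemplar matching with BERT.
# Assumptions (not from the paper): mean-pooled token embeddings, cosine similarity.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
encoder.eval()

def embed(sentences):
    """Mean-pool BERT token embeddings into one vector per sentence."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state      # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)         # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)          # (B, H)

def match(instance, exemplars):
    """Return the relation whose exemplar is closest to the instance."""
    relations = list(exemplars)
    sims = torch.nn.functional.cosine_similarity(
        embed([instance]), embed([exemplars[r] for r in relations])
    )
    return relations[int(sims.argmax())]

# Hypothetical exemplars, one per relation type.
exemplars = {
    "per:employee_of": "Alice works for Acme Corp.",
    "per:city_of_birth": "Bob was born in Paris.",
}
print(match("Carol joined Globex as an engineer last year.", exemplars))
```

Running the matcher over an unlabelled corpus yields one (possibly incorrect) label per instance; treating those labels as a noisy channel over the true relations is what allows a classifier to be trained on them without any manual annotation.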
Citation
@inproceedings{tran2021oneshot,
  title     = {One-shot to Weakly-Supervised Relation Classification using Language Models},
  author    = {Thy Thy Tran and Phong Le and Sophia Ananiadou},
  booktitle = {3rd Conference on Automated Knowledge Base Construction},
  year      = {2021},
  url       = {https://openreview.net/forum?id=W0mr06PxTHp},
  doi       = {10.24432/C5ZW29}
}