Combining Analogy with Language Models for Knowledge Extraction

Danilo Neves Ribeiro, Kenneth Forbus

doi:10.24432/C5KK5X

TL;DR

Combines language models with analogical learning to extract commonsense facts from web text using only a few training examples.
Abstract

Learning structured knowledge from natural language text has been a long-standing challenge. Previous work has focused on specific domains, mostly extracting knowledge about named entities (e.g., countries, companies, or persons) rather than general-purpose world knowledge (e.g., information about science or everyday objects). In this paper we combine the Companion Cognitive Architecture with the BERT language model to extract structured knowledge from text, with the goal of automatically inferring missing commonsense facts from an existing knowledge base. Using the principles of distant supervision, the system learns functions called query cases that map statements expressed in natural language into knowledge base relations. Afterwards, the system uses these query cases to extract structured knowledge via analogical reasoning. We run experiments on 2,679 Simple English Wikipedia articles, where the system learns high-precision facts about a variety of subjects from a few training examples, outperforming strong baselines.
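To make the pipeline described in the abstract concrete, the sketch below illustrates the general idea under simplifying assumptions: distant supervision pairs sentences with known knowledge base facts to form cases, and a nearest-neighbor lookup over sentence representations stands in for analogical retrieval. This is not the authors' implementation; the names (embed, QueryCase, predict_relation), the toy bag-of-words encoding (the paper uses BERT), and the example data are all hypothetical.

# Minimal sketch, assuming a tiny in-memory KB and toy sentence encodings.
from collections import Counter
from dataclasses import dataclass
import math

def embed(sentence: str) -> Counter:
    # Toy bag-of-words "embedding"; the paper instead uses BERT encodings.
    return Counter(sentence.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

@dataclass
class QueryCase:
    sentence: str      # natural language statement
    relation: str      # KB relation the statement is assumed to express
    arguments: tuple   # entities filling the relation's argument slots

# Distant supervision: a sentence mentioning both arguments of a known
# KB fact is assumed to express that fact's relation.
kb_facts = [("isa", ("Saturn", "Planet")),
            ("capitalOf", ("Paris", "France"))]
corpus = ["Saturn is the sixth planet from the Sun.",
          "Paris is the capital and largest city of France."]

cases = [QueryCase(s, rel, args)
         for s in corpus
         for rel, args in kb_facts
         if all(a.lower() in s.lower() for a in args)]

def predict_relation(sentence: str) -> str:
    # Retrieve the most similar stored case and reuse its relation,
    # a crude stand-in for analogical retrieval over query cases.
    best = max(cases, key=lambda c: cosine(embed(sentence), embed(c.sentence)))
    return best.relation

print(predict_relation("Neptune is the eighth planet from the Sun."))  # -> isa

In the actual system, the cases and the mapping from text to relations are handled by the Companion Cognitive Architecture's analogical reasoning over structured representations, not by cosine similarity over word counts; the sketch only conveys the learn-from-a-few-examples, retrieve-and-reuse flow.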

Citation

@inproceedings{
ribeiro2021combining,
title={Combining Analogy with Language Models for Knowledge Extraction},
author={Danilo Neves Ribeiro and Kenneth Forbus},
booktitle={3rd Conference on Automated Knowledge Base Construction},
year={2021},
url={https://openreview.net/forum?id=4TpJpZ-_gyl},
doi={10.24432/C5KK5X}
}