MKGL: Mastery of a Three-Word Language

Lingbing Guo, Zhongpu Bo, Zhuo Chen, Yichi Zhang, Jiaoyan Chen, Yarong Lan, Yangyifei Luo, Qian Li, Qiang Zhang, Wen Zhang, Huajun Chen

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

Large language models (LLMs) have significantly advanced performance across a spectrum of natural language processing (NLP) tasks. Yet their application to knowledge graphs (KGs), which describe facts in the form of triplets and leave little room for hallucination, remains an underexplored frontier. In this paper, we investigate the integration of LLMs with KGs by introducing a specialized KG Language (KGL), in which a sentence consists of exactly three words: an entity noun, a relation verb, and another entity noun. Although KGL's vocabulary is unfamiliar to the LLM, we facilitate its learning through a tailored dictionary and illustrative sentences, and enhance context understanding via real-time KG context retrieval and KGL token embedding augmentation. Our results reveal that LLMs can achieve fluency in KGL, drastically reducing errors on KG completion compared to conventional KG embedding methods. Furthermore, our enhanced LLM shows exceptional competence in generating accurate three-word sentences from an initial entity and in interpreting new, unseen terms outside the KG.
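As a minimal illustration of the KGL format described in the abstract, the sketch below serializes KG triplets into three-word sentences and assembles a small dictionary prompt for unfamiliar tokens. The function names, token identifiers, and glosses are hypothetical placeholders, not taken from the paper:

```python
# Hypothetical sketch of the KGL idea: each KG triplet
# (head entity, relation, tail entity) becomes one three-word sentence,
# and unfamiliar KGL tokens are glossed via a tailored dictionary.
# All names and entries below are illustrative, not from the MKGL paper.

from typing import Iterable, List, Tuple

Triplet = Tuple[str, str, str]  # (head entity, relation verb, tail entity)

# Toy "tailored dictionary": maps each KGL token, which is unfamiliar to
# the LLM, to a short natural-language gloss it can learn from.
KGL_DICTIONARY = {
    "q_paris": "the city of Paris",
    "r_capital_of": "is the capital of",
    "q_france": "the country France",
}

def to_kgl_sentence(triplet: Triplet) -> str:
    """Render one triplet as a three-word KGL sentence."""
    head, relation, tail = triplet
    return f"{head} {relation} {tail}"

def dictionary_prompt(tokens: Iterable[str]) -> str:
    """Build an illustrative dictionary prompt for the given KGL tokens."""
    lines: List[str] = []
    for token in tokens:
        gloss = KGL_DICTIONARY.get(token, "an unknown term")
        lines.append(f"{token}: {gloss}")
    return "\n".join(lines)

if __name__ == "__main__":
    triple: Triplet = ("q_paris", "r_capital_of", "q_france")
    print(to_kgl_sentence(triple))   # q_paris r_capital_of q_france
    print(dictionary_prompt(triple)) # one gloss line per KGL token
```

In the paper's setup, such sentences and dictionary entries would be combined with real-time KG context retrieval and token embedding augmentation; this sketch covers only the surface serialization step.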
Original language: English
Title of host publication: 38th Conference on Neural Information Processing Systems (NeurIPS 2024)
Publication status: Accepted/In press - 25 Sept 2024
