Memory-Aware Attentive Control for Community Question Answering With Knowledge-Based Dual Refinement

Jinmeng Wu, Tingting Mu, Jeyan Thiyagalingam, John Y. Goulermas

Research output: Contribution to journal › Article › peer-review



Open-domain question answering systems enable a machine to automatically select or generate answers to questions that humans pose on the web in natural-language form. Previous approaches seek effective ways of extracting semantic features between a question and an answer, but the contribution of contextual information to semantic matching remains limited by short-term memory. As an alternative, we propose an internal knowledge-based end-to-end model, enhanced by an attentive memory network, for both answer selection and answer generation, which takes full advantage of the semantics and of multiple facets (i.e., timescales, topics, and context). In detail, we design a long-term memory that learns the top-k fine-grained similarity representations, where two memory-aware mechanisms aggregate series of word-level and sentence-level semantic similarities to supplement the coarse contextual information. Furthermore, we propose a novel memory refinement mechanism with two dimensions of writing heads, which offers an efficient approach to multiview selection of the salient word pairs. In the training stage, we adopt transformer-based transfer learning to pretrain the model effectively. Experimentally, we compare against state-of-the-art approaches on four public datasets; the results show that the proposed model achieves competitive performance.
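The paper's full memory-aware architecture is not reproduced on this page, but the core idea the abstract describes — retaining only the top-k word-level similarities between a question and a candidate answer and aggregating them with attention — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `topk_similarity_features`, the use of cosine similarity, and all shapes are assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def topk_similarity_features(Q, A, k=3):
    """For each question word, keep its k most similar answer words
    and aggregate them into one attention-weighted context vector.

    Q: (m, d) question word embeddings; A: (n, d) answer word embeddings.
    Returns an (m, d) matrix of aggregated answer context per question word.
    """
    # Cosine similarity between every question/answer word pair.
    Qn = Q / np.linalg.norm(Q, axis=1, keepdims=True)
    An = A / np.linalg.norm(A, axis=1, keepdims=True)
    S = Qn @ An.T                                   # (m, n) similarities

    # Retain only the top-k answer words per question word.
    idx = np.argsort(S, axis=1)[:, -k:]             # (m, k) indices
    top_sims = np.take_along_axis(S, idx, axis=1)   # (m, k) scores

    # Attention weights over the retained pairs, then aggregate.
    w = softmax(top_sims, axis=1)                   # (m, k)
    ctx = np.einsum('mk,mkd->md', w, A[idx])        # (m, d)
    return ctx

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ctx = topk_similarity_features(rng.standard_normal((4, 8)),
                                   rng.standard_normal((6, 8)), k=3)
    print(ctx.shape)
```

A sentence-level analogue would apply the same top-k filtering to similarities between pooled sentence embeddings; the paper combines both granularities through its memory-aware mechanisms.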
Original language: English
Pages (from-to): 1-14
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Early online date: 24 Jan 2023
Publication status: E-pub ahead of print - 24 Jan 2023


