EXPLANATION THROUGH DIALOGUE FOR RULE-BASED REASONING AI SYSTEMS

Student thesis: PhD

Abstract

Explainable Artificial Intelligence (XAI) systems have garnered increasing attention across domains such as healthcare, finance, and robotics, where transparency and user trust are critical. Despite substantial work in the XAI field, most existing methods fail to capture the dynamic nature of human-machine communication and have not been evaluated through human experiments. This thesis develops Dialogue Explanation Theory (DET), which incorporates interactive, conversational processes to deliver explanations, addressing how to achieve, improve, and measure users’ comprehension according to their individual requirements and levels of understanding. By integrating this theory with rule-based reasoning systems, this research not only enriches the intended user’s understanding of effective explanations in rule-based AI systems but also extends its application to interpretable machine learning models. The approach has been implemented from scratch in rule-based reasoning AI systems and evaluated in several real-user studies. The findings suggest that DET can improve users’ understanding of the reasoning behind a conclusion by letting them ask “Why?” and “Why not?” questions. It also gives machines the capacity to explain their actions, thereby enhancing transparency and trust in such systems. This thesis underscores the importance of interaction in XAI and sets the groundwork for future explorations into user-focused AI explanations.
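
To illustrate the kind of “Why?” and “Why not?” dialogue the abstract describes, the following Python sketch shows a minimal forward-chaining rule engine that records which rule derived each fact and can then explain derived conclusions or report which premises are missing. This is an illustrative assumption, not the thesis’s actual implementation; all rule names and facts (R1, flu_suspected, etc.) are invented for the example.

# Minimal illustrative sketch (not the thesis's implementation) of a
# rule-based reasoner that answers "Why?" and "Why not?" questions.
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    premises: frozenset   # facts that must all hold
    conclusion: str       # fact derived when they do

def forward_chain(rules, facts):
    """Derive all reachable facts, remembering which rule produced each one."""
    derived_by = {}  # fact -> Rule that derived it
    changed = True
    while changed:
        changed = False
        for rule in rules:
            if rule.conclusion not in facts and rule.premises <= facts:
                facts = facts | {rule.conclusion}
                derived_by[rule.conclusion] = rule
                changed = True
    return facts, derived_by

def explain_why(fact, facts, derived_by):
    """Answer 'Why <fact>?' by citing the rule that derived it."""
    if fact not in facts:
        return f"{fact} does not hold."
    rule = derived_by.get(fact)
    if rule is None:
        return f"{fact} was given as an initial fact."
    premises = ", ".join(sorted(rule.premises))
    return f"{fact} holds because rule '{rule.name}' fired: {premises} => {fact}."

def explain_why_not(fact, facts, rules):
    """Answer 'Why not <fact>?' by listing the unsatisfied premises of each rule for it."""
    reasons = []
    for rule in rules:
        if rule.conclusion == fact:
            missing = sorted(rule.premises - facts)
            if missing:
                reasons.append(f"rule '{rule.name}' still needs {', '.join(missing)}")
    if not reasons:
        return f"No rule concludes {fact}."
    return f"{fact} could not be derived: " + "; ".join(reasons)

if __name__ == "__main__":
    rules = [
        Rule("R1", frozenset({"fever", "cough"}), "flu_suspected"),
        Rule("R2", frozenset({"flu_suspected", "test_positive"}), "flu_confirmed"),
    ]
    facts, derived_by = forward_chain(rules, frozenset({"fever", "cough"}))
    print(explain_why("flu_suspected", facts, derived_by))   # answers "Why?"
    print(explain_why_not("flu_confirmed", facts, rules))    # answers "Why not?"

In a dialogue setting, answers like these would be delivered turn by turn, with the user free to follow up on any cited premise.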
Date of Award: 6 Jan 2025
Original language: English
Awarding Institution
  • The University of Manchester
Supervisors: Louise Dennis, Joe Collenette & Clare Dixon

Keywords

  • Artificial Intelligence
  • Rule-based
  • Dialogue for Explanation
  • Explainable Artificial Intelligence
  • Machine Reasoning
