Explainable Artificial Intelligence (XAI) systems have garnered increasing attention across domains such as healthcare, finance, and robotics, where transparency and user trust are critical. Despite substantial work in the XAI field, most existing methods fail to capture the dynamic nature of human-machine communication and have not been evaluated in human experiments. This thesis develops Dialogue Explanation Theory (DET), which delivers explanations through interactive, conversational processes, addressing how to achieve, improve, and measure users' comprehension according to their individual requirements and levels of understanding. By integrating this theory with rule-based reasoning systems, this research not only enriches the intended user's understanding of effective explanations in rule-based AI systems but also extends its application to interpretable machine learning models. The approach was implemented from scratch in rule-based reasoning AI systems and evaluated through several real-user studies. The findings suggest that DET can improve users' understanding of the reasoning behind a conclusion by letting them ask "Why?" and "Why not?" questions. It also gives machines the capacity to explain their actions, thereby enhancing transparency and trust in such systems. This thesis underscores the importance of interaction in XAI and sets the groundwork for future explorations into user-focused AI explanations.
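To make the "Why?" and "Why not?" dialogue concrete, the sketch below shows a minimal forward-chaining rule engine that records which rule derived each fact, so it can answer "Why?" with a derivation trace and "Why not?" with the unsatisfied premises of the rules that could have fired. This is an illustrative assumption, not the system built in the thesis; the `Rule` and `Reasoner` names, their methods, and the example knowledge base are all hypothetical.

```python
# Minimal sketch of dialogue-style "Why?" / "Why not?" explanations over a
# forward-chaining rule engine. Illustrative only; not the thesis's implementation.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Rule:
    name: str
    premises: tuple   # facts that must all hold for the rule to fire
    conclusion: str   # fact derived when they do


@dataclass
class Reasoner:
    rules: list
    facts: set = field(default_factory=set)
    support: dict = field(default_factory=dict)  # conclusion -> rule that derived it

    def run(self):
        # Forward-chain to a fixed point, remembering which rule derived each fact.
        changed = True
        while changed:
            changed = False
            for rule in self.rules:
                if rule.conclusion not in self.facts and all(
                    p in self.facts for p in rule.premises
                ):
                    self.facts.add(rule.conclusion)
                    self.support[rule.conclusion] = rule
                    changed = True

    def why(self, fact):
        # Answer "Why <fact>?" by citing the rule that fired and, recursively,
        # the support for each of its premises.
        if fact not in self.facts:
            return f"{fact} was not concluded."
        rule = self.support.get(fact)
        if rule is None:
            return f"{fact} was given as an initial fact."
        because = "; ".join(self.why(p) for p in rule.premises)
        return f"{fact} holds by rule {rule.name} because: {because}"

    def why_not(self, fact):
        # Answer "Why not <fact>?" by listing, for each rule that could conclude
        # the fact, the premises that did not hold.
        if fact in self.facts:
            return f"{fact} was in fact concluded."
        reasons = []
        for rule in self.rules:
            if rule.conclusion == fact:
                missing = [p for p in rule.premises if p not in self.facts]
                reasons.append(f"rule {rule.name} needs {missing}")
        return f"{fact} does not hold: " + ("; ".join(reasons) or "no rule concludes it.")


# Hypothetical knowledge base, purely for illustration.
rules = [
    Rule("R1", ("bird", "not_penguin"), "can_fly"),
    Rule("R2", ("can_fly",), "can_migrate"),
]
r = Reasoner(rules, facts={"bird"})
r.run()
print(r.why_not("can_migrate"))
# -> can_migrate does not hold: rule R2 needs ['can_fly']
```

Recording the supporting rule for each derived fact is what turns the reasoner's conclusion into a dialogue move: the same trace answers follow-up "Why?" questions at whatever depth the user asks.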
Date of Award | 6 Jan 2025
Original language | English
Awarding Institution | The University of Manchester
Supervisor | Louise Dennis (Supervisor), Joe Collenette (Supervisor) & Clare Dixon (Supervisor)
- Artificial Intelligence
- Rule-based
- Dialogue for Explanation
- Explainable Artificial Intelligence
- Machine Reasoning
EXPLANATION THROUGH DIALOGUE FOR RULE-BASED REASONING AI SYSTEMS
Xu, Y. (Author). 6 Jan 2025
Student thesis: PhD