Reasons-Responsive Machine Compatibilism: A New Pathway for Analysis of Autonomous Systems and Moral Responsibility Gaps

  • Sarah Christensen

Student thesis: PhD

Abstract

Matthias (2004) argues that the use of autonomous systems leads to 'moral responsibility gaps': cases involving the outputs of learning autonomous systems in which seemingly no human agent is responsible. In this thesis, I investigate the notion of a morally responsible autonomous system in order to propose a solution to moral responsibility gap problems. I argue that contemporary and near-future autonomous systems can be considered manipulated reasons-responsive entities, and I conclude that the moral responsibility for such systems' outputs and their immediate consequences can be traced back to their users. The thesis centres on three pivotal ideas. The first is to treat non-malfunctioning autonomous systems as potential moral agents rather than mere tools. This change in perspective allows one to raise questions about the potential status of autonomous systems, both current and future, as morally responsible entities, and to investigate which conditions for moral responsibility autonomous systems might be able to fulfil and which they cannot. The second is the concept of Machine Compatibilism. The thesis shows that the current literature rejects the very idea of morally responsible autonomous systems by assuming that moral responsibility is incompatible with the systems' determined nature; Machine Compatibilism is introduced in response. The thesis develops a machine-focused version of Fischer and Ravizza's (1998) reasons-responsive compatibilism as an example of a promising machine compatibilist account. The third is the identification of current and near-future autonomous systems as manipulated reasons-responsive entities. Together with the machine compatibilist account developed earlier, this provides a framework for analysing moral responsibility gap problems and for bridging the moral responsibility gap.
Date of Award: 1 Aug 2023
Original language: English
Awarding Institution
  • The University of Manchester
Supervisor: Ann Whittle

Keywords

  • Philosophy of Artificial Intelligence
  • Philosophy
  • Machine Ethics
  • Machine Compatibilism
  • Responsibility Gap
