Bridging to Action Requires Mixed Methods, Not Only Randomised Control Trials

Research output: Contribution to journal › Article › peer-review


Abstract

Development evaluation refers to evaluating projects and programmes in development contexts. Some evaluations are too narrow: within-discipline impact evaluations are weaker than multidisciplinary, mixed-methods evaluations. A two-step process leads toward profoundly better arguments when assessing the impact of a development intervention. The first step is setting out the arena for discussion, including identifying the various entities in the social, political, cultural and natural environment surrounding the chosen problem. The second step is that, once this arena has been declared, project and triangulation data can be brought to bear on logical arguments, with clear, transparent reasoning leading to a set of conclusions. In this second step we still need scientific methods such as peer review and data. Crucially, however, the impact evaluation process must not rest upon a single data type, such as survey data. It is dangerous and undesirable to have the entire validity of the conclusions resting upon randomised control trials, or even upon a mixture of data types. Different contributions to knowledge exist within the evaluation process, including the interaction of people during action research, ethnography, case-study methods, process tracing and qualitative methods. The cement holding my argument together is that multiple logics are used (retroductive, deductive and inductive in particular). Deductive mathematics should not dominate the evaluation of an intervention, as randomised control trials on their own lend themselves to worrying fallacies about causality; I show this using Boolean fuzzy set logic. An indicator of high-quality development evaluation is the transparent use of multiple logics.
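
The abstract's appeal to Boolean fuzzy set logic can be illustrated with a minimal sketch. The example below is not drawn from the article: it uses hypothetical membership scores and the standard fuzzy-set consistency-of-sufficiency measure (sum of min(x_i, y_i) divided by the sum of x_i) to show how both a condition and its negation can look "sufficient" for an outcome, the kind of causal ambiguity that a single effect estimate cannot resolve on its own.

```python
# Minimal, illustrative sketch (hypothetical data, not the paper's analysis)
# of fuzzy-set reasoning about sufficiency.

def consistency_sufficiency(x, y):
    """Fuzzy-set consistency of 'x is sufficient for y':
    sum(min(x_i, y_i)) / sum(x_i), the standard fsQCA formula."""
    numerator = sum(min(xi, yi) for xi, yi in zip(x, y))
    denominator = sum(x)
    return numerator / denominator if denominator else float("nan")

# Hypothetical fuzzy membership scores for five cases:
# x = degree to which a case received the intervention,
# y = degree to which the outcome is present.
x = [0.9, 0.8, 0.7, 0.2, 0.1]
y = [0.95, 0.85, 0.9, 0.8, 0.9]
not_x = [1 - xi for xi in x]

print(consistency_sufficiency(x, y))      # high: the intervention looks 'sufficient'
print(consistency_sufficiency(not_x, y))  # also high: so does its absence
# When the outcome is widespread, both patterns pass the test, so the data
# alone cannot settle causality without further reasoning and triangulation.
```
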

Original language: English
Pages (from-to): 139-162
Number of pages: 24
Journal: European Journal of Development Research
Volume: 31
Issue number: 2
Early online date: 25 Mar 2019
DOIs
Publication status: Published - 10 Apr 2019

Keywords

  • Comparative case-study research
  • Evaluation
  • Impact evaluation
  • Methodology
  • Mixed-methods
  • Randomised control trials
  • Retroduction

Research Beacons, Institutes and Platforms

  • Cathie Marsh Institute
  • Work and Equalities Institute
