Validation of the PISA 2015 collaborative problem-solving competence measure

Student thesis: PhD

Abstract

Collaborative problem solving (CPS) as a competence has received much attention in the educational literature, especially since the release of the Programme for International Student Assessment (PISA) 2015 results. In PISA 2015, 15-year-olds' competence to work in collaborative settings was assessed across countries. The validity of the PISA 2015 CPS competence measure has been repeatedly questioned, mainly because of the constraints imposed by the computer-based assessment. This thesis critically examines the validity of the CPS competence assessment as operationalised in the PISA 2015 study by analysing student responses to, and reflections on, the PISA 2015 CPS items. A mixed-methods approach was adopted, linking analyses and results from research phases using quantitative and qualitative methodologies. The thesis draws on Messick's (1989) unified validity framework, as well as the educational literature on CPS competence, to investigate validity and to inform the interpretation of test scores. Two systematic literature reviews focused on the conceptualisation and operationalisation of CPS and on the assessment of CPS competence. The first empirical phase involved the use of the Rasch measurement framework to analyse the PISA 2015 dataset for England (a secondary data analysis). Drawing on this largely unused secondary data, this phase undertook a validation based on the multidimensional character of CPS competence. The constructed CPS competence measures were then used as variables in further statistical analyses to evaluate the external and consequential aspects of validity. The second empirical phase involved primary data collection through cognitive interviews with verbal probing, conducted with students from a secondary school in England. By using the released PISA 2015 CPS assessment task in new ways (cognitive interviewing), this phase adds to what the OECD has already reported for PISA. Results suggest that: a) the identification of student response processes revealed limitations to the validity of the CPS task items used; b) the associations of CPS with theoretically relevant variables did not provide sufficient evidence to support the external and structural aspects of validity of the CPS competence measures; and c) several weaknesses were identified in the instrument and in the PISA methodology and reporting, which ultimately undermine its external and consequential validity. The thesis concludes that data derived from the PISA 2015 CPS competence assessment should be treated with caution: test-score interpretation should recognise that the assessment reflects student CPS competence only when working with computer-simulated partners in a restricted assessment environment. The study's implications highlight the importance of evaluation in real-life situations and provide insight into how the use of such instruments in high-stakes testing environments might contribute to the implementation of standardised curricula. Overall, the present study stands as an independent validation of the PISA 2015 CPS competence assessment, identifying threats to validity that weaken extrapolation from the CPS competence assessment to real-world collaboration situations.
Date of Award: 1 Aug 2023
Original language: English
Awarding Institution:
  • The University of Manchester
Supervisors: Julian Williams, Maria Pampaka & Alexandru Cernat

Keywords

  • item response theory
  • conceptualisation
  • cognitive interviewing
  • Rasch model
  • systematic literature review
  • response processes
  • measurement
  • collaborative problem solving
  • PISA
  • student outcomes
  • assessment
  • validity
