TY - JOUR
T1 - What is the inter-rater agreement of injury classification using the WHO minimum data set for emergency medical teams?
AU - Jafar, Anisa
AU - Sergeant, Jamie
AU - Lecky, Fiona
PY - 2020/1/7
Y1 - 2020/1/7
N2 - Background
In 2017, the WHO produced its first minimum data set (MDS) for emergency medical team (EMT) daily reporting during sudden onset disasters (SODs), following expert consensus. The MDS was deliberately designed to be simple in order to improve the rate of data capture; however, it is new and untested. This study assesses inter-rater agreement between practitioners when coding injuries within the WHO EMT MDS.
Methods
Twenty-five clinical case vignettes were developed, reflecting injuries potentially encountered in a SOD. These were presented online from April to July 2018 to practitioners with experience of, or training in, managing patients in SODs, drawn from the memberships of UK-Med, AUSMAT (Northern Territory) and NZMAT. Practitioners were asked to code the injuries according to the WHO EMT MDS case classifications. Randolph's kappa statistic for free-marginal multi-rater data was calculated for the whole data set, as well as for subgroups, to ascertain inter-rater agreement.
Results
Eighty-six practitioners responded (20.6% response rate), giving >2000 individual case responses. Overall agreement was moderate at 67.9%, with a kappa of 0.59 [CI 0.49, 0.69]. Although the subgroups of paramedics (kappa 0.63 [CI 0.53, 0.72]), doctors (kappa 0.61 [CI 0.52, 0.69]) and those with disaster experience (kappa 0.62 [CI 0.52, 0.71]) suggested slightly higher agreement, the overlapping CIs (including those of the other subgroups) indicate similar, moderate levels of practitioner agreement in classifying injuries according to the MDS categories.
Conclusions
An inter-rater agreement of 0.59 is moderate at best; however, it gives ministries of health (MoHs) some sense of how tightly they may interpret injury data derived from daily reports using the WHO EMT MDS. Furthermore, this kappa is similar to those of established but more complex (and thus more contextually impractical) injury scores. Similar studies, weighted for injury likelihood using sample data from SODs, would further refine the expected level of inter-rater agreement.
AB - Background
In 2017, the WHO produced its first minimum data set (MDS) for emergency medical team (EMT) daily reporting during sudden onset disasters (SODs), following expert consensus. The MDS was deliberately designed to be simple in order to improve the rate of data capture; however, it is new and untested. This study assesses inter-rater agreement between practitioners when coding injuries within the WHO EMT MDS.
Methods
Twenty-five clinical case vignettes were developed, reflecting injuries potentially encountered in a SOD. These were presented online from April to July 2018 to practitioners with experience of, or training in, managing patients in SODs, drawn from the memberships of UK-Med, AUSMAT (Northern Territory) and NZMAT. Practitioners were asked to code the injuries according to the WHO EMT MDS case classifications. Randolph's kappa statistic for free-marginal multi-rater data was calculated for the whole data set, as well as for subgroups, to ascertain inter-rater agreement.
Results
Eighty-six practitioners responded (20.6% response rate), giving >2000 individual case responses. Overall agreement was moderate at 67.9%, with a kappa of 0.59 [CI 0.49, 0.69]. Although the subgroups of paramedics (kappa 0.63 [CI 0.53, 0.72]), doctors (kappa 0.61 [CI 0.52, 0.69]) and those with disaster experience (kappa 0.62 [CI 0.52, 0.71]) suggested slightly higher agreement, the overlapping CIs (including those of the other subgroups) indicate similar, moderate levels of practitioner agreement in classifying injuries according to the MDS categories.
Conclusions
An inter-rater agreement of 0.59 is moderate at best; however, it gives ministries of health (MoHs) some sense of how tightly they may interpret injury data derived from daily reports using the WHO EMT MDS. Furthermore, this kappa is similar to those of established but more complex (and thus more contextually impractical) injury scores. Similar studies, weighted for injury likelihood using sample data from SODs, would further refine the expected level of inter-rater agreement.
KW - data management
KW - disaster planning and response
KW - global health
U2 - 10.1136/emermed-2019-209012
DO - 10.1136/emermed-2019-209012
M3 - Article
SN - 1472-0205
VL - 37
SP - 58
EP - 64
JO - Emergency Medicine Journal
JF - Emergency Medicine Journal
IS - 2
ER -