TY - JOUR
T1 - Objective Analysis of Neck Muscle Boundaries for Cervical Dystonia Using Ultrasound Imaging and Deep Learning
AU - Loram, Ian
AU - Siddique, Abdul
AU - Sanchez, Maria B
AU - Harding, Pete
AU - Silverdale, Monty
AU - Kobylecki, Christopher
AU - Cunningham, Ryan
PY - 2020/4/9
Y1 - 2020/4/9
N2 - OBJECTIVE: To provide objective visualization and pattern analysis of neck muscle boundaries to inform and monitor treatment of cervical dystonia. METHODS: We recorded transverse cervical ultrasound (US) images and whole-body motion analysis of sixty-one standing participants (35 with cervical dystonia, 26 age-matched controls). We manually annotated 3,272 US images sampling posture and the functional range of pitch, yaw, and roll head movements. Using previously validated methods, we used 60-fold cross-validation to train, validate, and test a deep neural network (U-net) to classify pixels into 13 categories (five paired neck muscles, skin, ligamentum nuchae, vertebra). For all participants in their normal standing posture, we segmented US images and classified condition (dystonia/control), sex, and age (higher/lower) from segment boundaries. We performed an explanatory visualization analysis of dystonia muscle boundaries. RESULTS: For all segments, agreement with manual labels was 64 ± 21% (Dice coefficient) and 5.7 ± 4 mm (Hausdorff distance). For deep muscle layers, boundaries predicted central injection sites with an average precision of 94 ± 3%. Using leave-one-out cross-validation, a support vector machine classified condition, sex, and age from predicted muscle boundaries with accuracies of 70.5%, 67.2%, and 52.4%, respectively, exceeding classification by manual labels. From muscle boundaries, dystonia clustered optimally into three sub-groups. These sub-groups are visualized and explained by three eigen-patterns, which correlate significantly with truncal and head posture. CONCLUSION: Using US, neck muscle shape alone discriminates dystonia from healthy controls. SIGNIFICANCE: Using deep learning, US imaging allows online, automated visualization and diagnostic analysis of cervical dystonia, and segmentation of individual muscles for targeted injection.
DO - 10.1109/JBHI.2020.2964098
M3 - Article
C2 - 31940567
SN - 2168-2194
VL - 24
SP - 1016
EP - 1027
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
IS - 4
ER -