A Comparative Study of Spatio-Temporal U-Nets for Tissue Segmentation in Surgical Robotics

Aleks Attanasio*, Chiara Alberti, Bruno Scaglioni, Nils Marahrens, Alejandro F. Frangi, Matteo Leonetti, Chandra Shekhar Biyani, Elena De Momi, Pietro Valdastri

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In surgical robotics, the ability to achieve high levels of autonomy is often limited by the complexity of the surgical scene. Autonomous interaction with soft tissues requires machines able to examine and understand endoscopic video streams in real time and identify the features of interest. In this work, we show the first example of spatio-temporal neural networks, based on the U-Net, aimed at segmenting soft tissues in endoscopic images. The networks, equipped with Long Short-Term Memory (LSTM) and Attention Gate cells, can extract the correlation between consecutive frames in an endoscopic video stream, thus enhancing segmentation accuracy with respect to the standard U-Net. Initially, three configurations of the spatio-temporal layers are compared to select the best architecture. Afterwards, the parameters of the network are optimised, and finally the results are compared with the standard U-Net. An accuracy of 83.77% ± 2.18% and a precision of 78.42% ± 7.38% are achieved by implementing both LSTM convolutional layers and Attention Gate blocks. The results, although originating in the context of surgical tissue retraction, could benefit many autonomous tasks such as ablation, suturing and debridement.
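For readers unfamiliar with the two building blocks named in the abstract, the sketch below illustrates, in PyTorch, a generic Attention Gate (in the spirit of Attention U-Net) and a convolutional LSTM cell of the kind used to propagate information across consecutive frames. This is a minimal illustrative sketch only; the class names, channel sizes, and wiring are assumptions and do not reproduce the exact configuration reported in the paper.

```python
# Hedged sketch: generic Attention Gate and ConvLSTM cell in PyTorch.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn


class AttentionGate(nn.Module):
    """Re-weights skip-connection features x using a gating signal g."""

    def __init__(self, in_channels, gating_channels, inter_channels):
        super().__init__()
        # Project both inputs to a common intermediate channel size
        self.theta_x = nn.Conv2d(in_channels, inter_channels, kernel_size=1)
        self.phi_g = nn.Conv2d(gating_channels, inter_channels, kernel_size=1)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x, g):
        # Additive attention: alpha = sigmoid(psi(relu(theta(x) + phi(g))))
        attn = self.relu(self.theta_x(x) + self.phi_g(g))
        alpha = self.sigmoid(self.psi(attn))   # (B, 1, H, W) attention map
        return x * alpha                       # gated skip features


class ConvLSTMCell(nn.Module):
    """Convolutional LSTM cell: carries a spatial hidden state across frames."""

    def __init__(self, in_channels, hidden_channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2
        # One convolution produces the four LSTM gates at once
        self.gates = nn.Conv2d(in_channels + hidden_channels,
                               4 * hidden_channels, kernel_size, padding=padding)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)
        h = o * torch.tanh(c)
        return h, (h, c)


if __name__ == "__main__":
    # Toy forward pass on random tensors to check shapes
    x = torch.randn(2, 64, 32, 32)    # skip-connection features
    g = torch.randn(2, 128, 32, 32)   # gating signal (assumed already upsampled)
    print(AttentionGate(64, 128, 32)(x, g).shape)   # torch.Size([2, 64, 32, 32])

    lstm = ConvLSTMCell(64, 64)
    h = torch.zeros(2, 64, 32, 32)
    c = torch.zeros(2, 64, 32, 32)
    out, _ = lstm(x, (h, c))
    print(out.shape)                                # torch.Size([2, 64, 32, 32])
```

In a spatio-temporal U-Net of the kind the paper compares, blocks like these would typically sit at the skip connections and/or the bottleneck, so that decoder features are both temporally informed and attention-gated; where exactly they are placed is one of the design choices the study evaluates.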

Original language: English
Article number: 9335948
Pages (from-to): 53-63
Number of pages: 11
Journal: IEEE Transactions on Medical Robotics and Bionics
Volume: 3
Issue number: 1
DOIs
Publication status: Published - Feb 2021

Keywords

  • computer-assisted interventions
  • medical robotics
  • minimally invasive surgery
  • surgical vision
