
Abstract

Purpose: Video-based intra-abdominal instrument tracking for laparoscopic surgery is a common research area. However, tracking is limited to instruments that are actually visible in the laparoscopic image. By using extra-abdominal cameras to detect trocars and classify their occupancy state, additional information about instrument location, i.e. whether an instrument is still inside the abdomen or not, can be obtained. This can enhance laparoscopic workflow understanding and enrich existing intra-abdominal solutions.

Methods: A data set of four laparoscopic surgeries was recorded with two time-synchronized extra-abdominal 2D cameras. The preprocessed and annotated data were used to train a deep-learning-based architecture consisting of a trocar detector, a centroid tracker, and a temporal model that provides the occupancy state of all trocars throughout the surgery.

Results: The trocar detection model achieves an F1 score of $$95.06 \pm 0.88\%$$. The prediction of the occupancy state yields an F1 score of $$89.29 \pm 5.29\%$$, providing a first step towards enhanced surgical workflow understanding.

Conclusion: The current method shows promising results for the extra-abdominal tracking of trocars and their occupancy state. Future advancements include enlarging the data set and incorporating intra-abdominal imaging to facilitate accurate assignment of instruments to trocars.
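The centroid tracker mentioned in the Methods can be illustrated with a minimal sketch: detections in each frame are matched to tracked trocars by nearest-centroid distance, so that each trocar keeps a stable ID across frames. This is an assumption-laden illustration of the general technique, not the authors' implementation; the class name, the greedy matching strategy, and the `max_distance` threshold are all hypothetical choices.

```python
import math


class CentroidTracker:
    """Assigns persistent IDs to detections by nearest-centroid matching
    between consecutive frames (illustrative sketch only)."""

    def __init__(self, max_distance=50.0):
        self.max_distance = max_distance  # max plausible pixel shift per frame
        self.next_id = 0
        self.objects = {}  # id -> (x, y) centroid from the previous frame

    def update(self, centroids):
        """Match new frame centroids to existing IDs; register the rest as new."""
        assigned = {}
        unmatched = list(centroids)
        # Greedily match each tracked object to its nearest new centroid.
        for oid, (ox, oy) in self.objects.items():
            if not unmatched:
                break
            best = min(unmatched, key=lambda c: math.hypot(c[0] - ox, c[1] - oy))
            if math.hypot(best[0] - ox, best[1] - oy) <= self.max_distance:
                assigned[oid] = best
                unmatched.remove(best)
        # Remaining detections become new tracked objects (e.g. a new trocar).
        for c in unmatched:
            assigned[self.next_id] = c
            self.next_id += 1
        self.objects = assigned
        return assigned
```

In practice the per-trocar ID sequence produced this way would feed the downstream temporal model, which smooths frame-level occupancy predictions over time.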
Keywords: Trocar detection; Surgical workflow analysis; Surgical context awareness; Deep learning

MeSH terms: Surgical Instruments; Laparoscopy/instrumentation; Laparoscopy/methods; Video Recording/instrumentation; Workflow; Deep Learning; Humans
