
pmid: 38613819
Objective: To construct a convolutional neural network (CNN) model that can recognize and delineate anatomic structures on intraoperative video frames of robot-assisted radical prostatectomy (RARP) and to use these annotations to predict the surgical urethral length (SUL).

Background: Urethral dissection during RARP affects patients' urinary incontinence (UI) outcomes and requires extensive training. Incontinence outcomes vary widely between urologists and hospitals, and surgeon experience and education are critical to optimal outcomes, so new approaches are warranted. SUL is associated with UI. Artificial intelligence (AI) surgical image segmentation using a CNN could automate SUL estimation and contribute toward future AI-assisted RARP and surgeon guidance.

Methods: Eighty-eight intraoperative RARP videos recorded between June 2009 and September 2014 were collected from a single center. Two hundred sixty-four frames were annotated for prostate, urethra, ligated plexus, and catheter. Thirty annotated images from different RARP videos were used as a test data set. The Dice similarity coefficient (DSC) and 95th percentile Hausdorff distance (Hd95) were used to determine model performance. SUL was calculated using the catheter as a reference.

Results: The DSCs of the best-performing model were 0.735 and 0.755 for the catheter and urethra classes, respectively, with Hd95 values of 29.27 and 72.62, respectively. The model performed moderately on the ligated plexus and prostate. The predicted SUL showed a mean difference of 0.64 to 1.86 mm versus human annotators, but with significant deviation (standard deviation = 3.28-3.56).

Conclusion: This study shows that an AI image segmentation model can predict vital structures during RARP urethral dissection with moderate to fair accuracy. The SUL estimation derived from it showed large deviations and outliers compared with human annotators, but a small mean difference (<2 mm). This is a promising development for further research on AI-assisted RARP.
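The segmentation models above are scored with the Dice similarity coefficient, DSC = 2|A ∩ B| / (|A| + |B|), where A and B are the predicted and ground-truth pixel sets. As a minimal sketch of how such a score is computed (the paper does not publish its evaluation code, so the function below is illustrative, operating on flattened binary masks):

```python
def dice_coefficient(pred, truth):
    """Dice similarity coefficient (DSC) between two binary masks.

    pred, truth: flat sequences of 0/1 pixel labels of equal length.
    Returns 2*|A ∩ B| / (|A| + |B|); by convention 1.0 when both masks are empty.
    """
    if len(pred) != len(truth):
        raise ValueError("masks must have the same number of pixels")
    intersection = sum(1 for p, t in zip(pred, truth) if p and t)
    total = sum(pred) + sum(truth)
    return 1.0 if total == 0 else 2.0 * intersection / total

# Toy 3x3 masks, flattened: 2 overlapping pixels, 3 predicted, 3 true
pred  = [1, 1, 1, 0, 0, 0, 0, 0, 0]
truth = [0, 1, 1, 1, 0, 0, 0, 0, 0]
print(round(dice_coefficient(pred, truth), 3))  # → 0.667
```

A DSC of 0.735-0.755, as reported for the catheter and urethra classes, thus means roughly three quarters of the combined predicted and true pixel mass overlaps.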
Keywords: Male; Humans; Prostate/surgery; Urethra/surgery; Prostatectomy/methods; Robotic Surgical Procedures/methods; Image Processing, Computer-Assisted/methods; Neural Networks, Computer; Artificial Intelligence; Urology; SDG 3 - Good Health and Well-being; prostate cancer; continence; anatomy recognition; urethral length
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 4 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Top 10% |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
