
doi: 10.1002/rcs.70083
pmid: 40576152
ABSTRACT
Background: Accurate detection of surgical instruments is essential for robot‐assisted surgery. Existing methods face challenges in both accuracy and real‐time performance, limiting their clinical applicability.
Methods: We propose UK‐YOLOv10, a novel framework that integrates two innovations: the uni‐fusion attention module (UFAM) for enhanced multi‐scale feature representation, and the C2fKAN module, which employs KAN convolution to improve classification accuracy and accelerate training.
Results: On the M2CAI16‐Tool‐Locations dataset, UK‐YOLOv10 achieves a detection accuracy of 96.7%, an mAP@0.5 of 96.4%, and an mAP@0.5:0.95 of 0.605, outperforming YOLOv10 by 3%, 2.2%, and 3.6%, respectively. Generalisation on COCO2017 yielded an mAP@0.5:0.95 of 0.386.
Conclusion: UK‐YOLOv10 significantly improves surgical instrument detection and demonstrates strong potential for robot‐assisted surgeries.
Keywords: Deep Learning, Robotic Surgical Procedures, Humans, Surgical Instruments, Algorithms
