Description
Title: Vision based motion tracking of surgical instruments in 3D space
Speaker: Maj Stenmark, Computer Science and Region Skåne
When: 6 October at 12.00-13.15
Where: Online
Abstract

Motion tracking during live surgeries may be used to assess surgeons' intra-operative performance, provide feedback, and predict outcomes. Current assessment protocols rely on human observation, controlled laboratory settings, or tracking technologies not suitable for live operating theatres. In this study, a novel method for motion tracking during live open-heart surgery was developed and evaluated.
To track and record the motion of the instruments, a 3D-printed tracking die with miniature markers was fitted to DeBakey forceps. The surgical field was recorded with a video camera mounted above the operating table, and software was developed to track the die in the recordings. The system was tested on five open-heart procedures. Surgeons were asked to report any system-related concerns during live surgery and to assess the weight of the die in a blind test. The accuracy of the system was evaluated against ground-truth data generated by a robot.
The vision-based motion tracking system was applicable to live surgeries with negligible inconvenience to the surgeons. Motion data were extracted with acceptable accuracy and speed at low computational cost.
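The abstract does not specify the marker system or the tracking software used, so the following is only a minimal sketch of how fiducial-marker pose tracking from an overhead camera is commonly implemented. It assumes ArUco-style markers on the tracking die, a pre-calibrated camera (camera_matrix, dist_coeffs), and OpenCV 4.7+ with the contrib aruco module; the marker size and dictionary are hypothetical placeholders.

```python
# Sketch: per-frame 3D pose of fiducial markers seen by an overhead camera.
# Not the authors' implementation; assumptions are noted in the comments.
import cv2
import numpy as np

MARKER_SIDE_MM = 5.0  # hypothetical miniature-marker side length

# 3D corners of one square marker in its own frame (Z = 0 plane),
# ordered as required by cv2.SOLVEPNP_IPPE_SQUARE.
OBJ_POINTS = np.array([
    [-MARKER_SIDE_MM / 2,  MARKER_SIDE_MM / 2, 0],
    [ MARKER_SIDE_MM / 2,  MARKER_SIDE_MM / 2, 0],
    [ MARKER_SIDE_MM / 2, -MARKER_SIDE_MM / 2, 0],
    [-MARKER_SIDE_MM / 2, -MARKER_SIDE_MM / 2, 0],
], dtype=np.float32)

# Hypothetical choice of marker dictionary.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def track_frame(frame, camera_matrix, dist_coeffs):
    """Return a list of (marker_id, rvec, tvec) poses detected in one video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    poses = []
    if ids is not None:
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            ok, rvec, tvec = cv2.solvePnP(
                OBJ_POINTS,
                marker_corners.reshape(4, 2).astype(np.float32),
                camera_matrix, dist_coeffs,
                flags=cv2.SOLVEPNP_IPPE_SQUARE)
            if ok:
                poses.append((int(marker_id), rvec, tvec))
    return poses
```

From the per-frame translations (tvec), motion metrics such as path length or instrument velocity can then be computed by differencing consecutive frames, which is the kind of motion data the study refers to.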
Period | 2021 Oct 6
---|---
Event type | Seminar
Location | Lund, Sweden
Related content

Projects
- Lund University AI Research (Project: Network)