Journal article in International Journal of Computer Assisted Radiology and Surgery, 2019

Automatic annotation of surgical activities using virtual reality environments

Abstract

Purpose - Annotation of surgical activities is becoming increasingly important for many recent applications such as surgical workflow analysis, surgical situation awareness, and the design of the operating room of the future, especially for training machine learning methods to develop intelligent assistance. Currently, annotation is mostly performed by observers with a medical background and is extremely costly and time-consuming, creating a major bottleneck for the above-mentioned technologies. In this paper, we propose a way to eliminate, or at least limit, human intervention in the annotation process.

Methods - Meaningful information about the interactions between objects is inherently available in virtual reality environments. We propose a strategy to automatically convert this information into annotations, producing individual surgical process models as output.

Validation - We implemented our approach in a peg-transfer task simulator and compared it to manual annotations. To assess the impact of our contribution, we studied both intra- and inter-observer variability.

Results and conclusion - On average, manual annotation took more than 12 min per minute of video to achieve low-level physical activity annotation, whereas automatic annotation is completed in less than a second for the same video period. We also demonstrated that manual annotation introduces mistakes as well as intra- and inter-observer variability, which our method eliminates thanks to its high precision and reproducibility.
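The conversion step described in the Methods section can be illustrated with a minimal sketch: a simulator emits timestamped interaction events (grasp/release with object identities), and pairing them per instrument yields low-level activity annotations. All names, the event schema, and the annotation fields below are assumptions made for illustration; the paper does not specify its implementation.

```python
from dataclasses import dataclass

# Hypothetical event schema: a VR simulator logs each grasp and
# release with a timestamp, the instrument involved, and the object
# touched. Pairing grasp/release events per instrument produces one
# low-level activity annotation per manipulation.

@dataclass
class Event:
    time: float      # simulation time in seconds
    kind: str        # "grasp" or "release"
    instrument: str  # e.g. "left grasper"
    target: str      # e.g. "peg 3"

def events_to_annotations(events):
    """Pair grasp/release events per instrument into activity records."""
    open_grasps = {}   # instrument -> pending grasp event
    annotations = []
    for e in sorted(events, key=lambda ev: ev.time):
        if e.kind == "grasp":
            open_grasps[e.instrument] = e
        elif e.kind == "release" and e.instrument in open_grasps:
            start = open_grasps.pop(e.instrument)
            annotations.append({
                "action": "transfer",          # assumed activity label
                "instrument": e.instrument,
                "target": start.target,
                "start": start.time,
                "stop": e.time,
            })
    return annotations

events = [
    Event(1.2, "grasp", "left grasper", "peg 3"),
    Event(4.7, "release", "left grasper", "peg 3"),
]
print(events_to_annotations(events))
```

Because the annotations are derived deterministically from logged events rather than from human observation, the same recording always yields the same annotation, which is the source of the reproducibility the authors report.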
Main file: IJCARS_2019.pdf (526.72 KB). Origin: files produced by the author(s).

Dates and versions

hal-02178714, version 1 (10-07-2019)

Identifiers

Cite

Arnaud Huaulmé, Fabien Despinoy, Saul Alexis Heredia Perez, Kanako Harada, Mamoru Mitsuishi, et al. Automatic annotation of surgical activities using virtual reality environments. International Journal of Computer Assisted Radiology and Surgery, 2019, 14 (10), pp.1663-1671. ⟨10.1007/s11548-019-02008-x⟩. ⟨hal-02178714⟩
166 views
540 downloads
