TY - JOUR
T1 - A deep learning model based on fusion images of chest radiography and X-ray sponge images supports human visual characteristics of retained surgical items detection
AU - Kawakubo, Masateru
AU - Waki, Hiroto
AU - Shirasaka, Takashi
AU - Kojima, Tsukasa
AU - Mikayama, Ryoji
AU - Hamasaki, Hiroshi
AU - Akamine, Hiroshi
AU - Kato, Toyoyuki
AU - Baba, Shingo
AU - Ushiro, Shin
AU - Ishigami, Kousei
N1 - Funding Information:
This work was supported by a grant from The Clinical Research Promotion Foundation, Japan (2020).
Publisher Copyright:
© 2022, CARS.
PY - 2022
Y1 - 2022
N2 - Purpose: Although novel deep learning software has been proposed that uses post-processed images obtained by fusing normal post-operative chest radiographs with X-ray images of surgical sponges, the association between its retained surgical item detectability and human visual evaluation has not been sufficiently examined. In this study, we investigated the association of retained surgical item detectability between deep learning and human subjective evaluation. Methods: A deep learning model was constructed from 2987 training images and 1298 validation images, which were obtained by post-processing fusions of normal post-operative radiographs and surgical sponge X-ray images. Another 800 images were then used for evaluation: 400 with and 400 without a surgical sponge. The detection characteristics of retained sponges for the model and for a general observer with 10 years of clinical experience were analyzed using receiver operating characteristic analysis. Results: The following values were derived from the deep learning model and the observer, respectively: probability cutoff values were 0.37 and 0.45; areas under the curves were 0.87 and 0.76; sensitivity values were 85% and 61%; and specificity values were 73% and 92%. Conclusion: For the detection of surgical sponges, we concluded that the deep learning model has higher sensitivity, whereas the human observer has higher specificity. These complementary characteristics indicate that a deep learning system could support the clinical workflow in operating rooms for the prevention of retained surgical items.
AB - Purpose: Although novel deep learning software has been proposed that uses post-processed images obtained by fusing normal post-operative chest radiographs with X-ray images of surgical sponges, the association between its retained surgical item detectability and human visual evaluation has not been sufficiently examined. In this study, we investigated the association of retained surgical item detectability between deep learning and human subjective evaluation. Methods: A deep learning model was constructed from 2987 training images and 1298 validation images, which were obtained by post-processing fusions of normal post-operative radiographs and surgical sponge X-ray images. Another 800 images were then used for evaluation: 400 with and 400 without a surgical sponge. The detection characteristics of retained sponges for the model and for a general observer with 10 years of clinical experience were analyzed using receiver operating characteristic analysis. Results: The following values were derived from the deep learning model and the observer, respectively: probability cutoff values were 0.37 and 0.45; areas under the curves were 0.87 and 0.76; sensitivity values were 85% and 61%; and specificity values were 73% and 92%. Conclusion: For the detection of surgical sponges, we concluded that the deep learning model has higher sensitivity, whereas the human observer has higher specificity. These complementary characteristics indicate that a deep learning system could support the clinical workflow in operating rooms for the prevention of retained surgical items.
UR - http://www.scopus.com/inward/record.url?scp=85145181398&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85145181398&partnerID=8YFLogxK
U2 - 10.1007/s11548-022-02816-8
DO - 10.1007/s11548-022-02816-8
M3 - Article
C2 - 36583837
AN - SCOPUS:85145181398
SN - 1861-6410
JO - International Journal of Computer Assisted Radiology and Surgery
JF - International Journal of Computer Assisted Radiology and Surgery
ER -