HOSPI-Tools Dataset - DSLR
Description
We are working to develop a comprehensive dataset of surgical tools organised by speciality, with a hierarchical structure: speciality, pack, set and tool. We believe this dataset can be useful for computer vision and deep learning research into surgical tool tracking and management, as well as surgical training and audit. We have therefore created an initial dataset of surgical tool (instrument and implant) images, captured under different lighting conditions and with different backgrounds. We captured RGB images of surgical tools using a DSLR camera and a webcam on site in a major hospital, under realistic conditions and with the surgical tools currently in use. Image backgrounds in our initial dataset were essentially flat colours, although several different background colours were used. As we further develop the dataset, we will try to include much greater occlusion, illumination changes, and the presence of blood, tissue and smoke in the images, which would be more reflective of crowded, messy, real-world conditions.
Illumination sources included natural light (direct sunlight and shade) as well as LED, halogen and fluorescent lighting, accurately reflecting the working illumination conditions within the hospital. Distances from the camera to the surgical tools ranged from 60 to 150 cm, and the average class size was 74 images. Captured images included individual objects as well as cluttered, clustered and occluded objects. Our initial focus was on Orthopaedics and General Surgery, two of the 14 surgical specialities. We selected these specialities because general surgery instruments are the most commonly used tools across all surgeries and provide instrument volume, while orthopaedics provides variety and complexity, given the wide range of procedures, instruments and implants used in orthopaedic surgery. We will add other specialities as we develop this dataset, to reflect the complexities inherent in each surgical speciality. The dataset was designed to offer a large variety of tools, arranged hierarchically to reflect how surgical tools are organised in real-world conditions.
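The speciality → pack → set → tool hierarchy maps naturally onto a nested directory layout. As a minimal sketch, assuming a hypothetical layout of `speciality/pack/set/tool/image.jpg` (the dataset's actual folder names and depth may differ), the images can be indexed per tool class like this:

```python
from collections import Counter
from pathlib import Path

IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png"}

def index_tools(root):
    """Count images per tool class in an assumed four-level layout:
    root/speciality/pack/set/tool/image.ext.

    This layout is an assumption for illustration, not the dataset's
    documented structure. Returns a Counter keyed by
    (speciality, pack, set, tool).
    """
    root = Path(root)
    counts = Counter()
    # Files at depth 5 relative to root: speciality/pack/set/tool/file
    for img in root.glob("*/*/*/*/*"):
        if img.is_file() and img.suffix.lower() in IMAGE_EXTENSIONS:
            speciality, pack, set_name, tool = img.relative_to(root).parts[:4]
            counts[(speciality, pack, set_name, tool)] += 1
    return counts
```

Keying the counts by the full hierarchy tuple keeps the class labels unambiguous even when the same set or tool name recurs under different specialities or packs.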
If you find our dataset useful, please cite our papers in your work:
Rodrigues, M., Mayo, M., and Patros, P. (2022). OctopusNet: Machine Learning for Intelligent Management of Surgical Tools. Smart Health, Volume 23, 2022. https://doi.org/10.1016/j.smhl.2021.100244
Rodrigues, M., Mayo, M., and Patros, P. (2021). Evaluation of Deep Learning Techniques on a Novel Hierarchical Surgical Tool Dataset. Accepted paper at the 2021 Australasian Joint Conference on Artificial Intelligence. To be published in the Lecture Notes in Computer Science series.
Rodrigues, M., Mayo, M., and Patros, P. (2021). Interpretable Deep Learning for Surgical Tool Management. In M. Reyes, P. Henriques Abreu, J. Cardoso, M. Hajij, G. Zamzmi, P. Rahul, and L. Thakur (Eds.), Proc. 4th International Workshop on Interpretability of Machine Intelligence in Medical Image Computing (iMIMIC 2021), LNCS 12929 (pp. 3-12). Cham: Springer.