Published November 26, 2020 | Version v1
Poster | Open Access

Deep Learning-based Object Detection for a Quality Control Application in the Sterilization Unit of a Hospital

  • University of the Balearic Islands

Description

Machine vision systems are emerging as increasingly popular solutions in automated quality and process control applications. By allowing non-contact, and therefore non-destructive, inspection, machine vision techniques are especially suitable when correct handling of the object under inspection is critical. This is the quality control problem we address in this work: the detection of a series of control elements placed in the boxes and bags that contain the surgical instruments with which surgeons and nurses must be supplied before starting a surgical procedure. The presence of these elements shows that the instruments have been correctly subjected to the required sanitation processes. To address this problem, we detect rectangular areas of the image that contain the objects of interest (traceability label, seal, various paper-tape filters that change with temperature, etc.). Additionally, we consider the problem of detecting oriented rectangular areas (not parallel to the image axes) to improve localization. The detector makes use of deep convolutional neural networks that determine the parameters of the rectangular areas by regression, adopting a two-stage strategy: the first stage locates the object through a rectangular box (bounding box) parallel to the image axes; the second stage estimates the parameters of the maximal oriented box contained in the first and containing the object. After training the system with a collection of representative images, in which the objects to be detected appear at various orientations and scales, the ability of the system to detect the control elements in a collection of test images has been verified. Experimentation began with a preliminary clustering step to determine a set of appropriate scales and aspect ratios as starting points for the regressions. Next, various box parameterizations were considered to determine which one allowed the highest correct detection rate. The final results indicate, for the non-oriented boxes, an average precision (P) greater than 98% at an average sensitivity (R) greater than 93%, using 4 clusters; for the oriented boxes, P = 66% and R = 73%, using a 2-parameter model to orient the boxes.
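The clustering step mentioned above can be illustrated with a minimal sketch: plain k-means over the (width, height) pairs of the annotated boxes, whose cluster centers serve as starting scales and aspect ratios for the regression. The function name, the Euclidean distance metric, and the fixed seed are illustrative assumptions, not details from the poster.

```python
import numpy as np

def anchor_clusters(boxes, k=4, iters=100, seed=0):
    """Cluster annotated box dimensions with k-means.

    boxes : (N, 2) array of (width, height) pairs.
    Returns the (k, 2) cluster centers, usable as initial
    scales/aspect ratios for the box regression.
    """
    rng = np.random.default_rng(seed)
    # Initialize centers with k distinct annotated boxes.
    centers = boxes[rng.choice(len(boxes), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # Distance of every box to every center, shape (N, k).
        d = np.linalg.norm(boxes[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned boxes;
        # keep the old center if a cluster becomes empty.
        new = np.array([boxes[assign == j].mean(axis=0) if np.any(assign == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers
```

In practice, anchor-clustering schemes often replace the Euclidean distance with an IoU-based distance; the poster does not specify which variant was used.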
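For the oriented-box stage, once the orientation has been regressed, the corners of the box follow from its center, size, and rotation angle. The poster's 2-parameter orientation model is not detailed here, so this sketch simply assumes an angle is available (for instance, recovered from a predicted (cos θ, sin θ) pair); the function name is illustrative.

```python
import numpy as np

def oriented_corners(cx, cy, w, h, theta):
    """Corner coordinates of an oriented rectangle.

    (cx, cy) : box center; w, h : box width and height;
    theta    : rotation angle in radians (counter-clockwise).
    Returns a (4, 2) array of corner (x, y) coordinates.
    """
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])            # 2-D rotation matrix
    # Corners of the axis-aligned box centered at the origin.
    half = np.array([[-w / 2, -h / 2],
                     [ w / 2, -h / 2],
                     [ w / 2,  h / 2],
                     [-w / 2,  h / 2]])
    # Rotate, then translate to the box center.
    return half @ R.T + np.array([cx, cy])
```

With theta = 0 this degenerates to the axis-aligned bounding box of the first stage.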

Notes

This work is also supported by projects EU-H2020 BUGWRIGHT2 (GA 871260), PGC2018-095709-B-C21 (MCIU/AEI/FEDER, UE), and PROCOE/4/2017 (Govern Balear, 50% P.O. FEDER 2014-2020 Illes Balears).

Files

idisba2020_poster_30x40_Yao_Ortiz_Bonnin.pdf (3.9 MB; md5:a2a57bbf67a0b713dfb675ca99a2de4a)

Additional details

Funding

ROBINS – Robotics Technology for Inspection of Ships 779776
European Commission
BugWright2 – Autonomous Robotic Inspection and Maintenance on Ship Hulls and Storage Tanks 871260
European Commission