Report Open Access

Fail-Safe Execution of Deep Learning based Systems through Uncertainty Monitoring

Michael Weiss; Paolo Tonella


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="DOI">10.5281/zenodo.5055710</identifier>
  <creators>
    <creator>
      <creatorName>Michael Weiss</creatorName>
      <affiliation>Università della Svizzera italiana</affiliation>
    </creator>
    <creator>
      <creatorName>Paolo Tonella</creatorName>
      <affiliation>Università della Svizzera italiana</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Fail-Safe Execution of Deep Learning based Systems through Uncertainty Monitoring</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2020</publicationYear>
  <dates>
    <date dateType="Issued">2020-09-01</date>
  </dates>
  <resourceType resourceTypeGeneral="Report"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/5055710</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsObsoletedBy">10.1109/ICST49551.2021.00015</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.5281/zenodo.5055709</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="https://creativecommons.org/licenses/by/4.0/legalcode">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;Modern software systems rely on Deep Neural Networks (DNN) when processing complex, unstructured inputs, such as images, videos, natural language texts or audio signals. Given the intractably large size of such input spaces, the intrinsic limitations of learning algorithms and the ambiguity about the expected predictions for some of the inputs, not only is there no guarantee that a DNN's predictions are always correct, but developers must rather safely assume a low, though not negligible, error probability. A fail-safe Deep Learning based System (DLS) is one equipped to handle DNN faults by means of a supervisor, capable of recognizing predictions that should not be trusted and that should activate a healing procedure bringing the DLS to a safe state.&lt;/p&gt;

&lt;p&gt;In this paper, we propose an approach to use DNN uncertainty estimators to implement such a supervisor. We first discuss advantages and disadvantages of existing approaches to measure uncertainty for DNNs and propose novel metrics for the empirical assessment of the supervisor that rely on such approaches. We then describe our publicly available tool Uncertainty-Wizard, which allows transparent estimation of uncertainty for regular tf.keras DNNs. Lastly, we discuss a large-scale study conducted on four different subjects to empirically validate the approach, reporting the lessons learned as guidance for software engineers who intend to monitor uncertainty for fail-safe execution of DLS.&lt;/p&gt;</description>
  </descriptions>
  <fundingReferences>
    <fundingReference>
      <funderName>European Commission</funderName>
      <funderIdentifier funderIdentifierType="Crossref Funder ID">10.13039/100010661</funderIdentifier>
      <awardNumber awardURI="info:eu-repo/grantAgreement/EC/H2020/787703/">787703</awardNumber>
      <awardTitle>Self-assessment Oracles for Anticipatory Testing</awardTitle>
    </fundingReference>
  </fundingReferences>
</resource>
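
The abstract describes a supervisor that flags untrusted DNN predictions via uncertainty estimation for regular tf.keras models. Below is a minimal, hedged sketch of that general idea using Monte-Carlo dropout on a plain tf.keras classifier; it is not the authors' implementation and not the Uncertainty-Wizard API, and the sample count and uncertainty threshold are illustrative assumptions that would need subject-specific tuning.

import numpy as np
import tensorflow as tf

def mc_dropout_predict(model: tf.keras.Model, x: np.ndarray, samples: int = 32):
    """Run `samples` stochastic forward passes (dropout kept active via training=True)
    and return the mean class probabilities plus the predictive entropy per input."""
    preds = np.stack([model(x, training=True).numpy() for _ in range(samples)])
    mean = preds.mean(axis=0)                               # averaged softmax outputs
    entropy = -np.sum(mean * np.log(mean + 1e-12), axis=1)  # predictive entropy as uncertainty
    return mean, entropy

def supervised_predict(model: tf.keras.Model, x: np.ndarray, threshold: float = 0.5):
    """Accept a prediction only if its uncertainty is below the (assumed) threshold;
    otherwise mark it as untrusted so a healing procedure can bring the DLS to a safe state."""
    mean, entropy = mc_dropout_predict(model, x)
    labels = mean.argmax(axis=1)
    trusted = entropy < threshold
    return labels, trusted

In this sketch, inputs whose entropy exceeds the threshold are handed to the fail-safe path rather than acted upon; the paper's metrics are about empirically assessing how well such a threshold-based supervisor separates correct from incorrect predictions.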
Statistics          All versions   This version
Views                         27             27
Downloads                     28             28
Data volume              25.3 MB        25.3 MB
Unique views                  23             23
Unique downloads              25             25
