Published February 11, 2022 | Version v1
Report | Open Access

Confidence-driven Weighted Retraining for Predicting Safety-Critical Failures in Autonomous Driving Systems


Safe handling of hazardous driving situations is a task of high practical relevance for building reliable and trustworthy cyber-physical systems such as autonomous driving systems. This task requires an accurate prediction of the vehicle's confidence to prevent potentially harmful system failures when unpredictable conditions make it less safe to drive. In this paper, we discuss the challenges of adapting a misbehavior predictor with knowledge mined during the execution of the main system. Then, we present a framework for the continual learning of misbehavior predictors, which records in-field behavioral data to determine which data are appropriate for adaptation. Our framework guides adaptive retraining using a novel combination of in-field confidence metric selection and reconstruction error-based weighting. We evaluate our framework by improving a misbehavior predictor from the literature on the Udacity simulator for self-driving cars. Our results show that our framework can reduce the false positive rate by a large margin and can adapt to nominal behavior drifts while maintaining the original capability to predict failures up to several seconds in advance.
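The abstract mentions reconstruction error-based weighting for retraining. The following minimal sketch illustrates the general idea (it is not the paper's actual implementation): samples an autoencoder reconstructs poorly are treated as novel or under-represented and receive proportionally larger weights in the retraining loss. The function names and normalization scheme are assumptions for illustration only.

```python
import numpy as np

def reconstruction_weights(errors, eps=1e-8):
    """Map per-sample reconstruction errors to retraining weights.

    High-error samples are assumed to be novel/under-represented, so
    they get proportionally larger weights. Weights are normalized to
    sum to len(errors), keeping the weighted loss on the same scale
    as the unweighted one.
    """
    errors = np.asarray(errors, dtype=float)
    w = errors + eps  # eps avoids zero weights for perfectly reconstructed samples
    return w * (len(w) / w.sum())

def weighted_mse(y_true, y_pred, weights):
    """Weighted mean squared error, a possible retraining objective."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.average((y_true - y_pred) ** 2, weights=weights))
```

For example, with reconstruction errors `[0.1, 0.4, 0.5]`, the third sample would receive the largest weight, pushing the retrained predictor to fit the newly observed, poorly reconstructed driving conditions.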



Files (9.8 MB)


Additional details

Related works

Is obsoleted by
Journal article: 10.1002/smr.2386 (DOI)


Funding: PRECRIME – Self-assessment Oracles for Anticipatory Testing (grant no. 787703), European Commission