Project deliverable Open Access
Nyi Nyi Htun; Diego Rojo Garcia; Katrien Verbert
This document presents an update of Deliverable 5.3, in which we demonstrate a decision support system that uses visualisation techniques to put domain experts in the loop of feature selection for the development of interpretable machine learning models. In the previous version of the deliverable, we presented AHMoSE, a system that compares and explains the predicted outcomes of various machine learning models and helps domain experts select the models that best fit their knowledge. However, we also discovered that factors such as end-users' understanding of individual features and the number of features can influence the interpretability of a system. Thus, end-users should be involved from the feature selection process onwards.
In this deliverable, we present a system named GaCoVi (Gapped Correlation Visualisation), designed to put viticulture experts in the loop of the feature selection process, which is a preliminary step to the decision support offered by AHMoSE. We used two of the pilots, AUA and INRAE, to demonstrate GaCoVi with two real-life datasets.
This document is structured as follows. Section 1 introduces the deliverable, describing the previous version and our motivations. Section 2 describes GaCoVi together with the development technology we used. Section 3 provides a usage manual with instructions on how to obtain the source code. Section 4 concludes with a summary of the deliverable.
D5.3 - Trust-aware decision support system_v3_(Submitted to EC).pdf