Published May 6, 2020 | Version v1
Software Open

On the Performance of Oversampling Techniques for Class Imbalance Problems

  • 1. University of Leiden
  • 2. Honda Research Institute Europe GmbH

Description

This file contains the source code used in the paper below:

Jiawen Kong, Thiago Rios, Wojtek Kowalczyk, Stefan Menzel and Thomas Bäck, “On the Performance of Oversampling Techniques for Class Imbalance Problems,” in Proceedings of the 24th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD 2020), Singapore, 11–14 May 2020. doi: 10.1007/978-3-030-47436-2_7

Although over 90 oversampling approaches have been developed in the imbalanced learning domain, most empirical studies and applications still rely on the “classical” resampling techniques. In this paper, several experiments on 19 benchmark datasets are set up to study the performance of six powerful oversampling approaches, including both “classical” and newer ones. According to our experimental results, the oversampling techniques that take the minority-class distribution into account (the newer ones) perform better in most cases, and RACOG gives the best performance among the six reviewed approaches. We further validate our conclusions on real-world-inspired vehicle datasets and find that applying oversampling techniques can improve performance by around 10%. In addition, seven data complexity measures are considered with the initial aim of investigating the relationship between data complexity measures and the choice of resampling technique. Although no obvious relationship emerges from our experiments, we find that the F1v value, a measure of class overlap that most researchers ignore, has a strong negative correlation with the potential AUC value (after resampling).
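
The archived code itself is not reproduced on this page, but a minimal sketch of the resample-then-evaluate workflow described above might look as follows. It assumes Python with the scikit-learn and imbalanced-learn packages, uses the classical SMOTE technique, a random forest classifier and a synthetic dataset purely as stand-ins for the paper's six approaches and 19 benchmark datasets, and compares AUC with and without oversampling.

    # Illustrative sketch only (not the authors' code): compare classifier AUC
    # before and after oversampling the minority class of the training split.
    from imblearn.over_sampling import SMOTE
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic imbalanced data as a placeholder for a benchmark dataset.
    X, y = make_classification(n_samples=2000, n_features=20,
                               weights=[0.95, 0.05], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, stratify=y, random_state=0)

    def auc_of(clf, X_tr, y_tr):
        """Fit a classifier on the given training data and return its test AUC."""
        clf.fit(X_tr, y_tr)
        return roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])

    # Baseline: train on the original, imbalanced training set.
    baseline = auc_of(RandomForestClassifier(random_state=0), X_train, y_train)

    # Oversample the minority class of the training split only, then retrain.
    X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)
    resampled = auc_of(RandomForestClassifier(random_state=0), X_res, y_res)

    print(f"AUC without oversampling: {baseline:.3f}")
    print(f"AUC with SMOTE oversampling: {resampled:.3f}")

Note that the resampling is applied only to the training split, so the test set keeps its original class distribution and the AUC comparison remains fair.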

Files

Files (13.2 kB)

md5:e149dd275cb6b1195369dbc350e7b8f1 · 13.2 kB

Additional details

Funding

ECOLE – Experience-based Computation: Learning to Optimise (grant no. 766186)
European Commission