
Published October 25, 2021 | Version v1
Conference paper | Open Access

Some like it tough: Improving model generalization via progressively increasing the training difficulty

  • 1. JOANNEUM RESEARCH

Description

In this work, we propose to progressively increase the training difficulty while learning a neural network model, via a novel strategy which we call mini-batch trimming. This strategy ensures that, in the later stages of training, the optimizer focuses on the more difficult samples, which we identify as the ones with the highest loss in the current mini-batch. The strategy is very easy to integrate into an existing training pipeline and does not require any change to the network model. Experiments on several image classification problems show that mini-batch trimming is able to increase the generalization ability (measured via final test error) of the trained model.
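To make the idea concrete, the sketch below shows one way mini-batch trimming could be wired into a standard PyTorch training loop: per-sample losses are computed without reduction, and only the highest-loss samples of each mini-batch contribute to the gradient, with the kept fraction shrinking over epochs. This is an illustrative assumption-laden sketch, not the authors' reference implementation; the schedule `keep_fraction` and its `final_fraction` parameter are hypothetical choices.

```python
import torch
import torch.nn.functional as F

def keep_fraction(epoch, total_epochs, final_fraction=0.5):
    # Hypothetical schedule: linearly shrink the kept fraction from 1.0
    # to `final_fraction`, so later epochs focus on the hardest samples.
    return 1.0 - (1.0 - final_fraction) * epoch / max(total_epochs - 1, 1)

def train_one_epoch(model, loader, optimizer, epoch, total_epochs, device="cpu"):
    model.train()
    for inputs, targets in loader:
        inputs, targets = inputs.to(device), targets.to(device)
        optimizer.zero_grad()
        logits = model(inputs)
        # Per-sample losses (no reduction) so samples can be ranked by difficulty.
        losses = F.cross_entropy(logits, targets, reduction="none")
        k = max(1, int(keep_fraction(epoch, total_epochs) * losses.numel()))
        # "Trimming": keep only the k highest-loss samples of this mini-batch.
        hardest = torch.topk(losses, k).values
        hardest.mean().backward()
        optimizer.step()
```

Because the trimming happens purely on the loss tensor, no change to the model or the data pipeline is needed, which matches the integration claim in the description.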

Files

aspai_2021_fassold_pdf_version.pdf (349.2 kB)
md5:da0eca6acff8e644aa86bdfaff378840

Additional details

Funding

AI4Media – A European Excellence Centre for Media, Society and Democracy (Grant No. 951911), European Commission