
Published May 1, 2014 | Version v1 | Journal article | Open access

A family of discriminative training criteria based on the F-divergence for deep neural networks

Description

We present novel bounds on the classification error which are based on the f-divergence and which, at the same time, can be used as practical training criteria. Virtually no studies have investigated the link between the f-divergence, the classification error, and practical training criteria. So far, only the Kullback-Leibler divergence has been examined in this context, to formulate a bound on the classification error and to derive the cross-entropy criterion. We extend this concept to a larger class of f-divergences. We further investigate whether the novel training criteria based on the f-divergence are suited for frame-wise training of deep neural networks, with positive results on the Babel Vietnamese and Bengali speech recognition tasks.
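To make the idea concrete, the sketch below computes a frame-wise f-divergence criterion between a one-hot target distribution and a network's softmax posterior, D_f(P‖Q) = Σ_x q(x) f(p(x)/q(x)). The choice of generators (Kullback-Leibler, which recovers cross-entropy for one-hot targets, and squared Hellinger) is illustrative only; these are not necessarily the members of the family evaluated in the article.

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P || Q) = sum_x q(x) * f(p(x) / q(x)).

    p, q: discrete distributions over the same classes.
    f: convex generator with f(1) = 0.
    """
    eps = 1e-12  # guard against division by zero / log(0)
    return float(np.sum(q * f((p + eps) / (q + eps))))

# Two hypothetical generators from the f-divergence family:
f_kl = lambda t: t * np.log(t)                 # Kullback-Leibler
f_hellinger = lambda t: (np.sqrt(t) - 1) ** 2  # squared Hellinger

# Frame-wise criterion: one-hot target vs. the model's posterior.
target = np.array([0.0, 1.0, 0.0])
posterior = np.array([0.2, 0.7, 0.1])

kl_loss = f_divergence(target, posterior, f_kl)
hellinger_loss = f_divergence(target, posterior, f_hellinger)
```

For a one-hot target, the KL criterion reduces to -log of the posterior assigned to the correct class, i.e. the standard cross-entropy loss; swapping in a different generator yields a different member of the family with the same interface.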

Files

article.pdf (127.9 kB)
md5:7fa2a4467a17482b193007a25c26eed1