Journal article Open Access

Towards Growing Self-Organizing Neural Networks with Fixed Dimensionality

Guojian Cheng; Tianshi Liu; Jiaxin Han; Zheng Wang

Competitive learning is an adaptive process in which the neurons of a neural network gradually become sensitive to different input pattern clusters. Competitive learning is the basic idea behind Kohonen's Self-Organizing Feature Maps (SOFM). SOFM generate mappings from high-dimensional signal spaces to lower-dimensional topological structures. The main features of these mappings are topology preservation, feature mapping, and approximation of the probability distribution of the input patterns. To overcome some limitations of SOFM, e.g., a fixed number of neural units and a topology of fixed dimensionality, a Growing Self-Organizing Neural Network (GSONN) can be used. A GSONN can change its topological structure during learning: it grows by learning and shrinks by forgetting. To speed up training and convergence, a new variant of GSONN, the twin growing cell structures (TGCS), is presented here. This paper first gives an introduction to competitive learning, SOFM and its variants. Then, we discuss some GSONN with fixed dimensionality, including growing cell structures, its variants and the authors' model, TGCS. The paper ends with a comparison of test results and conclusions.
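As background for the abstract's description of competitive learning, the following is a minimal sketch of a Kohonen-style SOFM update on a fixed 2-D grid; the function name, grid size, and decay schedules are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def train_sofm(data, grid_shape=(10, 10), epochs=20,
               eta0=0.5, sigma0=3.0, seed=0):
    """Minimal Kohonen SOFM sketch: competitive learning on a fixed 2-D grid
    (illustrative parameters, not the paper's TGCS model)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    n_units, dim = rows * cols, data.shape[1]
    weights = rng.random((n_units, dim))                  # codebook vectors
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)])

    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            eta = eta0 * (1.0 - t)                        # decaying learning rate
            sigma = sigma0 * (1.0 - t) + 1e-3             # shrinking neighborhood
            # Competition: the best-matching unit (winner) is the closest codebook vector.
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Cooperation: grid neighbors of the winner also move toward the input,
            # which is what yields the topology-preserving mapping.
            grid_dist2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))
            weights += eta * h[:, None] * (x - weights)
            step += 1
    return weights.reshape(rows, cols, dim)

# Usage example: map 3-D input points onto a 2-D grid.
if __name__ == "__main__":
    X = np.random.default_rng(1).random((500, 3))
    codebook = train_sofm(X)
    print(codebook.shape)  # (10, 10, 3)
```

Note that the grid here has a fixed number of units and fixed dimensionality; the growing models discussed in the paper (GCS, TGCS) relax exactly these constraints.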

Files (412.1 kB)
Name: 7656.pdf
Size: 412.1 kB
md5: c52bb8276ae3d1afd4dc0ab144e2ffd4
