Published June 30, 2020 | Version v1
Journal article | Open Access

ABC_LSTM: Optimizing Parameters of Deep LSTM using ABC Algorithm for Big Datasets

  • 1. Dept. of Computer Science, Guru Jambheshwar University of Science and Technology, Hisar, Haryana.

Description

The Long Short-Term Memory (LSTM) network is a variant of the Recurrent Neural Network (RNN) widely used across domains, particularly for sequence prediction tasks. In deep networks the number of hidden layers is high, and the time complexity of the network increases accordingly. Moreover, as datasets grow larger, tuning these complex networks manually becomes very difficult, since a single run may take days or weeks. Thus, to reduce the time required to run the algorithm and to achieve better accuracy, the task of tuning the network's parameters needs to be automated. To tune network parameters automatically, researchers have applied numerous metaheuristic approaches in the past, such as Ant Colony Optimization, Genetic Algorithms, and Simulated Annealing, which provide near-optimal solutions. In the proposed ABC_LSTM algorithm, the traditional Artificial Bee Colony (ABC) algorithm is applied to optimize the number of hidden neurons of an LSTM network with two hidden layers. Based on the experimental results, it can be concluded that, up to a certain point, increasing the number of bees and iterations yields the solution with the lowest MAE (mean absolute error), thereby improving the accuracy of the model.
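The paper itself does not include code; the following is a minimal sketch of the textbook Artificial Bee Colony loop (employed, onlooker, and scout phases) minimizing an objective over two integer hyperparameters, here the hidden-neuron counts of the two LSTM layers. The `surrogate_mae` function is a hypothetical stand-in for "train the LSTM and return its validation MAE", since real training would take far too long for a demonstration; the bounds, colony size, and abandonment limit are illustrative assumptions, not values from the paper.

```python
import random

def abc_optimize(objective, bounds, n_bees=10, n_iters=50, limit=10, seed=0):
    """Minimize `objective` over an integer box `bounds` with a basic
    Artificial Bee Colony: employed, onlooker, and scout phases."""
    rng = random.Random(seed)
    dim = len(bounds)

    def random_source():
        return [rng.randint(lo, hi) for lo, hi in bounds]

    def neighbor(x, partner):
        # Perturb one dimension toward/away from a randomly chosen partner.
        j = rng.randrange(dim)
        y = list(x)
        phi = rng.uniform(-1.0, 1.0)
        y[j] = round(x[j] + phi * (x[j] - partner[j]))
        lo, hi = bounds[j]
        y[j] = max(lo, min(hi, y[j]))
        return y

    sources = [random_source() for _ in range(n_bees)]
    costs = [objective(s) for s in sources]
    trials = [0] * n_bees
    best, best_cost = min(zip(sources, costs), key=lambda p: p[1])
    best = list(best)

    for _ in range(n_iters):
        # Employed bee phase: each food source tries one local move.
        for i in range(n_bees):
            cand = neighbor(sources[i], sources[rng.randrange(n_bees)])
            c = objective(cand)
            if c < costs[i]:
                sources[i], costs[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
        # Onlooker phase: better sources attract proportionally more moves.
        fits = [1.0 / (1.0 + c) for c in costs]
        total = sum(fits)
        for _ in range(n_bees):
            r, acc, i = rng.uniform(0.0, total), 0.0, 0
            for k, f in enumerate(fits):
                acc += f
                if r <= acc:
                    i = k
                    break
            cand = neighbor(sources[i], sources[rng.randrange(n_bees)])
            c = objective(cand)
            if c < costs[i]:
                sources[i], costs[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
        # Scout phase: abandon sources that stagnated past the limit.
        for i in range(n_bees):
            if trials[i] > limit:
                sources[i] = random_source()
                costs[i] = objective(sources[i])
                trials[i] = 0
        # Track the global best solution found so far.
        for s, c in zip(sources, costs):
            if c < best_cost:
                best, best_cost = list(s), c
    return best, best_cost

# Hypothetical stand-in for "train the 2-hidden-layer LSTM and return
# validation MAE": a smooth bowl whose minimum (MAE = 0.5) sits at
# (64, 32) hidden neurons.
def surrogate_mae(x):
    return ((x[0] - 64) ** 2 + (x[1] - 32) ** 2) / 1000.0 + 0.5

best, mae = abc_optimize(surrogate_mae, bounds=[(8, 128), (8, 128)])
```

In a real run, `surrogate_mae` would be replaced by a function that builds the LSTM with the candidate layer sizes, trains it, and returns the MAE on held-out data; the abstract's observation that more bees and iterations help "up to a certain point" corresponds to raising `n_bees` and `n_iters` until the extra objective evaluations no longer lower the best MAE found.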

Files (445.4 kB)

D7649049420.pdf (445.4 kB)
md5:8ec97e1477c107bd48697c082af425ef

Additional details

Related works

Is cited by
Journal article: 2249-8958 (ISSN)

ISSN
2249-8958
Retrieval Number
D7649049420/2020©BEIESP