A New Method of Text Classification Based on Recurrent Neural Network
Authors/Creators
Description
With the development of modern information technology, the number of Internet users continues to grow substantially, and the processing of massive data has become a hot spot in data research. Artificial Neural Networks (ANNs) play a crucial role in screening and processing big data, and have successfully solved many long-standing practical problems in fields such as computer vision, machine translation, and autonomous driving. Consequently, ANNs are increasingly applied to text classification in Natural Language Processing (NLP), which remains both a hot topic and a difficult problem. Neural networks can not only process massive data quickly and efficiently, but also improve classification accuracy to a certain extent. However, Chinese differs from English at both the character and word levels; in particular, the Chinese character and word inventories are far larger than those of English. Current Chinese text classification techniques still face problems in processing speed, accuracy, and word segmentation. In this paper, a subset of the THUCNews dataset, compiled from Sina news data, is extracted for text classification. First, a character-level word embedding matrix is proposed in which each vector has only 13 dimensions. Experimental comparison shows that this character-level representation achieves better classification results than word vectors trained with word2vec. A tower-shaped three-layer bidirectional network based on LSTM (Long Short-Term Memory) is also designed, connected to a fully connected head composed of three DNN (Deep Neural Network) layers. Experimental comparison shows that this network model achieves better results than the TextCNN network.
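The abstract states only that each character maps to a 13-dimensional vector; the construction details are not given. The following is a minimal sketch of the general idea of a character-level embedding table, with randomly initialised vectors standing in for the trainable parameters a real model would learn (the function names and corpus string are illustrative, not from the paper).

```python
import random

EMBED_DIM = 13  # per-character vector dimension stated in the abstract


def build_char_embeddings(corpus, seed=0):
    """Map each distinct character in the corpus to a 13-dim vector.

    In an actual model these rows would be trainable parameters updated
    by backpropagation; random initialisation stands in for that here.
    """
    rng = random.Random(seed)
    vocab = sorted(set(corpus))
    return {ch: [rng.uniform(-0.1, 0.1) for _ in range(EMBED_DIM)]
            for ch in vocab}


def embed(text, table):
    """Turn a string into a sequence of 13-dim vectors, one per character."""
    return [table[ch] for ch in text if ch in table]


# Illustrative usage on a tiny Chinese "corpus"
table = build_char_embeddings("新浪新闻文本分类")
vectors = embed("新闻分类", table)  # 4 characters -> 4 vectors of length 13
```

Because the table is indexed by single characters rather than segmented words, this representation sidesteps Chinese word segmentation entirely, which is one plausible reason the abstract reports it outperforming word2vec word vectors on this task.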
On the convolutional neural network side, this paper proposes a weight-compensation scheme to address the problem that the convolution kernel underweights edge information, and extends the scheme to higher dimensions. It also proposes an optimized pooling structure to address the erroneous feature extraction caused by traditional max pooling and average pooling. Experiments show that the optimized pooling is superior to max pooling in both training convergence speed and classification accuracy. In addition, a convolutional neural network with five parallel branches based on one-dimensional convolution is designed; experimental verification and analysis show that it achieves better results than the TextCNN structure. Finally, a CRNN (Convolutional Recurrent Neural Network), which combines a CNN (Convolutional Neural Network) with an RNN (Recurrent Neural Network), is designed; its text classification performance is likewise better than that of the TextCNN structure.
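The abstract does not specify the weight-compensation scheme, so the sketch below shows one plausible reading of the idea in one dimension: with "same" zero padding, output positions near the sequence edges see only part of the kernel over real input, so they are systematically attenuated; rescaling each edge output by the inverse of the kernel mass that overlapped real input compensates for this. All function names here are illustrative, not the paper's.

```python
def conv1d_same(x, kernel):
    """Plain 1D convolution with 'same' zero padding at both edges."""
    k = len(kernel)
    pad = k // 2
    padded = [0.0] * pad + list(x) + [0.0] * pad
    return [sum(padded[i + j] * kernel[j] for j in range(k))
            for i in range(len(x))]


def conv1d_compensated(x, kernel):
    """Same-padded convolution with edge weight compensation.

    Each output near an edge is rescaled by total_kernel_mass /
    overlapping_kernel_mass, so edge positions are not attenuated
    merely because part of the kernel fell on zero padding.
    """
    k = len(kernel)
    pad = k // 2
    n = len(x)
    out = conv1d_same(x, kernel)
    total = sum(abs(w) for w in kernel)
    for i in range(n):
        # kernel taps j that landed on real input at output position i
        lo = max(0, pad - i)
        hi = min(k, n + pad - i)
        overlap = sum(abs(kernel[j]) for j in range(lo, hi))
        if overlap and overlap < total:
            out[i] *= total / overlap
    return out
```

For a constant input [1, 1, 1, 1, 1] and kernel [1, 1, 1], plain same-padding gives [2, 3, 3, 3, 2], while the compensated version restores the edges to 3, treating boundary positions on a par with interior ones.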
Files
| Name | Size |
|---|---|
| ijaet v5-1-2023-03.pdf (md5:f0de90c4c2594d7d1e0995bf62b1fae3) | 608.7 kB |