An Efficient Two-State Feature Attention-Based GRU for Text Classification
Abstract
Text classification has become crucial for automatically sorting documents into predefined categories. The goal of classification is to assign a predefined group or class to an instance based on its characteristics. To achieve precise text categorization, a feature selection scheme is employed to identify significant features and eliminate irrelevant, undesirable, and noisy ones, thereby reducing the dimensionality of the feature space. Many advanced deep learning algorithms have been developed to address the challenges of text classification, and recurrent neural networks (RNNs) are broadly employed in such tasks. In this paper, we propose a novel Two-State GRU based on a feature attention strategy, termed the Two-State Feature Attention GRU (TS-FA-GRU). The proposed framework identifies and categorizes word polarity through consecutive mechanisms and word-feature capture. Furthermore, the model incorporates a pre-feature-attention GRU to capture essential features at an early stage, followed by a post-feature-attention GRU that mimics a decoder to refine the extracted features. To enhance computational performance, the reset gate of the standard GRU is replaced with the update gate, which reduces redundancy and complexity. The effectiveness of the developed model was evaluated on five benchmark text datasets and compared with five well-established traditional text classification methods. The proposed TS-FA-GRU model demonstrated superior performance over these approaches in terms of convergence rate and accuracy, achieving text classification accuracies of 93.86%, 92.69%, 94.73%, 92.46%, and 88.23% on the 20NG, R21578, AG News, IMDB, and Amazon Review datasets, respectively.
Moreover, the results indicated that the proposed model effectively minimized the loss function and captured long-term dependencies, leading to exceptional outcomes compared to the traditional approaches.
DOI: https://doi.org/10.31449/inf.v49i15.7474
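The abstract's gate modification (replacing the standard GRU's reset gate with the update gate) can be illustrated with a short sketch. This is a minimal NumPy sketch of one plausible reading, in which the update gate z also modulates the candidate state so the separate reset-gate weights are removed; the function and parameter names (`simplified_gru_step`, `Wz`, `Uz`, `Wh`, `Uh`) are illustrative and not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simplified_gru_step(x, h, params):
    """One step of a GRU cell in which the update gate plays the role of
    the reset gate (hypothetical reading of the abstract): z gates both
    the state interpolation and the candidate computation, so the
    reset-gate weights W_r, U_r, b_r are eliminated."""
    Wz, Uz, bz, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    h_tilde = np.tanh(Wh @ x + Uh @ (z * h) + bh)   # candidate state, gated by z instead of r
    return (1.0 - z) * h + z * h_tilde              # interpolate old and candidate state

# Toy usage: 4-dim input embeddings, 3-dim hidden state, a 5-token "sentence".
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = (
    rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h),
    rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h),
)
h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):
    h = simplified_gru_step(x, h, params)
print(h.shape)
```

Because the hidden state is always a convex combination of the previous state and a tanh candidate, its entries stay in (-1, 1); the simplification drops one full set of gate parameters relative to a standard GRU, which is consistent with the abstract's claim of reduced redundancy and complexity.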
License
Authors retain copyright in their work. By submitting to and publishing with Informatica, authors grant the publisher (Slovene Society Informatika) the non-exclusive right to publish, reproduce, and distribute the article and to identify itself as the original publisher.
All articles are published under the Creative Commons Attribution license CC BY 3.0. Under this license, others may share and adapt the work for any purpose, provided appropriate credit is given and changes (if any) are indicated.
Authors may deposit and share the submitted version, accepted manuscript, and published version, provided the original publication in Informatica is properly cited.