INVESTIGATING THE IMPACT OF HYPERPARAMETER TUNING ON THE PERFORMANCE OF DEEP LEARNING MODELS FOR TEXT CLASSIFICATION TASKS

Authors

  • Maham Baig
  • Muzamil Malik
  • Nisar Ahmed Memon
  • Sohail Ahmad

Keywords:

Deep learning models, natural language processing, text classification tasks, hyperparameters, Convolutional Neural Networks (CNNs), Long Short-Term Memory networks (LSTMs), BERT

Abstract

Deep learning models have revolutionized the field of natural language processing, especially in text classification tasks. The effectiveness of these models, however, depends heavily on how their hyperparameters are chosen. This study examined how hyperparameter optimization affects the performance of three popular deep neural architectures, Convolutional Neural Networks (CNNs), Long Short-Term Memory networks (LSTMs), and BERT, on widely used text classification datasets such as IMDb, AG News, and SST-2. The researchers used manual and automated tuning methods, including grid search and random search, to systematically explore important hyperparameters such as learning rate, batch size, dropout rate, number of epochs, and optimizer type. The results showed that the best hyperparameter settings yielded high accuracy, F1-score, and generalization across all three architectures. BERT achieved the best post-tuning accuracy at 95.1%, followed by LSTM at 91.3% and CNN at 90.1%. The findings confirmed that hyperparameter optimization is a decisive factor in maximizing model performance and minimizing overfitting. The research has practical implications for researchers and practitioners interested in developing efficient, high-performing text classification systems with deep learning.
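The grid search procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual code: the hyperparameter values and the `evaluate` function are hypothetical stand-ins, since a real run would train and validate a CNN, LSTM, or BERT classifier for each configuration.

```python
from itertools import product

# Hypothetical hyperparameter grid mirroring the dimensions studied
# in the paper (learning rate, batch size, dropout rate).
grid = {
    "learning_rate": [1e-3, 1e-4],
    "batch_size": [16, 32],
    "dropout": [0.2, 0.5],
}

def evaluate(config):
    """Placeholder for training and validating a model with `config`.

    A real implementation would fit a text classifier and return its
    validation accuracy; here a deterministic stand-in is used so the
    sketch is runnable.
    """
    return (0.9
            - config["learning_rate"] * 10
            - config["batch_size"] / 1000
            - abs(config["dropout"] - 0.3))

def grid_search(grid, score_fn):
    """Exhaustively score every combination and keep the best one."""
    keys = list(grid)
    best_cfg, best_score = None, float("-inf")
    for values in product(*(grid[k] for k in keys)):
        cfg = dict(zip(keys, values))
        score = score_fn(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = grid_search(grid, evaluate)
```

Random search, the other automated method the study mentions, differs only in sampling a fixed number of configurations from the grid instead of enumerating all of them, which scales better as the number of hyperparameters grows.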

Published

2026-04-28

How to Cite

Maham Baig, Muzamil Malik, Nisar Ahmed Memon, & Sohail Ahmad. (2026). INVESTIGATING THE IMPACT OF HYPERPARAMETER TUNING ON THE PERFORMANCE OF DEEP LEARNING MODELS FOR TEXT CLASSIFICATION TASKS. Spectrum of Engineering Sciences, 4(4), 1337–1345. Retrieved from https://www.thesesjournal.com/index.php/1/article/view/2574