Automated Detection and Prevention of Overfitting in Neural Networks

Keywords: overfitting, neural networks, model generalization, automated detection, prevention strategies, regularization techniques, noise, optimization

Abstract

This article examines overfitting in neural networks, a problem that has grown in importance with the increasing complexity of artificial intelligence models. As neural networks become more intricate and powerful, the risk of overfitting, in which a model learns the noise and idiosyncrasies of the training data instead of generalizing to new data, becomes a significant concern. The mechanisms of overfitting and its detrimental impact on model performance are investigated, emphasizing the need for sophisticated approaches to detect and mitigate the problem. The article reviews detection methods, ranging from simple statistical measures to model-based approaches, and highlights their strengths and limitations. Preventive strategies, such as regularization, dropout, and ensemble techniques, are also surveyed, all aimed at improving a model's generalization. Recent advances in automated machine learning (AutoML) and hyperparameter optimization are considered for their effectiveness in restraining overtraining without sacrificing accuracy. The article introduces noise injection as a means of suppressing overfitting and demonstrates that noise can dampen overfitting in a multilayer perceptron (MLP) and a long short-term memory (LSTM) network. The work further shows that noise can be a beneficial remedy for overfitting in a Hopfield neural network (HNN) and, more importantly, suggests that the imperfections of semiconductor devices can serve as a rich source of solutions for accelerating hardware technologies in the era of artificial intelligence.
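Two of the ideas summarized above, detecting overfitting from diverging training and validation curves, and suppressing it by injecting noise into the inputs, can be illustrated with a minimal sketch. This is not code from the article; the function names (`overfitting_alarm`, `inject_noise`) and the `patience` parameter are illustrative assumptions, written here only to make the two mechanisms concrete.

```python
import numpy as np

def overfitting_alarm(train_losses, val_losses, patience=3):
    """Flag overfitting once validation loss has risen for `patience`
    consecutive epochs while training loss kept falling -- the classic
    divergence of the two learning curves."""
    rises = 0
    for t in range(1, len(val_losses)):
        if val_losses[t] > val_losses[t - 1] and train_losses[t] < train_losses[t - 1]:
            rises += 1
            if rises >= patience:
                return True
        else:
            rises = 0  # curves moved together again; reset the counter
    return False

def inject_noise(x, sigma=0.1, rng=None):
    """Additive Gaussian input noise, one simple form of the
    noise-injection regularizer discussed in the article."""
    rng = np.random.default_rng(0) if rng is None else rng
    return x + rng.normal(0.0, sigma, size=x.shape)
```

In practice a detector like `overfitting_alarm` would run inside the training loop (as in early-stopping callbacks), and `inject_noise` would perturb each mini-batch before the forward pass; the noise scale `sigma` is itself a hyperparameter that AutoML-style search could tune.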


Published
2024-03-27
How to Cite
Penzenyk, A. (2024). Automated Detection and Prevention of Overfitting in Neural Networks. COMPUTER-INTEGRATED TECHNOLOGIES: EDUCATION, SCIENCE, PRODUCTION, (54), 36-42. https://doi.org/10.36910/6775-2524-0560-2024-54-04