International Journal of Neutrosophic Science (IJNS)
ISSN: 2690-6805, 2692-6148
Journal DOI: 10.54216/IJNS
Journal page: https://www.americaspg.com/journals/show/4108
2020

Title: Solving Unconstrained Minimization Problems and Training Neural Networks via Enhanced Conjugate Gradient Algorithms

Authors:
Talal Talal, College of Computer Science and Mathematics, University of Mosul, Mosul, Iraq
Issam A. R. Moghrabi, Department of Information Systems and Technology, Kuwait Technical College, Kuwait City, Kuwait
Talal M. Alharbi, Department of Mathematics, College of Science, Buraydah, Qassim University, Saudi Arabia
Alaa Luqman Ibrahim, Department of Mathematics, College of Science, University of Zakho, Zakho, Kurdistan Region, Iraq

Abstract: Artificial neural networks have become a cornerstone of modern artificial intelligence, powering progress in a wide range of fields. Their effective training depends heavily on techniques from unconstrained optimization, with gradient-based iterative methods being especially common. This study presents a new variant of the conjugate gradient method tailored specifically for unconstrained optimization tasks. The method is designed to satisfy the sufficient descent condition and is shown to be globally convergent. Comprehensive numerical testing highlights its advantages over traditional conjugate gradient techniques, showing improved performance in iteration counts, function evaluations, and overall computational time across a variety of problem sizes. The new approach has also been applied successfully to neural network training, where experimental results show faster convergence and better accuracy, with fewer training iterations and reduced mean squared error compared to standard methods. Overall, this work offers a meaningful contribution to optimization strategies for neural network training, demonstrating the method's potential to tackle the complex optimization problems often encountered in machine learning.
Year: 2026
Pages: 220-233
Article DOI: 10.54216/IJNS.270120
Article page: https://www.americaspg.com/articleinfo/21/show/4108
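The abstract describes a conjugate gradient method that satisfies the sufficient descent condition for unconstrained minimization. The paper's specific update formula is not given here, so the following is only an illustrative sketch of a generic nonlinear conjugate gradient scheme (a Polak-Ribiere+ choice of beta with a backtracking Armijo line search and a steepest-descent restart safeguard), not the authors' enhanced algorithm:

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear conjugate gradient method (illustrative only).

    Uses the Polak-Ribiere+ beta with a backtracking Armijo line search;
    the paper's own beta formula is not specified in the abstract.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ beta, truncated at zero (automatic restart)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        # Fall back to steepest descent if d fails to be a descent
        # direction, keeping the sufficient descent property in practice
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard start point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
x_star = conjugate_gradient(f, grad, np.array([-1.2, 1.0]))
```

The same loop structure applies when `f` is a neural network's training loss and `grad` comes from backpropagation, which is how conjugate gradient variants are typically used as training algorithms.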