An Improved Hestenes-Stiefel Conjugate Gradient Method and Its Application
DOI: https://doi.org/10.11113/matematika.v41.n1.1586

Abstract
The conjugate gradient method is an important method in numerical optimization: it uses gradient information to construct conjugate search directions and thereby minimizes the objective function quickly. This paper proposes a new conjugate gradient method (referred to as the NMHS method) that has a simple form and is easy to implement, and proves that the algorithm satisfies the sufficient descent condition and converges globally under the strong Wolfe-Powell line search. To assess its numerical performance, experiments were conducted, and the results show that the NMHS method is significantly better than existing methods in convergence behavior, the number of iterations required for convergence, and CPU time consumed. Finally, the NMHS method is applied in practice: a regression analysis model is built to predict the admission rate of Chinese master's degree applicants, and its predictions prove more accurate than those of the Least Squares method and the Trend Line method.
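The NMHS update formula itself is not reproduced in this abstract, so for orientation the following is a minimal sketch of the classical Hestenes-Stiefel conjugate gradient method under a strong Wolfe line search (scipy.optimize.line_search enforces the strong Wolfe conditions); an NMHS-style variant would differ only in the beta update. The function names, tolerances, and the small regression demo are illustrative assumptions, not the paper's code or data.

```python
import numpy as np
from scipy.optimize import line_search

def hs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Hestenes-Stiefel conjugate gradient with a strong Wolfe line search.

    A modified method such as NMHS would replace only the beta formula below.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search returns a step length satisfying the strong Wolfe conditions
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:
            d, alpha = -g, 1e-4           # fall back to a small steepest-descent step
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                     # y_k = g_{k+1} - g_k
        beta = (g_new @ y) / (d @ y)      # classical HS: beta_k = g_{k+1}^T y_k / d_k^T y_k
        d = -g_new + beta * d             # new conjugate search direction
        x, g = x_new, g_new
    return x

# Illustrative use on a least-squares regression, echoing the paper's application:
# fit y ~ a + b*t by minimizing the sum of squared residuals (synthetic data).
t = np.arange(10, dtype=float)
y = 2.0 + 0.5 * t + 0.1 * np.random.default_rng(0).standard_normal(10)
A = np.column_stack([np.ones_like(t), t])
f = lambda w: 0.5 * np.sum((A @ w - y) ** 2)
grad = lambda w: A.T @ (A @ w - y)
print(hs_cg(f, grad, np.zeros(2)))  # approximately [2.0, 0.5]
```

The demo casts a regression fit as an unconstrained minimization, which is how a conjugate gradient method can substitute for a closed-form least squares solve in the prediction application described above.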