A New Hybrid Conjugate Gradient Method Based on Secant Equation for Solving Large Scale Unconstrained Optimization Problems
Iranian Journal of Optimization
Article 4, Volume 12, Issue 1, September 2020, Pages 33-44; Full Text (568.21 K)
Article Type: Research Paper
Authors
Nasiru Salihu* 1; Mathew Remilekun Odekunle 2; Mohammed Yusuf Waziri 3; Abubakar Sani Halilu 4
1 Department of Mathematics, School of Physical Sciences, Modibbo Adama University of Technology, Yola, Nigeria.
2 Department of Mathematics, School of Physical Sciences, Modibbo Adama University of Technology, Yola, Nigeria.
3 Department of Mathematical Sciences, Faculty of Sciences, Bayero University, Kano, Nigeria.
4 Department of Mathematics and Computer Science, Sule Lamido University, Kafin Hausa, Nigeria.
Abstract
There exists a large variety of conjugate gradient algorithms. In order to take advantage of the attractive features of the Liu and Storey (LS) and Conjugate Descent (CD) conjugate gradient methods, we suggest a hybridization of these methods in which the parameter is computed as a convex combination of $\beta_k^{LS}$ and $\beta_k^{CD}$, where the conjugate gradient (update) parameter is obtained from the secant equation. The algorithm generates descent directions, and when the iterates jam, the direction satisfies the sufficient descent condition. We report numerical results demonstrating the efficiency of our method: the hybrid computational scheme outperforms, or is comparable with, known conjugate gradient algorithms. We also show that our method converges globally under the strong Wolfe line search conditions.
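For orientation, a minimal sketch (not reproduced from the paper) of the two classical update parameters involved and the generic convex-combination form used in such hybrids; the specific rule for the mixing parameter $\theta_k$ derived from the secant equation is the paper's contribution and is not given here:
\[
\beta_k^{LS} = \frac{g_{k+1}^{\top} y_k}{-d_k^{\top} g_k}, \qquad
\beta_k^{CD} = \frac{\lVert g_{k+1} \rVert^{2}}{-d_k^{\top} g_k}, \qquad
\beta_k = (1-\theta_k)\,\beta_k^{LS} + \theta_k\,\beta_k^{CD}, \quad \theta_k \in [0,1],
\]
where $g_k = \nabla f(x_k)$, $y_k = g_{k+1} - g_k$, and the search direction is updated as $d_{k+1} = -g_{k+1} + \beta_k d_k$ with $d_0 = -g_0$.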
Keywords
Unconstrained optimization; conjugate gradient algorithm; large scale optimization problem; secant equation; global convergence
Statistics: Article Views: 889; Full Text Downloads: 572