Acta Mathematica Scientia, 2021, 41(6): 1871-1879

Article


A Modified Three-Term WYL Conjugate Gradient Method

Zhu Zhibin,, Geng Yuanhang,

School of Mathematics and Computing Science, Guilin University of Electronic Technology & Guangxi Colleges and Universities Key Laboratory of Data Analysis and Computation, Guilin, Guangxi 541004

Corresponding author: Geng Yuanhang, E-mail: 18434164700@163.com


Received: 2020-11-6  

Fund supported: the NSFC (61967004, 11901137); the Guangxi Key Laboratory of Automatic Detecting Technology and Instruments (YQ20113, YQ20114); the Guangxi Key Laboratory of Cryptography and Information Security (GCIS201927, GCIS201621); and the Innovation Project of GUET Graduate Education (2021YCXS118)

About the authors

Zhu Zhibin, E-mail: zhuzb@guet.edu.cn

Abstract

The conjugate gradient method is an important class of algorithms for solving large-scale optimization problems, with the advantages of simple computation and fast convergence. In this paper, a modified three-term WYL conjugate gradient method is proposed. The method satisfies the sufficient descent condition without relying on any line search, and it is globally convergent under a modified Armijo line search. Numerical experiments show that the method is effective.

Keywords: Unconstrained optimization; WYL conjugate gradient method; Global convergence; Sufficient descent


Cite this article as:

Zhu Zhibin, Geng Yuanhang. A Modified Three-Term WYL Conjugate Gradient Method. Acta Mathematica Scientia, 2021, 41(6): 1871-1879

1 Introduction

The conjugate gradient method is one of the most effective methods for solving the large-scale unconstrained optimization problem $ \min\{f(x)|x\in {{\Bbb R}}^n\} $. Its iterative scheme is

$ x_{k+1} = x_k+\alpha_k d_k, \qquad d_k = \left\{\begin{array}{ll} -g_k, & k = 0, \\ -g_k+\beta_k d_{k-1}, & k\geq1, \end{array}\right. $

where $ g_k = \nabla f(x_k) $, $ d_k $ is the search direction, $ \alpha_k $ is the step size, and $ \beta_k $ is a parameter controlling the search direction $ d_k $. Different choices of $ \beta_k $ yield different conjugate gradient methods. The classical ones include the Hestenes-Stiefel (HS) method [1], the Fletcher-Reeves (FR) method [2], the Polak-Ribiére-Polyak (PRP) method [3, 4] and the Dai-Yuan (DY) method [5], whose parameters $ \beta_k $ are

$ \beta_k^{HS} = \frac{g_k^T(g_k-g_{k-1})}{d_{k-1}^T(g_k-g_{k-1})}, \quad \beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \quad \beta_k^{PRP} = \frac{g_k^T(g_k-g_{k-1})}{\|g_{k-1}\|^2}, \quad \beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^T(g_k-g_{k-1})}. $

When an exact line search is used and the objective is a convex quadratic, the four methods above are equivalent. With an inexact line search, or for a general objective function, their theoretical and numerical behaviors differ: the FR and DY methods, whose numerators are $ \|g_{k}\|^2 $, have good theoretical properties but comparatively poor numerical performance, while the PRP and HS methods, whose numerators are $ g_k^T(g_k-g_{k-1}) $, perform well numerically but have weaker theoretical guarantees. To obtain conjugate gradient methods that are good in both respects, many researchers have proposed modifications of these classical methods.
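For concreteness, the four classical parameter choices can be written as a short sketch (an illustration, not code from the paper):

```python
import numpy as np

# The four classical choices of beta_k, written as functions of the current
# gradient g_k, the previous gradient g_{k-1}, and the previous direction d_{k-1}.
def beta_fr(g, g_prev, d_prev):
    # Fletcher-Reeves: ||g_k||^2 / ||g_{k-1}||^2
    return (g @ g) / (g_prev @ g_prev)

def beta_prp(g, g_prev, d_prev):
    # Polak-Ribiere-Polyak: g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2
    return g @ (g - g_prev) / (g_prev @ g_prev)

def beta_hs(g, g_prev, d_prev):
    # Hestenes-Stiefel: g_k^T (g_k - g_{k-1}) / d_{k-1}^T (g_k - g_{k-1})
    y = g - g_prev
    return (g @ y) / (d_prev @ y)

def beta_dy(g, g_prev, d_prev):
    # Dai-Yuan: ||g_k||^2 / d_{k-1}^T (g_k - g_{k-1})
    return (g @ g) / (d_prev @ (g - g_prev))
```

Note that FR and DY share the numerator $ \|g_k\|^2 $, while PRP and HS share $ g_k^T(g_k-g_{k-1}) $, which is the split that drives the theory/performance trade-off described above.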

The PRP method performs well numerically because, for general functions, it has an automatic restart property. To address its weaker theory, Powell [6] suggested restricting $ \beta_k $ to be nonnegative. Based on this suggestion, Gilbert and Nocedal [7] considered the following $ PRP^+ $ method

$ \beta_k^{PRP^+} = \max\{\beta_k^{PRP}, 0\}, $

and showed that, under the assumption that the sufficient descent condition holds, the $ PRP^+ $ method is globally convergent for general functions with the Wolfe line search or an exact line search.

To keep the PRP parameter nonnegative, Wei, Yao and Liu [8] proposed a variant of the PRP method, called the WYL (or VPRP) method, with parameter

$ \beta_k^{WYL} = \frac{g_k^T(g_k-\frac{\|g_k\|}{\|g_{k-1}\|}g_{k-1})}{\|g_{k-1}\|^2}. $

The WYL method not only inherits the good numerical behavior of the PRP method but is also globally convergent under the strong Wolfe line search. Later, Huang, Wei and Yao [9] proved that when $ \sigma<\frac{1}{4} $ (where $ \sigma $ is the strong Wolfe line search parameter), the method possesses sufficient descent and global convergence under the strong Wolfe line search. Several authors have further modified the parameter $ \beta_k $ on the basis of the WYL method; see, e.g., [10-14].

Zhang [15] proposed a PRP-based three-term conjugate gradient method, called the MPRP method, with

$ d_k = \left\{\begin{array}{ll} -g_k, & k = 0, \\ -g_k+\beta_k^{PRP} d_{k-1}-\theta_k (g_k-g_{k-1}), & k\geq1, \end{array}\right. \qquad \theta_k = \frac{g_k^Td_{k-1}}{\|g_{k-1}\|^2}, $

where the step size $ \alpha_k $ is determined by a modified Armijo line search: $ \alpha_k = \max\{\rho^j, j = 0, 1, 2, \cdots \} $ satisfying

$ \begin{equation} f(x_k+\alpha_k d_k)\leq f(x_k)-\delta \alpha_k^2\|d_k\|^2, \; \delta \in (0, 1). \end{equation} $

The MPRP method is sufficiently descent and globally convergent. Many authors have modified two-term conjugate gradient methods into three-term ones; see, e.g., [16-20]. The resulting three-term methods improve considerably on their two-term counterparts both numerically and theoretically.
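The modified Armijo rule (1.1) amounts to a simple backtracking loop; the sketch below is illustrative, with a `max_backtracks` safeguard that is an assumption of this sketch rather than part of the rule:

```python
import numpy as np

# Modified Armijo line search (1.1): backtrack over alpha = rho^j,
# j = 0, 1, 2, ..., until f(x + alpha*d) <= f(x) - delta * alpha^2 * ||d||^2.
def modified_armijo(f, x, d, rho=0.5, delta=0.2, max_backtracks=50):
    fx, dd = f(x), d @ d
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx - delta * alpha**2 * dd:
            return alpha
        alpha *= rho
    return alpha  # safeguard: return the last (tiny) trial step
```

Unlike the standard Armijo rule, the decrease required on the right-hand side is $ \delta\alpha^2\|d\|^2 $ rather than $ -\delta\alpha g^Td $, so the condition involves no gradient evaluation at the trial point.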

In [21], a new class of WYL-type three-term conjugate gradient methods, called the MWYL method, was proposed, in which a constant parameter $ c\geq0 $ appears; the method has sufficient descent and is globally convergent under the strong Wolfe line search.

Motivated by this, we propose a new WYL-type three-term conjugate gradient algorithm that employs Zhang's line search (1.1) from [15]. The algorithm is given by

$ \begin{equation} d_k = \left\{\begin{array}{ll} -g_k, & k = 0, \\ { } -g_k+\beta_k^{TWYL} d_{k-1}-\theta_k (g_k-\frac{\|g_k\|}{\|g_{k-1}\|}g_{k-1}), & k\geq1, \end{array}\right. \end{equation} $

$ \begin{equation} \beta_k^{TWYL} = \frac{g_k^T(g_k-\frac{\|g_k\|}{\|g_{k-1}\|}g_{k-1})}{(1+v\frac{|g_k^Tg_{k-1}|}{\|g_k\|\|g_{k-1}\|})\|g_{k-1}\|^2}, \end{equation} $

$ \begin{equation} \theta_k = \frac{g_k^Td_{k-1}}{\|g_{k-1}\|^2}, \end{equation} $

where the step size $ \alpha_k $ satisfies (1.1). When $ g_k^Td_{k-1}>0 $, we set the parameter $ v = 0 $; otherwise $ v>0 $.
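As an illustration, the direction update (1.2)-(1.4) can be sketched as follows; the concrete value used when $ v>0 $ is required (here `v_pos = 0.5`) is an assumption of the sketch, since the text only demands $ v>0 $:

```python
import numpy as np

# Search direction (1.2)-(1.4) of the proposed TWYL method.
def twyl_direction(g, g_prev, d_prev, v_pos=0.5):
    t = np.linalg.norm(g) / np.linalg.norm(g_prev)
    w = g - t * g_prev                            # g_k - (||g_k||/||g_{k-1}||) g_{k-1}
    v = 0.0 if g @ d_prev > 0 else v_pos          # the v-switch stated above
    denom = 1.0 + v * abs(g @ g_prev) / (np.linalg.norm(g) * np.linalg.norm(g_prev))
    beta = (g @ w) / (denom * (g_prev @ g_prev))  # beta_k^{TWYL}, (1.3)
    theta = (g @ d_prev) / (g_prev @ g_prev)      # theta_k, (1.4)
    return -g + beta * d_prev - theta * w         # d_k, (1.2)
```

When $ v = 0 $, the $ \beta $- and $ \theta $-terms cancel exactly in $ g_k^Td_k $, so $ g_k^Td_k = -\|g_k\|^2 $; this is the mechanism behind Lemma 2.1.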

The rest of this paper is organized as follows. Section 2 presents the algorithm and proves that it is sufficiently descent; Section 3 proves that the algorithm is globally convergent under the modified Armijo line search; Section 4 reports numerical experiments and compares the algorithm with four related classical methods.

2 The Algorithm

The new method proposed in this paper is described as follows:

Step 0   Choose an initial point $ x_0 $ and parameters $ \rho, \delta\in(0, 1) $, $ v\geq0 $, $ \varepsilon>0 $; set $ k = 0 $;

Step 1  Compute $ g_k = \nabla f(x_k) $. If $ \|g_k\|\leq \varepsilon $, stop; otherwise compute $ d_k $ by (1.2)–(1.4);

Step 2  Compute the step size $ \alpha_k $ by (1.1);

Step 3  Set $ x_{k+1} = x_k+\alpha_k d_k $, let $ k: = k+1 $, and return to Step 1.

For convenience, this method is referred to as Algorithm 1. In addition, the following lemma holds for the direction $ d_k $.
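Putting the steps together, Algorithm 1 can be sketched as a self-contained demo; the quadratic test function and the value `v_pos = 0.5` are illustrative assumptions (not one of the paper's benchmarks), `rho = 0.5` and `delta = 0.2` match the Armijo parameters used in Section 4, and the backtracking cap is a practical safeguard rather than part of the algorithm:

```python
import numpy as np

def algorithm1(f, grad, x0, rho=0.5, delta=0.2, v_pos=0.5, eps=1e-6, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                        # Step 0: d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:              # Step 1: stopping test
            break
        alpha, accepted = 1.0, False              # Step 2: line search (1.1)
        for _ in range(60):
            if f(x + alpha * d) <= f(x) - delta * alpha**2 * (d @ d):
                accepted = True
                break
            alpha *= rho
        if not accepted:                          # safeguard: line search failed
            break
        x = x + alpha * d                         # Step 3
        g_new = grad(x)
        if np.linalg.norm(g_new) <= eps:
            break
        t = np.linalg.norm(g_new) / np.linalg.norm(g)
        w = g_new - t * g                         # g_k - (||g_k||/||g_{k-1}||) g_{k-1}
        v = 0.0 if g_new @ d > 0 else v_pos       # v-switch as stated in the text
        denom = 1.0 + v * abs(g_new @ g) / (np.linalg.norm(g_new) * np.linalg.norm(g))
        beta = (g_new @ w) / (denom * (g @ g))    # (1.3)
        theta = (g_new @ d) / (g @ g)             # (1.4)
        d = -g_new + beta * d - theta * w         # (1.2)
        g = g_new
    return x

# Illustrative strictly convex quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 10.0, 100.0])
x_min = algorithm1(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, [1.0, 1.0, 1.0])
```

Every step accepted by (1.1) decreases $ f $ by at least $ \delta\alpha_k^2\|d_k\|^2 $, so the iterates generated by this sketch are monotonically improving.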

Lemma 2.1  If $ d_k $ is defined by (1.2)–(1.4), then $ d_k $ satisfies the descent condition $ g_k^Td_k\leq -\|g_k\|^2 $.

  当$ k = 0 $时, 容易得$ g_0^Td_0 = -\|g_0\|^2 $; 当$ k>0 $时, 在(1.2)式两边同时与$ g_k $做内积得

由算法1的定义知, 当$ g_k^Td_{k-1}>0 $$ v = 0 $, 故

$ \begin{equation} g_k^Td_k = -\|g_k\|^2; \end{equation} $

otherwise $ v>0 $, and

$ \begin{equation} g_k^Td_k\leq-\|g_k\|^2. \end{equation} $

This proves Lemma 2.1 and shows that Algorithm 1 satisfies the sufficient descent condition without relying on any line search.

3 Convergence Analysis

This section analyzes the convergence of the algorithm. We begin with the relevant auxiliary lemmas.

Lemma 3.1  If $ f $ is bounded below, then $ \sum\limits_{k = 0}^\infty \alpha_k^2 \|d_k\|^2<\infty $, and hence $ \lim\limits_{k\rightarrow \infty}\alpha_kd_k = 0 $.

  由引理2.1看出算法1的搜索方向是充分下降的.此外, 从(1.1)式可以看出函数值序列$ \{f(x_k)\} $是递减的.因为$ f $是下有界的, 从式(1.1)中可以得到

因此有

$ \begin{equation} \lim\limits_{k\rightarrow \infty}\alpha_k\|d_k\| = 0. \end{equation} $

This completes the proof of Lemma 3.1.

To prove the global convergence of Algorithm 1, we first make the following necessary assumptions $ \mathrm{H} $:

($ \mathrm{H} $1) $ f(x) $ is bounded on the level set $ \Omega = \{x\in {{\Bbb R}}^n| f(x)\leq f(x_0)\} $.

($ \mathrm{H} $2) $ f(x) $ is continuously differentiable in a neighborhood $ N $ of the level set $ \Omega $, and its gradient $ g(x) = \nabla f(x) $ satisfies the Lipschitz condition, i.e., there exists a constant $ L>0 $ such that

$ \begin{equation} \|g(x)-g(y)\|\leq L\|x-y\|, \; \; \forall \; x, y\in N. \end{equation} $

Since $ \{f(x_k)\} $ is decreasing, the sequence $ \{x_k\} $ generated by Algorithm 1 is contained in $ \Omega $. Moreover, Assumption H implies that there exists a constant $ \gamma_1>0 $ such that

$ \begin{equation} \|g(x)\|\leq \gamma_1, \; \; \; \; \; \forall x \in \Omega. \end{equation} $

Lemma 3.2  If there exists $ \varepsilon >0 $ such that for all $ k $

$ \begin{equation} \|g_k\|>\varepsilon, \end{equation} $

then there exists a constant $ M>0 $ such that for all $ k $

$ \begin{equation} \|d_k\|\leq M. \end{equation} $

  根据$ d_k $的公式、Cauchy-Schwarz不等式以及公式(3.2)–(3.4)有

因为$ \lim\limits_{k\rightarrow \infty}\alpha_k \|d_k\| = 0 $, 所以存在一个常数$ \gamma \in (0, 1) $, 并且对于一个正整数$ K $, 当$ k>K $时有下面不等式成立

因此, 对于任意的$ k>K $, 有

故(3.5)式成立.证毕.

We now establish the global convergence of Algorithm 1.

Theorem 3.1  Under Assumption $ \mathrm{H} $, we have

$ \begin{equation} \liminf\limits_{k\rightarrow \infty} \|g_k\| = 0. \end{equation} $

  反证法.假设结论不成立, 因此存在一个常数$ \varepsilon >0 $, 使得

$ \begin{equation} \|g_k\|>\varepsilon, \; \; \; \forall k. \end{equation} $

If $ \liminf\limits_{k\rightarrow \infty} \alpha_k>0 $, then (2.1), (2.2) and (3.1) give $ \liminf\limits_{k\rightarrow \infty} \|g_k\| = 0 $, contradicting (3.7). If $ \liminf\limits_{k\rightarrow \infty} \alpha_k = 0 $, there exists an infinite index set $ N $ such that

$ \lim\limits_{k\in N, k\rightarrow \infty}\alpha_k = 0. $

By Step 2 of Algorithm 1, when $ k\in N $ is sufficiently large, $ \rho^{-1}\alpha_k $ does not satisfy (1.1), i.e.,

$ \begin{equation} f(x_k+\rho^{-1}\alpha_kd_k)-f(x_k)>-\delta \rho^{-2}\alpha_k^2\|d_k\|^2. \end{equation} $

By the mean value theorem, Lemma 3.2, (2.1), (2.2) and (3.2), there exists $ h_k\in(0, 1) $ such that

$ \begin{eqnarray*} f(x_k+\rho^{-1}\alpha_kd_k)-f(x_k) & = &\rho^{-1}\alpha_k g(x_k+h_k\rho^{-1}\alpha_kd_k)^Td_k \\ &\leq& \rho^{-1}\alpha_k g_k^Td_k+L\rho^{-2}\alpha_k^2\|d_k\|^2 \\ &\leq& -\rho^{-1}\alpha_k\|g_k\|^2+L\rho^{-2}\alpha_k^2\|d_k\|^2. \end{eqnarray*} $

Substituting this inequality into (3.8), for all sufficiently large $ k\in N $ we obtain

$ \|g_k\|^2\leq (L+\delta)\rho^{-1}\alpha_k\|d_k\|^2. $

Since $ \{d_k\} $ is bounded and $ \liminf\limits_{k\rightarrow \infty}\alpha_k = 0 $, it follows that $ \lim\limits_{k\in N, k\rightarrow \infty}\|g_k\| = 0 $, a contradiction. Hence the theorem holds.

4 Numerical Experiments

This section reports the numerical results. Twenty-nine test functions were selected from [22], on which the numerical performance of the proposed TWYL method is compared with the MPRP, WYL and MWYL methods referenced above and with the classical HZ method [23] of the same type. The MPRP and TWYL methods use the modified Armijo line search (1.1); the WYL and MWYL methods use the strong Wolfe line search; the HZ method uses a modified Wolfe line search. The test environment is a Windows 10 system with an Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz and 16.0GB of RAM. The strong Wolfe line search parameters are $ \rho = 0.45 $, $ \delta = 0.0002 $, $ \sigma = 0.4 $; the Armijo line search parameters are $ \rho = 0.5 $, $ \delta = 0.2 $. The algorithm terminates when either of the following holds:

(1) $ \|g_k\|\leq\varepsilon(1+f(x_k)) $, ($ \varepsilon = 10^{-6} $);

(2) $ k>10000. $

Table 1 lists the numbers and names of the 29 test functions.

Table 1   Numbers and names of the test functions

1  Extended Quadratic Penalty QP1 | 2  Extended Penalty | 3  Extended Wood
4  Diagonal 1 | 5  Perturbed Quadratic | 6  Diagonal 3
7  Generalized Tridiagonal 1 | 8  Diagonal 4 | 9  Diagonal 5
10 Extended Himmelblau | 11 Extended PSC1 | 12 Extended BD1 (Block Diagonal)
13 Quadratic QF1 | 14 Quadratic QF2 | 15 Extended quadratic exponential EP1
16 Extended Tridiagonal 2 | 17 QUARTC (CUTE) | 18 Partial Perturbed Quadratic
19 Almost Perturbed Quadratic | 20 Staircase 1 | 21 Diagonal 7
22 Diagonal 8 | 23 Diagonal 9 | 24 Generalized Quartic
25 Full Hessian FH3 | 26 SINCOS | 27 COSINE (CUTE)
28 Raydan 1 | 29 Hager



Next, using the performance-profile technique of Dolan and Moré [24], Figures 1-4 plot performance profiles for the data. The four figures compare the MPRP, WYL, MWYL, TWYL and HZ methods with respect to the number of iterations, the number of function evaluations, the number of gradient evaluations, and CPU time. They show clearly that the proposed TWYL method outperforms the MPRP, WYL, MWYL and HZ methods in all four respects; the MPRP and MWYL methods are close to each other in iteration count and CPU time.
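For reference, the quantity plotted in a Dolan-Moré performance profile can be computed as in the following sketch (an illustration, not the authors' code); using `np.inf` to mark the runs that Table 2 reports as "-" is an assumed convention of this sketch:

```python
import numpy as np

# T[p, s] holds solver s's cost (e.g. iteration count) on problem p, with
# np.inf for a failed run. rho_s(tau) is the fraction of problems that
# solver s handles within a factor tau of the best solver on that problem.
def performance_profile(T, taus):
    best = T.min(axis=1, keepdims=True)           # per-problem best cost
    ratios = T / best                             # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau) for s in range(T.shape[1])]
                     for tau in taus])            # shape (len(taus), n_solvers)
```

The value at $ \tau = 1 $ is the fraction of problems on which a solver is the best, and the limit as $ \tau $ grows is the fraction of problems it solves at all, which is why higher curves in Figures 1-4 indicate better methods.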

Figure 1   Number of iterations

Figure 2   Number of function evaluations

Figure 3   Number of gradient evaluations

Figure 4   CPU time


Table 2 lists, for the MPRP, WYL, MWYL, HZ and TWYL methods, four key measures: the number of iterations ($ k $), the number of function evaluations ($ q $), the number of gradient evaluations ($ r $) and the CPU time ($ t $). In the header, N denotes the function number and D its dimension, and the best result among the five methods is shown in bold. The table shows that in most cases the proposed method obtains the best results, with fewer iterations and less CPU time, in agreement with the performance profiles. The numerical results therefore indicate that the new algorithm is effective.

Table 2   Numerical results

N  D | MPRP $ k $/$ q $/$ r $/$ t $ | WYL $ k $/$ q $/$ r $/$ t $ | MWYL $ k $/$ q $/$ r $/$ t $ | HZ $ k $/$ q $/$ r $/$ t $ | TWYL $ k $/$ q $/$ r $/$ t $
11023/24/24/0.004099/1059/3218/0.019113/106/198/0.005320/110/197/0.009916/17/17/0.0019
60021/22/22/0.012073/1151/2228/0.034218/100/181/0.007315/98/178/0.017211/12/12/0.0071
100015/16/16/0.004761/933/1804/0.034818/100/181/0.004913/86/156/0.015412/13/13/0.0042
2528/29/29/0.001629/267/504/0.002934/140/245/0.001825/129/230/0.009219/20/20/0.0012
2030/31/31/0.002491/1606/3120/0.015824/185/345/0.002324/149/271/0.010220/21/21/0.0016
10046/47/47/0.006126/254/481/0.005925/210/394/0.005221/166/308/0.009521/22/22/0.0037
3100264/265/265/0.00515768/83377/16098/0.1955270/2520/4769/0.0075322/3115/5905/0.0416169/170/170/0.0040
500220/221/221/0.01402154/28471/54787/0.2580426/3895/7363/0.0377525/5062/9596/0.0865213/214/214/0.0141
1000264/265/265/0.02987223/96621/186018/1.5034381/3500/6618/0.0566440/4268/8093/0.0871251/252/252/0.0288
41030/31/31/8.4960e-437/320/602/0.001326/144/261/8.5290e-432/145/255/0.007422/23/23/6.8260e-4
10047/48/48/0.0033210/2749/5287/0.025746/342/637/0.004550/341/629/0.014540/41/41/0.0031
100065/66/66/0.050584/968/1851/0.076570/601/1251/0.052193/898/1700/0.072164/65/65/0.0412
51039/40/40/0.001287/1845/3602/0.003537/174/310/7.8290e-444/233/419/0.009331/32/32/7.1240e-4
100128/129/129/0.0026134/1393/2651/0.0046115/851/1586/0.0031131/999/1864/0.0224102/103/103/0.0020
1000368/369/369/0.0483480/7022/13563/0.1182402/4059/7715/0.0697545/5637/10726/0.1131337/338/338/0.0465
61033/34/34/0.001428/205/381/0.001739/159/278/0.001628/130/229/0.006320/21/21/0.0012
10065/66/66/0.005869/603/1136/0.009541/272/502/0.005168/470/869/0.020645/46/46/0.0044
100080/81/81/0.066786/1026/1965/0.116966/615/1136/0.070478/756/1431/0.074471/72/72/0.0616
71053/54/54/0.003232/303/573/0.003934/195/355/0.003331/190/346/0.007330/31/31/0.0020
10040/41/41/0.0062400/7132/13863/0.438131/184/336/0.006927/164/298/0.007823/24/24/0.0034
100027/28/28/0.010430/329/627/0.029242/241/439/0.025023/140/254/0.018819/20/20/0.0078
1000023/24/24/0.0718275/5356/10436/4.755626/150/273/0.135718/111/201/0.088914/15/15/0.0452
2000023/24/24/0.168428/310/591/0.534526/148/269/0.235916/97/175/0.137112/13/13/0.0840
81038/39/39/8.1400e-41171/15199/29226/0.024974/470/865/0.001350/359/665/0.008251/52/52/7.4680e-4
50042/43/43/0.00211315/16667/32018/0.081863/398/732/0.002454/389/721/0.006257/58/58/0.0018
100039/40/40/0.00331341/16869/32396/0.218870/442/813/0.006482/584/1083/0.011762/63/63/0.0048
9103/4/4/0.00113/7/10/0.00115/11/16/9.2270e-47/22/34/0.00163/4/4/8.9320e-4
1003/4/4/0.00133/7/10/0.00105/11/16/0.00136/19/29/0.00213/4/4/0.0010
101048/49/49/0.001742/665/1287/0.002954/363/671/0.002022/165/305/0.007723/24/24/0.0012
10052/53/53/0.002046/728/1409/0.003757/383/708/0.002631/230/426/0.004523/24/24/0.0013
100056/57/57/0.005448/758/1467/0.013458/390/721/0.007427/199/368/0.005425/26/26/0.0030
1000056/57/57/0.057952/815/1577/0.183158/390/721/0.107831/232/430/0.092829/30/30/0.0517
111016/17/17/0.001415/58/100/0.001330/139/247/0.001818/79/137/0.003416/17/17/0.0011
10015/16/16/0.001615/58/100/0.001429/134/238/0.002216/72/125/0.003915/16/16/0.0014
100015/16/16/0.003913/51/88/0.004128/130/231/0.008516/72/125/0.005214/15/15/0.0035
121037/38/38/9.3410e-461/1124/2186/0.003527/119/210/0.001828/145/259/0.004627/28/28/8.0080e-4
10039/40/40/0.001365/1175/2284/0.005127/119/210/0.001230/156/279/0.005228/29/29/9.8960e-4
100039/40/40/0.003869/1223/2376/0.026128/124/219/0.003432/166/297/0.009330/31/31/0.0033
131000376/377/377/0.0308468/5435/10401/0.0547454/4196/7937/0.0429462/4369/8273/0.1338307/308/308/0.0249
100001316/1317/1317/3.15681620/24780/47939/5.62542791/33598/64404/8.22981663/20450/39234/7.60301135/1136/1136/1.6381
200002089/2090/2090/5.73603224/56220/109215/16.21224791/61853/118914/18.76392321/30644/58964/22.25031613/1614/1614/4.8625
141054/55/55/0.0011121/1623/3124/0.003838/219/399/0.001342/257/469/0.007532/33/33/8.5090e-4
600321/322/322/0.02065682/137079/268475/1.1904327/3383/6438/0.0317392/4137/7879/0.0521265/266/266/0.0185
1000396/397/397/0.0399501/7736/14970/0.0992469/5196/9922/0.0701518/5796/11071/0.0852348/349/349/0.0378
151060/61/61/0.001981/1739/3396/0.005613/115/216/9.7820e-423/231/436/0.005317/18/18/8.4180e-4
10055/56/56/0.003073/1587/3100/0.010412/106/199/0.001921/211/398/0.005216/17/17/0.0017
100049/50/50/0.012265/1426/2786/0.052011/97/182/0.004017/171/322/0.007016/17/17/0.0043
16100042/43/43/0.004233/167/300/0.006226/86/145/0.003736/142/245/0.007626/27/27/0.0034
1000036/37/37/0.033715/127/238/0.040412/43/73/0.017614/61/105/0.023315/16/16/0.0210
2000032/33/33/0.058012/76/139/0.051211/40/68/0.027010/47/81/0.031610/11/11/0.0240
10000024/25/25/0.16149/58/106/0.13099/33/56/0.08117/36/62/0.07569/10/10/0.0670
1710001/2/2/0.01491000/20044/30088/9.36731000/20001/30002/9.2210-/-/-/-1/2/2/0.0068
181035/36/36/0.003393/2021/3948/0.029127/136/244/0.002926/139/249/0.006624/25/25/0.0027
10075/76/76/0.035980/802/1523/0.090872/540/1007/0.059486/673/1257/0.140872/73/73/0.0319
1000334/335/335/51.9645348/5448/10547/98.1685389/4631/8872/78.6427491/5943/11392/90.9052305/306/306/32.5759
191036/37/37/8.8550e-489/1945/3800/0.008133/157/280/9.2980e-443/226/406/0.005031/32/32/7.7060e-4
100118/119/119/0.0059142/1620/3097/0.0110116/855/1593/0.0068128/965/1799/0.0209104/105/105/0.0051
1000365/366/366/0.0464512/7484/14455/0.1206400/4038/7675/0.0666469/4841/9210/0.0901325/326/326/0.0434
100001140/1141/1141/2.28282024/36184/70343/7.22772630/33971/65311/6.65851606/21180/40751/7.74361115/1116/1116/2.3376
2010111/112/112/0.0038645/6679/12712/0.037788/550/1011/0.0041120/795/1467/0.0197104/105/105/0.0036
50581/582/582/0.04487249/75221/143192/0.8142720/7177/13633/0.0812649/6657/12662/0.1175526/527/527/0.0433
1001389/1390/1390/1.18581461/23309/45156/1.96091580/18355/35129/1.72591436/17161/32883/0.44881096/1097/1097/0.6747
21100023/24/24/0.071326/317/607/0.02498/24/39/0.002710/41/69/0.016712/13/13/0.0040
1000030/31/31/0.001620/227/433/1.58107/21/34/0.13969/37/62/0.08549/10/10/0.1825
2000024/25/25/0.517320/226/431/3.27247/21/34/0.28939/37/62/0.12779/10/10/0.3605
2210030/31/31/0.001618/183/347/0.002511/32/54/0.001112/49/83/0.004211/12/12/0.0011
1000024/25/25/0.041812/106/199/0.08259/27/44/0.019710/41/69/0.04768/9/9/0.0189
2000023/24/24/0.071310/83/155/0.12439/27/44/0.037710/41/69/0.05908/9/9/0.0342
2310000103/104/104/0.8917101/1647/3192/1.193992/1200/2307/0.932894/1242/2387/0.836291/92/92/0.7481
2000059/60/60/0.885273/1302/2530/1.803268/895/1721/1.2813-/-/-/-58/59/59/0.8296
10000064/65/65/5.0643108/2333/4557/15.171571/1174/2276/7.5531-/-/-/-54/55/55/4.3003
241000020/21/21/0.0138137/3270/6402/0.578620/63/105/0.013830/122/211/0.060918/19/19/0.0110
1500023/24/24/0.0203137/3261/6384/1.026219/60/100/0.026131/124/214/0.091021/22/22/0.0210
2000022/23/23/0.0324137/3253/6368/1.236119/58/96/0.022831/124/214/0.106318/19/19/0.0199
255036/37/37/0.0019243/5487/10730/0.028217/113/208/0.008146/370/691/0.012718/19/19/0.0014
10037/38/38/0.0200103/2346/4588/0.031323/176/328/0.002530/270/507/0.012525/26/26/0.0095
1000449/450/450/0.204372/1683/3293/0.0947-/-/-/-29/345/658/0.035115/16/16/0.0153
261000014/15/15/0.030112/47/81/0.034228/130/231/0.100715/67/116/0.078812/13/13/0.0321
2000014/15/15/0.054612/47/81/0.069628/129/229/0.152915/67/116/0.103711/12/12/0.0477
10000013/14/14/0.226611/44/76/0.271627/126/224/0.697511/50/86/0.251111/12/12/0.2078
2710046/47/47/0.003555/656/1256/0.010325/253/480/0.0049-/-/-/-19/20/20/0.0020
100023/24/24/0.019773/1332/2590/0.141616/162/307/0.0195-/-/-/-14/15/15/0.0071
1000016/17/17/0.0686106/1118/2129/1.288515/114/212/0.1380-/-/-/-15/16/16/0.0618
281000064/65/65/0.278673/808/1542/0.435048/452/855/0.232073/712/1348/0.421958/59/59/0.2042
5000055/56/56/1.1709159/2313/4466/4.113946/527/1007/1.327786/1025/1961/3.506340/41/41/0.8828
10000045/46/46/2.089045/735/1460/3.700449/605/1160/2.863139/488/934/3.892433/34/34/1.6141
29100030/31/31/0.020925/175/324/0.016728/147/265/0.014126/142/255/0.023722/23/23/0.0099
1000021/22/22/0.107052/774/1495/0.753022/152/281/0.158621/150/276/0.132721/22/22/0.1054
2000020/21/21/0.227422/308/593/0.572719/182/344/0.338021/153/282/0.252719/20/20/0.2101



References

[1] Hestenes M R, Stiefel E. Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand, 1952, 49(6): 409-436. DOI:10.6028/jres.049.044

[2] Fletcher R, Reeves C M. Function minimization by conjugate gradients. Comput J, 1964, 7(2): 149-154. DOI:10.1093/comjnl/7.2.149

[3] Polak E, Ribiere G. Note sur la convergence de méthodes de directions conjuguées. ESAIM-Math Model Num, 1969, 16(3): 35-43

[4] Polyak B T. The conjugate gradient method in extremal problems. USSR Comput Math Math Phys, 1969, 9(4): 94-112. DOI:10.1016/0041-5553(69)90035-4

[5] Dai Y H, Yuan Y. A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optimiz, 1999, 10(1): 177-182. DOI:10.1137/S1052623497318992

[6] Powell M J D. Convergence properties of algorithms for nonlinear optimization. SIAM Rev, 1986, 28(4): 487-500. DOI:10.1137/1028154

[7] Gilbert J C, Nocedal J. Global convergence properties of conjugate gradient methods for optimization. SIAM J Optimiz, 1992, 2(1): 21-42. DOI:10.1137/0802003

[8] Wei Z X, Yao S W, Liu L Y. The convergence properties of some new conjugate gradient methods. Appl Math Comput, 2006, 183(2): 1341-1350

[9] Huang H, Wei Z X, Yao S W. The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search. Appl Math Comput, 2007, 189(2): 1241-1245

[10] Jiang X Z, Jian J B. Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization. Nonlinear Dyn, 2014, 77(1/2): 387-397

[11] Zhang L, Jian S Y. Further studies on the Wei-Yao-Liu nonlinear conjugate gradient method. Appl Math Comput, 2013, 219(14): 7616-7621

[12] Lu S, Wei Z X, Mo L L. Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search. Appl Math Comput, 2011, 217(17): 7132-7137

[13] Yao S W, Wei Z X, Huang H. A note about WYL's conjugate gradient method and its application. Appl Math Comput, 2007, 191(2): 381-388

[14] Zhang P, Du X W. A hybrid PRP-WYL conjugate gradient method with the strong Wolfe line search. Journal of Chongqing Normal University, 2020, 37(1): 41-51

[15] Zhang L, Zhou W J, Li D H. A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J Numer Anal, 2006, 26(4): 629-640. DOI:10.1093/imanum/drl016

[16] Li X R. A three-term LS conjugate gradient method. Guangxi Sciences, 2013, 20(4): 348-351

[17] Liu J K, Zhao Y X, Wu X L. Some three-term conjugate gradient methods with the new direction structure. Appl Numer Math, 2020, 150: 433-443. DOI:10.1016/j.apnum.2019.10.011

[18] Amini K, Faramarzi P, Pirfalah N. A modified Hestenes-Stiefel conjugate gradient method with an optimal property. Optim Method Softw, 2019, 34(4): 770-782. DOI:10.1080/10556788.2018.1457150

[19] Wu Y L. A modified three-term PRP conjugate gradient algorithm for optimization models. J Inequal Appl, 2017, 2017(1): 1-14. DOI:10.1186/s13660-016-1272-0

[20] Babaie-Kafaki S, Ghanbari R. Two modified three-term conjugate gradient methods with sufficient descent property. Optim Lett, 2014, 8(8): 2285-2297. DOI:10.1007/s11590-014-0736-8

[21] Dong X L, Li W J. Global convergence of a new Wei-Yao-Liu type conjugate gradient method. Journal of Henan Normal University (Natural Science Edition), 2018, 46(4): 107-112

[22] Andrei N. An unconstrained optimization test functions collection. Adv Model Optim, 2008, 10(1): 147-161

[23] Hager W W, Zhang H C. A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optimiz, 2005, 16(1): 170-192. DOI:10.1137/030601880

[24] Dolan E D, Moré J J. Benchmarking optimization software with performance profiles. Math Program, 2001, 91(2): 201-213
