Acta Mathematica Scientia, 2021, 41(3): 837-847

Article

Improved PRP and HS Conjugate Gradient Methods with the Strong Wolfe Line Search

Ma Guodong

Received: 2020-04-11

Fund supported: the NSF of Guangxi (2018GXNSFAA281099), the NSFC (11771383) and the Research Project of Yulin Normal University (2019YJKY16)

About the author: Ma Guodong, E-mail: mgd2006@163.com

Abstract

The conjugate gradient method is one of the most effective methods for solving large-scale unconstrained optimization problems. By exploiting the second inequality of the strong Wolfe line search, two new conjugate parameters are constructed. Under standard assumptions, it is proved that the improved PRP and HS conjugate gradient methods satisfy the sufficient descent condition for a larger range of the parameter in the strong Wolfe line search and converge globally for unconstrained optimization. Finally, two groups of numerical experiments on the proposed methods and their competitors are carried out; the numerical results and the corresponding performance profiles are reported and show that the proposed methods are promising.

Keywords: Unconstrained optimization; Conjugate gradient method; Strong Wolfe line search; Global convergence


1 Introduction

The conjugate gradient method is one of the most effective methods for solving the large-scale smooth unconstrained optimization problem $ \min\{f(x)\,|\ x\in {{\Bbb R}} ^n\} $. Its iterations take the general form

$ \begin{equation} x_{k+1} = x_k+\alpha_k d_k, \tag{1.1} \end{equation} $

$ \begin{eqnarray} d_k = \left\{ \begin{array} {lll}-g_k, &k = 1, \\ -g_k+\beta_kd_{k-1}, &k\geq2, \end{array}\right. \tag{1.2} \end{eqnarray} $

where $ g_k = \nabla f(x_k) $, $ \alpha_k $ is the step size, $ d_k $ is the search direction, and $ \beta_k $ is the parameter controlling the direction. The step size is usually generated by the standard Wolfe line search, i.e., $ \alpha_k $ satisfies

$ \begin{equation} \left\{\begin{array}{ll} f(x_k+\alpha_k d_k)\leq f(x_k)+\delta\alpha_k g_k^Td_k, \\ g(x_k+\alpha_k d_k)^T d_k\geq\sigma g_k^Td_k, \end{array}\right. \tag{1.3} \end{equation} $

or by the strong Wolfe line search

$ \begin{equation} \left\{\begin{array}{ll} f(x_k+\alpha_k d_k)\leq f(x_k)+\delta\alpha_k g_k^Td_k, \\ |g(x_k+\alpha_k d_k)^T d_k|\leq\sigma |g_k^Td_k|, \end{array}\right. \tag{1.4} \end{equation} $
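A step size satisfying both inequalities of the strong Wolfe line search can be computed by a simple bracketing-and-bisection scheme. The sketch below is a minimal Python illustration only (the paper's experiments use MATLAB); the function name and the bisection strategy are our own choices, not the line search used in the paper.

```python
import numpy as np

def strong_wolfe(f, grad, x, d, delta=0.01, sigma=0.1, alpha=1.0, max_iter=100):
    """Bracketing/bisection search for a step size satisfying the strong
    Wolfe conditions; assumes d is a descent direction (grad(x) @ d < 0)."""
    fx, gxd = f(x), grad(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        slope = grad(x + alpha * d) @ d
        if f(x + alpha * d) > fx + delta * alpha * gxd:
            hi = alpha                 # Armijo (first) inequality fails: step too long
        elif abs(slope) > -sigma * gxd:
            if slope < 0:
                lo = alpha             # slope still too negative: step too short
            else:
                hi = alpha             # slope too positive: step too long
        else:
            return alpha               # both strong Wolfe inequalities hold
        alpha = 2.0 * alpha if np.isinf(hi) else 0.5 * (lo + hi)
    return alpha                       # best effort after max_iter bisections
```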

where the parameters $ \delta $ and $ \sigma $ satisfy $ 0 < \delta < \sigma < 1 $. Different choices of $ \beta_k $ yield different conjugate gradient methods; the best-known formulas for $ \beta_k $ are

$ \beta_k^{\rm HS} = \frac{g_{k}^{T}(g_{k}-g_{k-1})}{d_{k-1}^T(g_k-g_{k-1})}, \quad \beta_k^{\rm FR} = \frac{\|g_{k}\|^{2}}{\|g_{k-1}\|^{2}}, \quad \beta_k^{\rm PRP} = \frac{g_{k}^{T}(g_{k}-g_{k-1})}{\|g_{k-1}\|^{2}}, \quad \beta_k^{\rm DY} = \frac{\|g_{k}\|^{2}}{d_{k-1}^T(g_k-g_{k-1})}. $

The conjugate gradient methods corresponding to these four formulas are called the HS method [1], the FR method [2], the PRP method [3-4] and the DY method [5], respectively. The FR and DY methods have good convergence properties, whereas the HS and PRP methods perform well numerically. Hence, in the search for methods combining sound convergence theory with good numerical behavior, conjugate gradient methods built on the above parameters have been studied extensively [6-14]. To keep the PRP formula nonnegative, [6] proposed the following variant of the PRP formula, denoted the WYL parameter

$ \begin{eqnarray} \beta_k^{\rm WYL} = \frac{g_{k}^{T}(g_{k}-\frac{\|g_k\|}{\|g_{k-1}\|}g_{k-1})}{\|g_{k-1}\|^{2}}. \tag{1.5} \end{eqnarray} $

The WYL method not only inherits the properties of the PRP method; it was also proved in [7] to be globally convergent under the strong Wolfe line search with $ \sigma\in (0, \frac14) $. Following the idea of the WYL method, [8] improved the HS conjugate gradient method, yielding the YWH method with conjugate parameter

$ \begin{eqnarray} \beta_k^{\rm YWH} = \frac{g_{k}^{T}(g_{k}-\frac{\|g_k\|}{\|g_{k-1}\|}g_{k-1})}{d_{k-1}^T(g_k-g_{k-1})}, \tag{1.6} \end{eqnarray} $

which is globally convergent under the strong Wolfe line search with $ \sigma\in (0, \frac13) $. To enlarge the admissible range of the strong Wolfe parameter $ \sigma $ for these two methods, [9] modified the WYL and YWH formulas and obtained the two new formulas

$ \begin{equation} \beta_k^{\rm NPRP} = \frac{\|g_{k}\|^{2}-\frac{\|g_k\|}{\|g_{k-1}\|}|g_{k}^{T}g_{k-1}|}{\|g_{k-1}\|^{2}}, \ \ \beta_k^{\rm NHS} = \frac{\|g_{k}\|^{2}-\frac{\|g_k\|}{\|g_{k-1}\|}|g_{k}^{T}g_{k-1}|}{d_{k-1}^T(g_k-g_{k-1})}. \tag{1.7} \end{equation} $

Clearly, $ 0\leq\beta_k^{\rm NPRP}\leq \beta_k^{\rm FR} $ and $ 0\leq\beta_k^{\rm NHS}\leq \beta_k^{\rm DY} $; it was proved that the resulting algorithms generate a sufficiently descending search direction at every iteration and converge globally under the strong Wolfe line search with $ \sigma\in(0, \frac12) $. Recently, exploiting the second inequality of the strong Wolfe line search (1.4), [13] proposed improved FR-type and DY-type conjugate gradient methods with the sufficient descent property; the two methods retain the performance of the FR and DY methods while possessing good convergence properties and numerical behavior. Their conjugate parameters are

$ \begin{equation} \beta_k^{\rm IFR} = \frac{\|g_{k}\|^{2}}{\|g_{k-1}\|^{2}}\cdot\frac{|g_{k}^Td_{k-1}|}{-g_{k-1}^Td_{k-1}}, \ \ \beta_k^{\rm IDY} = \frac{\|g_{k}\|^{2}}{d_{k-1}^T(g_k-g_{k-1})}\cdot\frac{|g_{k}^Td_{k-1}|}{-g_{k-1}^Td_{k-1}}. \tag{1.8} \end{equation} $

Following the ideas of [9, 13], and in order to obtain a larger admissible range of the strong Wolfe parameter $ \sigma $, this paper further modifies the two parameter formulas in (1.7) and arrives at the two improved formulas

$ \begin{eqnarray} \beta_k^{\rm IPRP} = \frac{\|g_{k}\|^{2}-\frac{\|g_k\|}{\|g_{k-1}\|}|g_{k}^{T}g_{k-1}|}{\|g_{k-1}\|^{2}} \frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^T d_{k-1}}, \tag{1.9} \end{eqnarray} $

$ \begin{eqnarray} \beta_k^{\rm IHS} = \frac{\|g_{k}\|^{2}-\frac{\|g_k\|}{\|g_{k-1}\|}|g_{k}^{T}g_{k-1}|}{d_{k-1}^T(g_k-g_{k-1})} \frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^T d_{k-1}}. \tag{1.10} \end{eqnarray} $

Clearly, $ 0\leq\beta_k^{\rm IPRP}\leq \sigma\beta_k^{\rm FR} $ and $ 0\leq\beta_k^{\rm IHS}\leq \sigma\beta_k^{\rm DY} $. At every iteration, the two conjugate gradient methods generated by the new formulas produce sufficiently descending search directions, with the step size obtained by the strong Wolfe line search; we prove that the improved methods generated by $ \beta_k^{\rm IPRP} $ and $ \beta_k^{\rm IHS} $ are globally convergent for $ \sigma\in(0, \frac{\sqrt{2}}2) $ and $ \sigma\in(0, 1) $, respectively. Finally, the proposed methods are tested numerically and compared with related methods; the numerical results show that the two proposed methods are effective.
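For concreteness, the two parameters (1.9) and (1.10) can be computed directly from $ g_k $, $ g_{k-1} $ and $ d_{k-1} $. A minimal sketch (the function names are ours; NumPy is assumed):

```python
import numpy as np

def beta_iprp(g, g_prev, d_prev):
    # Numerator ||g_k||^2 - (||g_k||/||g_{k-1}||)|g_k^T g_{k-1}|
    # equals ||g_k||^2 (1 - |cos(theta_k)|) >= 0 by Cauchy-Schwarz.
    num = g @ g - (np.linalg.norm(g) / np.linalg.norm(g_prev)) * abs(g @ g_prev)
    return (num / (g_prev @ g_prev)) * (abs(g @ d_prev) / (-(g_prev @ d_prev)))

def beta_ihs(g, g_prev, d_prev):
    num = g @ g - (np.linalg.norm(g) / np.linalg.norm(g_prev)) * abs(g @ g_prev)
    return (num / (d_prev @ (g - g_prev))) * (abs(g @ d_prev) / (-(g_prev @ d_prev)))
```

Both values are nonnegative whenever $ d_{k-1} $ is a descent direction ($ g_{k-1}^Td_{k-1} < 0 $) and, for the IHS formula, $ d_{k-1}^T(g_k-g_{k-1}) > 0 $, which the strong Wolfe line search guarantees.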

2 The Algorithm

Based on formulas (1.9) and (1.10), the framework of our algorithm is as follows.

Initialization. Choose an initial point $ {x_1}\in{{{\Bbb R}}^n} $, a tolerance $ \varepsilon > 0 $ and parameters $ 0 < \delta < \sigma < 1 $; set $ d_1 = -g_1, k = 1 $.

Step 1   If $ {\|g_k\|} < \varepsilon $, stop. Otherwise, go to Step 2.

Step 2   Generate the step size $ \alpha_k $ by the strong Wolfe line search, i.e., $ \alpha_k $ satisfies the inequalities (1.4).

Step 3   Set $ {x_{k+1} = {x_k}+{\alpha_k}{d_k}} $, compute the gradient $ g_{k+1}: = g(x_{k+1}) $, and compute the parameter $ \beta_{k+1} $ by (1.9) or (1.10), i.e., $ \beta_{k+1}: = \beta_{k+1}^{\rm IPRP} $ or $ \beta_{k+1}: = \beta_{k+1}^{\rm IHS} $.

Step 4   Set $ d_{k+1} = -g_{k+1}+\beta_{k+1}d_k $ and $ k: = k+1 $; return to Step 1.

For ease of reference, the new algorithms generated by the parameter formulas $ \beta_k^{\rm IPRP} $ and $ \beta_k^{\rm IHS} $ are called the IPRP method and the IHS method, respectively.
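As an illustration only (not the paper's MATLAB code), the framework above can be sketched in Python with the IPRP parameter, delegating Step 2 to `scipy.optimize.line_search`, which enforces the strong Wolfe conditions; the tiny fallback step and the function names are our own assumptions.

```python
import numpy as np
from scipy.optimize import line_search  # enforces the strong Wolfe conditions

def iprp_cg(f, grad, x1, eps=1e-5, delta=0.01, sigma=0.1, max_iter=1000):
    x = np.asarray(x1, dtype=float)
    g = grad(x)
    d = -g                                     # Initialization: d_1 = -g_1
    for k in range(1, max_iter + 1):
        if np.linalg.norm(g) < eps:            # Step 1: stopping test
            return x, k
        alpha = line_search(f, grad, x, d, gfk=g, c1=delta, c2=sigma)[0]  # Step 2
        if alpha is None:                      # line search failed: tiny fallback step
            alpha = 1e-8
        x_new = x + alpha * d                  # Step 3
        g_new = grad(x_new)
        num = g_new @ g_new - (np.linalg.norm(g_new) / np.linalg.norm(g)) * abs(g_new @ g)
        beta = (num / (g @ g)) * (abs(g_new @ d) / (-(g @ d)))   # formula (1.9)
        d = -g_new + beta * d                  # Step 4
        x, g = x_new, g_new
    return x, max_iter
```

Replacing the `beta` line with formula (1.10) gives the corresponding IHS sketch.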

3 Properties and Global Convergence of the IPRP Method

To establish convergence, we first state the two standard assumptions required by the analysis.

(H1) The function $ f(x) $ is bounded below on the level set $ \Lambda = \{x\in {{\Bbb R}}^n\,|\,f(x)\leqslant f(x_1)\} $, where $ x_1 $ is the initial point of the algorithm.

(H2) The function $ f(x) $ is differentiable in some neighborhood $ U $ of the level set $ \Lambda $, and its gradient $ g(x) = \nabla f(x) $ satisfies the Lipschitz condition there, i.e., there exists a constant $ L > 0 $ such that $ \|g(x)-g(y)\|\leq L\|x-y\|$ for all $ x, y\in U $.

The next lemma is the well-known Zoutendijk condition [15], which plays a central role in the global convergence analysis of conjugate gradient methods.

Lemma 3.1   Suppose (H1) and (H2) hold, and consider the general iteration (1.1)–(1.2). If the search direction $ d_k $ satisfies $ g_k^Td_k < 0 $ and the step size $ \alpha_k $ satisfies the standard Wolfe line search, then $ \sum\limits_{k = 1}^\infty\frac{(g_k^Td_k)^2}{{\|d_k\|}^2} < \infty $.

For the unconstrained optimization problem, if there exists a constant $ c > 0 $ such that the iterates $ \{x_k\} $ and the corresponding search directions $ \{d_{k}\} $ satisfy

$ \begin{equation} g_k^Td_k\leq -c\|g_{k}\|^{2}, \quad \forall\ k, \tag{3.1} \end{equation} $

then the search directions $ d_k $ are said to satisfy the sufficient descent condition.

Next we analyze the global convergence of the IPRP method. We first show that all search directions generated by the IPRP method under the strong Wolfe line search are sufficiently descending.

Lemma 3.2  Suppose (H2) holds and let the direction $ d_{k} $ be generated by the IPRP conjugate gradient method. If the parameter $ \sigma $ satisfies $ 0 < \sigma < \frac{\sqrt{2}}{2} $, then for all $ k\geq 1 $

$ \begin{eqnarray} -\frac{1}{1-\sigma^{2}} \leq \frac{g_k^Td_k}{\|g_{k}\|^2}\leq-\frac{1-2\sigma^{2}}{1-\sigma^{2}}. \tag{3.2} \end{eqnarray} $

Proof   For $ k = 1 $ we have $ g_1^Td_1 = -\|g_1\|^2 $, so the conclusion holds trivially. Assume that (3.2) holds for $ k-1 $ ($ k\geq2) $; we show that it also holds for $ k $. Let $ \theta_k $ denote the angle between the gradients $ g_k $ and $ g_{k-1} $; then

$ \begin{equation} \|g_{k}\|^{2}-\frac{\|g_k\|}{\|g_{k-1}\|}|g_{k}^{T}g_{k-1}| = \|g_k\|^2-\|g_k\|^2|\cos\theta_{k}| = \|g_k\|^2(1-|\cos\theta_{k}|). \tag{3.3} \end{equation} $

By (1.2) and (1.9),

$ \begin{eqnarray} \frac{g_{k}^Td_{k}}{\|g_{k}\|^2} & = &-1+\frac{\|g_{k}\|^{2}-\frac{\|g_k\|}{\|g_{k-1}\|}|g_{k}^{T}g_{k-1}|}{\|g_{k-1}\|^{2}} \frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^T d_{k-1}}\frac{g_k^Td_{k-1}}{\|g_{k}\|^2}{}\\ & = &-1+\frac{\|g_k\|^2(1-|\cos\theta_{k}|)}{\|g_{k-1}\|^2}\frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^T d_{k-1}}\frac{g_k^Td_{k-1}}{\|g_{k}\|^2}. \tag{3.4} \end{eqnarray} $

By the second inequality of the strong Wolfe line search (1.4) and the induction hypothesis,

$ \begin{equation} |g^T_kd_{k-1}|\leq \sigma |g_{k-1}^Td_{k-1}| = -\sigma g_{k-1}^Td_{k-1}, \quad \hbox{or} \quad \frac{|g^T_kd_{k-1}|}{-g_{k-1}^Td_{k-1}}\leq \sigma. \tag{3.5} \end{equation} $

Substituting (3.5) into (3.4) yields

$ \begin{equation} -1+\sigma^2 \frac{g_{k-1}^Td_{k-1}}{\|g_{k-1}\|^2}\leq-1-\sigma\frac{|g_k^Td_{k-1}|}{\|g_{k-1}\|^2}\leq \frac{g_k^Td_{k}}{\|g_{k}\|^2} \leq -1+\sigma\frac{|g_k^Td_{k-1}|}{\|g_{k-1}\|^2}\leq -1-\sigma^2\frac{g_{k-1}^Td_{k-1}}{\|g_{k-1}\|^2}. \tag{3.6} \end{equation} $

Applying the induction hypothesis for $ k-1 $ to (3.6), we obtain

$ -\frac{1}{1-\sigma^{2}} = -1-\frac{\sigma^2}{1-\sigma^{2}} \leq \frac{g_k^Td_k}{\|g_{k}\|^2}\leq -1+\frac{\sigma^2}{1-\sigma^{2}} = -\frac{1-2\sigma^{2}}{1-\sigma^{2}}. $

Hence (3.2) holds for all $ k\geq1 $. This completes the proof.

By Lemma 3.2, when $ 0 < \sigma < \frac{\sqrt{2}}{2} $, setting $ c\triangleq\frac{1-2\sigma^{2}}{1-\sigma^{2}} $, every direction generated by the IPRP method satisfies the sufficient descent condition (3.1). We next show that the IPRP method possesses Property $ (\ast) $ of Gilbert and Nocedal [14].

Lemma 3.3  Suppose (H1) and (H2) hold. If $ 0 < \gamma\leq\|g_{k}\|\leq\bar{\gamma} $ for all $ k\geq1 $, then the IPRP method with the strong Wolfe line search satisfies Property $ (\ast) $.

Proof   Combining the strong Wolfe line search with (1.9) and (3.3), and using $ 0 < \gamma\leq\|g_{k}\|\leq\bar{\gamma} $ and $ 0 < \sigma < \frac{\sqrt2}{2} $, one can bound $ \beta_k^{\rm IPRP} $ as follows. Take $ b = \frac{\bar{\gamma}^2}{\gamma^2} > 1 $ and $ \lambda = \frac{\gamma^4}{2\sqrt{2}L\bar{\gamma}^3} $; then $ \beta_k^{\rm IPRP}\leq b $. Further, if $ \|s_{k-1}\| = \|\alpha_kd_k\|\leq\lambda $, then, using the conjugate parameter formula (1.9) together with assumptions (H1) and (H2), we obtain $ \beta_k^{\rm IPRP}\leq\frac{1}{2b} $, so Property $ (\ast) $ holds. This completes the proof.

Based on Lemmas 3.1, 3.2 and 3.3, the global convergence of the IPRP method can be established.

Theorem 3.1  Suppose (H1) and (H2) hold, let the iterates $ \{x_k\} $ be generated by (1.1) and (1.2), with the step size $ \alpha_k $ satisfying the strong Wolfe line search (1.4) and $ \beta_k $ given by (1.9). Then $ \liminf\limits_{k\rightarrow \infty} \|g_k\| = 0 $, i.e., the IPRP method is globally convergent.

4 Properties and Global Convergence of the IHS Method

We now show that the search directions of the IHS method satisfy the sufficient descent condition (3.1) under the strong Wolfe line search.

Lemma 4.1   Suppose (H2) holds and let the search direction $ d_{k} $ be generated by the IHS method. If the parameter satisfies $ 0 < \sigma < 1 $, then

$ \begin{eqnarray} g_k^Td_k\leq-(1-\sigma)\|g_{k}\|^2, \quad \forall \ k\geq1. \tag{4.1} \end{eqnarray} $

Moreover, the relation $ 0\leq\beta_{k}^{\rm IHS}\leq\frac{g_k^Td_k}{g_{k-1}^Td_{k-1}} $ holds.

Proof   We argue by induction. For $ k = 1 $,

$ g_1^Td_1 = -\|g_1\|^2 \leq -(1-\sigma)\|g_1\|^2. $

Assume that $ g_{k-1}^Td_{k-1}\leq-(1-\sigma)\|g_{k-1}\|^2 $ holds for $ k-1 $; we prove the claim for $ k $. By (1.2), (1.10), (3.3) and (3.5),

$ \begin{eqnarray} g_k^Td_k& = &g_k^T(-g_k+\beta_{k}^{\rm IHS}d_{k-1}){}\\ & = &-\|g_k\|^2+\frac{\|g_{k}\|^{2}(1-|\cos \theta_k|)}{d_{k-1}^T(g_k-g_{k-1})}\cdot \frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^T d_{k-1}}g_{k}^{T}d_{k-1}{}\\ & = &-\|g_k\|^2+\|g_{k}\|^{2}(1-|\cos \theta_k|)\frac{d_{k-1}^T(g_k-g_{k-1})+g_{k-1}^{T}d_{k-1}}{d_{k-1}^T(g_k-g_{k-1})}\cdot\frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^{T}d_{k-1}}{}\\ &\leq&-(1-\sigma)\|g_k\|^2+\frac{\|g_{k}\|^{2}(1-|\cos \theta_k|)}{d_{k-1}^T(g_k-g_{k-1})}\cdot \frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^T d_{k-1}}g_{k-1}^{T}d_{k-1}. \tag{4.2} \end{eqnarray} $

Moreover, the induction hypothesis gives $ g_{k-1}^{T}d_{k-1} < 0 $, which, together with the second inequality of the strong Wolfe line search (1.4), implies $ d_{k-1}^T(g_k-g_{k-1}) > 0 $. Hence (4.2) yields

$ g_k^Td_k \leq -(1-\sigma)\|g_k\|^2, $

i.e., (4.1) holds for $ k $. Furthermore, by (3.3) and (4.2),

$ \begin{equation} g_k^Td_k\leq \frac{\|g_{k}\|^{2}(1-|\cos \theta_k|)}{d_{k-1}^T(g_k-g_{k-1})}\cdot \frac{|g_{k}^T d_{k-1}|}{-g_{k-1}^T d_{k-1}}g_{k-1}^{T}d_{k-1} = \beta_{k}^{\rm IHS}g_{k-1}^{T}d_{k-1}. \tag{4.3} \end{equation} $

Dividing both sides of (4.3) by $ g_{k-1}^{T}d_{k-1} $ ($ < 0 $) gives $ 0\leq\beta_{k}^{\rm IHS}\leq\frac{g_k^Td_k}{g_{k-1}^Td_{k-1}} $. This completes the proof.

Finally, we establish the global convergence of the IHS method.

Theorem 4.1   Suppose (H1) and (H2) hold, let the iterates $ \{x_k\} $ be generated by (1.1) and (1.2), with the step size $ \alpha_k $ satisfying the strong Wolfe line search (1.4) and $ \beta_k $ given by (1.10). Then $ \liminf\limits_{k\rightarrow \infty} \|g_k\| = 0 $, i.e., the IHS method is globally convergent.

Proof   We argue by contradiction. If the conclusion fails, then, noting that $ \|g_k\| > 0 $, there exists a constant $ \tilde{\gamma} > 0 $ such that $ \|g_k\|^2\geq\tilde{\gamma} $ for all $ k $. From (1.2) we get $ d_k+g_k = \beta_k^{\rm IHS}d_{k-1} $. Squaring both sides and invoking Lemma 4.1,

$ \begin{eqnarray} \|d_k\|^2& = &(\beta_k^{\rm IHS})^2\|d_{k-1}\|^2-2g_k^Td_k-\|g_k\|^2{}\\ & \leq&\left(\frac{g_k^Td_k}{g_{k-1}^Td_{k-1}}\right)^2\|d_{k-1}\|^2-2g_k^Td_k-\|g_k\|^2. \tag{4.4} \end{eqnarray} $

Dividing both sides of (4.4) by $ (g_k^Td_k)^2 $ gives

$ \begin{eqnarray} \frac{\|d_k\|^2}{(g_k^Td_k)^2}&\leq&\frac{\|d_{k-1}\|^2}{(g_{k-1}^Td_{k-1})^2}-\frac{2}{g_k^Td_k}-\frac{\|g_k\|^2}{(g_k^Td_k)^2} {}\\& = &\frac{\|d_{k-1}\|^2}{(g_{k-1}^Td_{k-1})^2}-\left(\frac{1}{\|g_k\|}+\frac{\|g_k\|}{g_k^Td_k}\right)^2+\frac{1}{\|g_k\|^2}{}\\ & \leq&\frac{\|d_{k-1}\|^2}{(g_{k-1}^Td_{k-1})^2}+\frac{1}{\|g_k\|^2}. \tag{4.5} \end{eqnarray} $

Noting that $ \frac{\|d_1\|^2}{(g_1^Td_1)^2} = \frac{1}{\|g_1\|^2} $, from (4.5) and $ \|g_k\|^2\geq\tilde{\gamma} $ we obtain

$ \begin{eqnarray} \frac{\|d_k\|^2}{(g_k^Td_k)^2}&\leq&\frac{\|d_{k-1}\|^2}{(g_{k-1}^Td_{k-1})^2}+\frac{1}{\|g_k\|^2}{}\\ &\leq&\frac{\|d_{k-2}\|^2}{(g_{k-2}^Td_{k-2})^2}+\frac{1}{\|g_{k-1}\|^2}+\frac{1}{\|g_k\|^2}{}\\ &\leq&\cdots \leq\sum\limits_{i = 1}\limits^k\frac{1}{\|g_i\|^2} \leq \frac{k}{\tilde{\gamma}}, \tag{4.6} \end{eqnarray} $

which implies $ \frac{(g_k^Td_k)^2}{\|d_k\|^2}\geq\frac{\tilde{\gamma}}{k} $ and hence

$ \sum\limits_{k = 1}^\infty\frac{(g_k^Td_k)^2}{\|d_k\|^2}\geq\tilde{\gamma}\sum\limits_{k = 1}^\infty\frac{1}{k} = \infty, $

contradicting the Zoutendijk condition of Lemma 3.1. This completes the proof.

5 Numerical Experiments

To examine the practical performance of the proposed IPRP and IHS methods, both methods were tested on 64 problems taken from the standard unconstrained optimization test collections [16-17], with dimensions ranging from 2 to 50000. For comparison, two groups of experiments were run: the first group consists of the proposed IPRP method together with the FR [2], WYL [6], VPRP [9] and IFR [13] methods; the second group consists of the proposed IHS method together with the DY [5], YWH [8], VHS [9] and IDY [13] methods. The tests were run in MATLAB R2017b on Windows 10, on an Intel(R) Core(TM) i5-8250U CPU at 1.80 GHz with 8 GB RAM. All methods obtain the step size by the strong Wolfe line search (1.4) with parameters $ \delta = 0.01 $ and $ \sigma = 0.1 $, and terminate when either (1) $ \|g_k\| < 10^{-5} $, or (2) the number of iterations exceeds 1000. If criterion (2) occurs, the method is regarded as having failed on that problem, marked "F".

In the experiments, four key indicators were recorded and compared for each method: the number of iterations (Itr), the number of function evaluations (NF), the number of gradient evaluations (NG), and the CPU time in seconds (Tcpu); the 2-norm of the gradient at termination ($ \|g_k\| $) is also listed. The detailed numerical results are given in Tables 1 and 2. To visualize the comparison, the performance profiles of Dolan and Moré [18] are used. Figures 1-4 compare the IPRP, VPRP, WYL, IFR and FR methods in terms of iterations, function evaluations, gradient evaluations and CPU time, respectively; Figures 5-8 do the same for the IHS, VHS, YWH, IDY and DY methods.
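The Dolan-Moré performance profile used in Figures 1-8 is straightforward to compute from a cost matrix (Itr, NF, NG or Tcpu), with failures ("F") encoded as $ +\infty $. A minimal sketch under those assumptions (function names are ours):

```python
import numpy as np

def performance_profile(T):
    """T[i, s]: cost of solver s on problem i (np.inf for failures, 'F').
    Returns the ratio matrix R and a function rho(s, tau) giving the
    fraction of problems on which solver s is within a factor tau of the
    best solver, as in Dolan and More (2002)."""
    best = T.min(axis=1, keepdims=True)    # best cost per problem
    R = T / best                           # performance ratios r_{i,s}
    def rho(s, tau):
        return np.mean(R[:, s] <= tau)
    return R, rho
```

Plotting `rho(s, tau)` against `tau` for each solver `s` reproduces the profile curves.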

Table 1   Numerical results of the first group of methods

No. | Problem (name/dim) | IPRP | VPRP | WYL | IFR | FR
each method column: Itr/NF/NG/Tcpu/||gk|| (F = failure)
1bdexp 103/1/3/0.005/2.27e-483/1/3/0.001/6.25e-493/1/3/0.000/6.25e-493/1/3/0.000/1.06e-48F/F/F/F/F
2bdexp 1003/2/3/0.000/1.33e-823/2/3/0.000/1.24e-823/2/3/0.000/1.24e-823/2/3/0.000/1.23e-824/4/4/0.001/8.22e-10
3bdexp 10003/2/3/0.001/4.45e-1073/2/3/0.001/4.45e-1073/2/3/0.002/4.45e-1073/2/3/0.001/4.41e-1073/2/3/0.001/3.25e-65
4bdexp 100003/2/3/0.009/1.14e-1093/2/3/0.009/1.14e-1093/2/3/0.009/1.14e-1093/2/3/0.009/1.13e-1093/2/3/0.007/5.74e-104
5bdexp 200003/2/3/0.019/1.07e-1093/2/3/0.019/1.07e-1093/2/3/0.018/1.07e-1093/2/3/0.019/1.07e-1093/2/3/0.022/8.65e-107
6exdenschnb 618/333/142/0.027/5.87e-0621/424/199/0.031/2.28e-06F/F/F/F/F92/2469/1261/0.120/7.46e-06F/F/F/F/F
7exdenschnb 818/333/142/0.016/6.77e-0621/424/198/0.020/2.63e-06F/F/F/F/F81/2181/1132/0.103/9.46e-06F/F/F/F/F
8himmelbg 2003/6/7/0.001/7.14e-293/6/7/0.001/7.12e-293/6/7/0.001/7.12e-293/6/7/0.001/7.13e-293/6/7/0.001/2.78e-27
9himmelbg 10003/6/7/0.002/1.60e-283/6/7/0.001/1.59e-283/6/7/0.001/1.59e-283/6/7/0.001/1.59e-283/6/7/0.001/6.23e-27
10himmelbg 20003/6/7/0.002/2.26e-283/6/7/0.002/2.25e-283/6/7/0.002/2.25e-283/6/7/0.002/2.26e-283/6/7/0.002/8.81e-27
11himmelbg 50003/6/7/0.006/3.57e-283/6/7/0.004/3.56e-283/6/7/0.004/3.56e-283/6/7/0.005/3.57e-283/6/7/0.003/1.39e-26
12genquartic 100026/527/272/0.051/8.90e-0649/1193/591/0.087/1.72e-06F/F/F/F/F135/3820/1894/0.262/7.72e-06F/F/F/F/F
13genquartic 200024/465/195/0.041/3.68e-0740/829/402/0.071/3.32e-07F/F/F/F/F63/1527/759/0.152/4.42e-0680/1963/985/0.166/8.99e-06
14genquartic 300028/533/256/0.074/9.56e-0739/819/394/0.110/4.38e-06130/3558/1746/0.438/2.90e-06109/2817/1389/0.343/3.04e-0660/1436/699/0.198/1.29e-06
15biggsb1 547/1099/508/0.060/3.03e-0651/1260/580/0.058/3.30e-0683/2167/1067/0.096/9.24e-06101/2486/1235/0.114/3.20e-06116/3006/1485/0.151/1.63e-06
16biggsb1 1072/1771/868/0.097/3.99e-06110/3065/1354/0.176/6.47e-06321/9146/4502/0.464/6.68e-06165/4546/2275/0.249/7.11e-06260/7383/3699/0.394/7.49e-06
17sinquad 3211/6309/2454/0.327/5.45e-06141/3934/1661/0.167/8.82e-07F/F/F/F/F209/5635/2820/0.292/1.67e-06F/F/F/F/F
18fletcbv3 101/1/1/0.000/5.97e-061/1/1/0.000/5.97e-061/1/1/0.000/5.97e-061/1/1/0.000/5.97e-061/1/1/0.000/5.97e-06
19fletcbv3 20123/2561/1320/0.158/5.54e-06207/3854/1963/0.230/9.67e-06144/3777/1938/0.208/9.27e-06144/3549/1786/0.173/9.18e-06235/6092/3110/0.287/8.70e-06
20nonscomp 5069/1550/742/0.090/5.32e-06350/10319/4581/0.511/8.98e-06332/9490/4661/0.474/8.39e-06F/F/F/F/FF/F/F/F/F
21dixmaana 150015/159/64/0.217/9.53e-0716/263/115/0.249/2.16e-0656/1370/656/1.247/7.33e-0654/1387/654/1.258/6.30e-0663/1542/738/1.411/8.05e-06
22dixmaanb 150011/140/50/0.145/2.91e-0611/140/50/0.120/2.90e-0665/1613/757/1.566/6.91e-0650/1278/623/1.139/5.80e-06F/F/F/F/F
23dixmaanc 150060/1667/793/1.578/7.39e-0621/430/199/0.406/3.25e-06F/F/F/F/F141/4104/2008/3.680/1.69e-0649/1260/611/1.156/8.49e-06
24dixmaand 150063/1757/855/1.670/6.73e-06F/F/F/F/F59/1419/655/1.251/8.17e-0675/1811/888/1.692/2.60e-0654/1303/644/1.186/8.75e-06
25dixon3dq 20521/14809/6503/0.737/4.61e-06403/11099/5010/0.493/9.85e-06F/F/F/F/F411/11043/5496/0.490/8.31e-06F/F/F/F/F
26dqdrtic 100089/2094/957/0.135/7.15e-06123/3254/1469/0.195/7.20e-06337/8603/4223/0.506/8.01e-06145/3860/1933/0.236/7.33e-06F/F/F/F/F
27dqdrtic 3000115/2569/1149/0.318/2.11e-06130/3633/1604/0.394/8.77e-06F/F/F/F/F267/7418/3623/0.882/4.49e-06348/9565/4759/1.167/5.93e-06
28dqrtic 5017/226/94/0.019/2.87e-0623/340/152/0.025/9.97e-0730/571/257/0.039/3.97e-0637/718/334/0.047/3.17e-0642/847/422/0.058/3.39e-06
29dqrtic 10023/292/117/0.024/9.22e-0737/740/350/0.059/4.05e-0639/834/412/0.069/4.43e-06F/F/F/F/F56/1250/607/0.100/2.99e-06
30dqrtic 15023/345/158/0.035/5.20e-0627/473/221/0.047/7.25e-0641/890/411/0.087/7.32e-06F/F/F/F/F55/1239/607/0.124/9.64e-06
31edensch 10038/805/372/0.075/6.94e-0637/747/358/0.066/4.94e-06F/F/F/F/FF/F/F/F/FF/F/F/F/F
32edensch 20039/837/402/0.099/7.72e-0641/894/409/0.103/3.57e-06586/17937/8632/2.164/6.27e-0690/2221/1066/0.257/2.92e-06203/5577/2766/0.698/6.38e-06
33edensch 100053/1251/618/0.559/8.40e-0657/1412/678/0.604/6.60e-06212/6028/2941/2.500/9.89e-06F/F/F/F/F121/3187/1554/1.336/9.97e-06
34fletchcr 1092/2330/1063/0.122/4.66e-06137/3824/1720/0.198/5.62e-06181/4817/2375/0.239/9.63e-06130/3342/1649/0.161/8.36e-06F/F/F/F/F
35fletchcr 10063/1570/739/0.072/4.99e-06126/3389/1561/0.153/6.47e-06200/5116/2552/0.272/2.95e-06136/3723/1839/0.194/8.70e-06F/F/F/F/F
36liarwhd 1085/2053/931/0.126/4.38e-06118/3128/1461/0.153/6.45e-0667/1637/822/0.079/7.80e-06143/3622/1726/0.174/1.86e-06F/F/F/F/F
37liarwhd 1085/2053/931/0.102/4.38e-06118/3128/1461/0.152/6.45e-0667/1637/822/0.079/7.80e-06143/3622/1726/0.175/1.86e-06F/F/F/F/F
38liarwhd 2054/1201/548/0.059/2.51e-0683/2005/946/0.096/3.67e-06F/F/F/F/F112/2678/1341/0.132/6.28e-06F/F/F/F/F
39penalty1 100014/243/93/0.755/2.34e-0614/243/93/0.753/2.34e-0620/440/186/1.402/8.67e-0714/243/93/0.702/2.34e-06F/F/F/F/F
40penalty1 200010/129/39/1.282/5.53e-0610/129/39/1.295/5.53e-0619/421/183/4.409/1.67e-0713/220/81/2.262/7.43e-0714/233/82/2.331/1.04e-06
41power1 30390/11133/4828/0.479/7.51e-06F/F/F/F/FF/F/F/F/F554/14994/7484/0.661/7.38e-06F/F/F/F/F
42power1 50974/27796/12081/1.271/7.98e-06F/F/F/F/FF/F/F/F/F844/22960/11507/1.063/7.28e-06F/F/F/F/F
43quartc 2017/218/97/0.015/1.93e-0625/436/214/0.024/3.39e-0632/624/296/0.034/5.70e-0621/369/172/0.022/9.11e-0621/348/170/0.023/5.01e-06
44quartc 10023/292/117/0.034/9.22e-0737/740/350/0.075/4.05e-0639/834/412/0.068/4.43e-06F/F/F/F/F56/1250/607/0.099/2.99e-06
45tridia 590/2208/1021/0.131/7.47e-0691/2347/1098/0.111/5.04e-06233/6288/3116/0.288/9.52e-06132/3694/1822/0.167/9.45e-06178/4808/2347/0.224/8.08e-06
46raydan2 100011/173/64/0.025/6.93e-0611/173/59/0.015/6.97e-0613/235/98/0.019/5.88e-0611/173/64/0.015/6.93e-06F/F/F/F/F
47raydan2 500015/270/119/0.068/8.63e-0712/204/71/0.052/8.98e-06F/F/F/F/F13/235/81/0.071/7.52e-06F/F/F/F/F
48raydan2 900013/235/89/0.119/3.57e-0614/236/80/0.122/8.17e-07F/F/F/F/F12/204/73/0.105/9.68e-06F/F/F/F/F
49diagonal1 12105/3134/1528/0.152/6.47e-06105/3134/1528/0.138/6.46e-06178/5045/2412/0.254/8.59e-06123/3267/1606/0.175/2.63e-06154/4290/2074/0.188/9.47e-06
50diagonal2 20110/3291/1615/0.167/7.74e-07112/3352/1629/0.152/1.21e-06177/4865/2381/0.234/8.10e-0691/2436/1199/0.109/7.14e-06158/4381/2169/0.195/6.73e-06
51diagonal2 10086/2332/1156/0.125/9.74e-06138/3993/1881/0.226/5.03e-06258/7007/3440/0.367/5.18e-06142/3889/1927/0.228/4.61e-06314/8540/4265/0.457/3.56e-06
52diagonal3 2094/2642/1291/0.153/4.23e-0692/2518/1239/0.173/6.97e-06198/5426/2744/0.254/7.35e-06136/3671/1808/0.169/8.58e-06297/8635/4233/0.389/8.19e-06
53diagonal3 4078/1848/864/0.088/8.04e-0670/1662/798/0.082/2.75e-06330/9184/4561/0.436/8.35e-06F/F/F/F/F166/4212/2153/0.221/7.77e-06
54bv 10001/1/1/0.000/4.99e-061/1/1/0.000/4.99e-061/1/1/0.000/4.99e-061/1/1/0.000/4.99e-061/1/1/0.000/4.99e-06
55bv 100001/1/1/0.000/5.00e-081/1/1/0.000/5.00e-081/1/1/0.000/5.00e-081/1/1/0.000/5.00e-081/1/1/0.000/5.00e-08
56ie 5012/193/77/0.255/6.88e-0611/166/63/0.178/4.87e-0642/1006/516/1.155/2.93e-0630/598/294/0.676/7.83e-0649/1249/599/1.258/9.32e-06
57ie 20014/225/98/3.642/3.11e-0611/166/64/2.631/9.62e-0658/1475/739/24.086/9.66e-0642/961/446/15.469/6.32e-0637/834/424/13.787/4.12e-06
58gauss 38/142/65/0.018/3.85e-068/142/64/0.014/3.78e-0635/912/446/0.082/2.70e-065/51/16/0.004/2.43e-0627/699/363/0.055/3.86e-06
59kowosb 4403/11443/5023/0.678/5.35e-06501/13830/6067/0.789/6.93e-06F/F/F/F/F241/6791/3346/0.387/8.22e-06F/F/F/F/F
60lin 5002/2/2/0.021/9.93e-142/2/2/0.018/9.93e-142/2/2/0.020/9.93e-142/2/2/0.016/9.93e-142/2/2/0.016/9.93e-14
61rosex 50454/13379/5609/0.780/9.65e-06518/14946/6508/1.015/8.36e-06F/F/F/F/FF/F/F/F/FF/F/F/F/F
62trid 20101/2542/1164/0.206/4.07e-06111/2785/1293/0.253/4.46e-06717/20092/10002/1.691/9.73e-06173/4652/2343/0.360/9.95e-06449/12935/6383/1.034/9.91e-06
63vardim 510/137/43/0.011/5.41e-0710/137/43/0.008/5.41e-0713/228/96/0.013/9.27e-0710/137/43/0.008/5.41e-0714/228/88/0.017/5.31e-07
64watson 4106/2780/1261/0.345/9.27e-06190/5428/2430/0.585/9.92e-06F/F/F/F/F138/3688/1822/0.368/9.18e-06F/F/F/F/F



Table 2   Numerical results of the second group of methods

No. | Problem (name/dim) | IHS | VHS | YWH | IDY | DY
each method column: Itr/NF/NG/Tcpu/||gk|| (F = failure)
1bdexp 103/1/3/0.001/2.25e-483/1/3/0.001/5.63e-493/1/3/0.001/5.63e-493/1/3/0.001/9.95e-493/1/3/0.000/8.27e-54
2bdexp 1003/2/3/0.001/1.33e-823/2/3/0.001/1.24e-823/2/3/0.001/1.24e-823/2/3/0.001/1.22e-823/2/3/0.001/2.02e-83
3bdexp 10003/2/3/0.002/4.45e-1073/2/3/0.001/4.45e-1073/2/3/0.001/4.45e-1073/2/3/0.001/4.40e-1073/2/3/0.001/3.49e-107
4bdexp 100003/2/3/0.012/1.14e-1093/2/3/0.009/1.14e-1093/2/3/0.006/1.14e-1093/2/3/0.005/1.13e-1093/2/3/0.005/1.11e-109
5bdexp 200003/2/3/0.015/1.07e-1093/2/3/0.017/1.07e-1093/2/3/0.019/1.07e-1093/2/3/0.020/1.07e-1093/2/3/0.018/1.06e-109
6exdenschnb 627/575/270/0.031/6.48e-0746/1205/572/0.060/5.91e-0685/2160/1080/0.102/9.61e-06F/F/F/F/F45/1082/500/0.049/2.80e-06
7exdenschnb 827/575/270/0.035/7.48e-0746/1205/581/0.070/6.83e-0682/2086/1040/0.099/9.94e-06F/F/F/F/F45/1082/521/0.050/3.23e-06
8himmelbg 2003/6/7/0.001/7.14e-293/6/7/0.001/7.12e-293/6/7/0.001/7.12e-293/6/7/0.001/7.13e-293/6/7/0.001/6.93e-29
9himmelbg 10003/6/7/0.001/1.60e-283/6/7/0.001/1.59e-283/6/7/0.001/1.59e-283/6/7/0.001/1.59e-283/6/7/0.001/1.55e-28
10himmelbg 20003/6/7/0.001/2.26e-283/6/7/0.001/2.25e-283/6/7/0.001/2.25e-283/6/7/0.001/2.25e-283/6/7/0.001/2.19e-28
11himmelbg 50003/6/7/0.003/3.57e-283/6/7/0.003/3.56e-283/6/7/0.003/3.56e-283/6/7/0.004/3.57e-283/6/7/0.004/3.47e-28
12genquartic 20021/337/151/0.021/2.29e-0638/872/413/0.058/5.45e-0692/2398/1194/0.124/6.62e-0663/1571/737/0.087/3.97e-06821/19378/9592/1.049/9.80e-06
13genquartic 50030/599/289/0.033/5.23e-0636/712/351/0.047/4.93e-0665/1501/760/0.112/1.08e-0678/1989/962/0.114/9.02e-06645/17776/8774/1.104/9.29e-06
14genquartic 100026/497/224/0.034/2.70e-0632/582/258/0.038/4.41e-06149/4040/1993/0.282/6.24e-06F/F/F/F/F918/25177/12373/1.652/9.83e-06
15biggsb1 1092/2354/1072/0.137/8.90e-0699/2590/1194/0.130/4.88e-0693/2353/1157/0.118/6.10e-06F/F/F/F/F300/7576/3703/0.363/9.64e-06
16biggsb1 20223/6267/2797/0.328/4.70e-06231/6441/2952/0.310/9.38e-06141/3413/1740/0.158/9.96e-06F/F/F/F/F585/15571/7647/0.817/6.94e-06
17fletcbv3 101/1/1/0.000/5.97e-061/1/1/0.000/5.97e-061/1/1/0.000/5.97e-061/1/1/0.000/5.97e-061/1/1/0.000/5.97e-06
18nonscomp 5071/1796/850/0.118/6.23e-0649/988/455/0.047/4.92e-06F/F/F/F/FF/F/F/F/FF/F/F/F/F
19dixmaana 150015/177/73/0.187/8.55e-0613/184/61/0.156/5.51e-06F/F/F/F/F18/267/110/0.237/2.08e-0625/423/187/0.394/2.57e-06
20dixmaanb 150011/140/44/0.147/2.91e-0611/140/53/0.128/2.90e-0618/316/132/0.297/9.49e-0610/117/35/0.097/7.97e-0613/172/69/0.155/5.31e-07
21dixmaanc 150020/324/142/0.330/1.63e-0624/400/169/0.369/4.89e-07F/F/F/F/F29/579/278/0.526/2.10e-06F/F/F/F/F
22dixmaand 150020/303/137/0.316/9.11e-0618/297/129/0.272/7.43e-0659/1312/656/1.246/7.34e-06F/F/F/F/FF/F/F/F/F
23dixmaanf 1500384/10668/4805/9.600/6.61e-06328/8716/3884/7.877/6.86e-06368/9331/4633/8.518/9.40e-06F/F/F/F/F569/13296/6208/12.037/9.45e-06
24dixmaang 1500462/12937/5721/11.405/6.91e-06349/9644/4160/8.895/9.67e-06251/6303/3170/5.904/7.43e-06F/F/F/F/FF/F/F/F/F
25dixmaanh 1500340/8992/4020/8.074/9.24e-06443/12355/5798/11.331/6.07e-06253/6592/3315/6.061/9.63e-06F/F/F/F/F545/12688/6055/11.547/9.72e-06
26dqdrtic 500112/2969/1315/0.169/4.71e-06808/24158/10585/1.257/6.11e-06234/5935/2961/0.308/8.73e-06F/F/F/F/F332/7838/3899/0.402/8.00e-06
27dqdrtic 1000448/12853/5663/0.829/6.15e-06677/20184/8818/1.191/7.38e-06195/5120/2524/0.303/6.75e-06F/F/F/F/F356/9315/4555/0.572/9.28e-06
28dqdrtic 5000211/5452/2496/0.871/9.23e-06786/23433/10290/3.794/7.65e-06329/9479/4559/1.520/7.36e-06F/F/F/F/F355/9103/4407/1.442/9.96e-06
29edensch 20040/923/459/0.120/3.76e-0637/773/344/0.090/3.17e-0699/2586/1330/0.303/2.67e-06F/F/F/F/FF/F/F/F/F
30edensch 50048/1121/538/0.261/5.01e-0648/1121/507/0.250/9.24e-06F/F/F/F/FF/F/F/F/F646/14842/7011/3.463/9.95e-06
31fletchcr 2096/2510/1184/0.133/3.12e-06122/3213/1517/0.168/4.57e-06100/2485/1271/0.143/5.84e-06F/F/F/F/F720/19851/9712/0.967/9.81e-06
32fletchcr 50108/2811/1292/0.143/9.45e-06111/2937/1382/0.131/8.33e-0670/1702/811/0.104/1.88e-06F/F/F/F/F353/9180/4634/0.447/8.43e-06
33genrose 40000394/11155/4844/14.998/1.15e-06484/13780/6083/18.593/9.07e-06617/16168/8104/22.158/6.56e-06F/F/F/F/FF/F/F/F/F
34genrose 50000239/6311/2783/10.298/6.13e-06307/8594/3772/13.857/3.30e-06726/18954/9375/33.155/4.74e-06F/F/F/F/FF/F/F/F/F
35liarwhd 1075/1884/846/0.104/7.13e-0692/2380/1068/0.121/9.74e-0681/1932/948/0.098/1.58e-06F/F/F/F/F169/4333/2143/0.236/7.07e-06
36liarwhd 2091/2350/1048/0.123/1.68e-06147/3927/1783/0.204/4.36e-0671/1791/870/0.101/1.15e-06F/F/F/F/F158/3925/1941/0.244/2.74e-06
37penalty1 100014/243/93/0.791/2.34e-0614/243/93/0.758/2.34e-0619/406/179/1.333/9.37e-0719/406/177/1.367/9.37e-0714/243/93/0.750/2.34e-06
38penalty1 200010/129/39/1.354/5.53e-0610/129/39/1.301/5.53e-0618/385/157/4.370/3.61e-0818/385/157/4.268/3.61e-0810/129/39/1.302/5.53e-06
39penalty1 500011/152/50/11.744/3.01e-0611/152/50/11.720/3.01e-0610/156/50/12.004/4.03e-0610/156/50/11.701/4.03e-0611/152/51/11.610/3.01e-06
40quartc 2012/131/50/0.012/1.48e-0615/193/77/0.013/6.78e-0725/465/214/0.027/7.65e-0627/554/243/0.031/4.83e-0642/907/438/0.049/8.77e-06
41quartc 10021/268/112/0.034/2.55e-0622/298/107/0.042/5.43e-06F/F/F/F/F40/835/412/0.077/7.91e-06F/F/F/F/F
42tridia 1093/2351/1061/0.138/6.02e-06160/4450/1973/0.215/5.81e-06143/3824/1885/0.193/6.63e-06F/F/F/F/F509/13202/6489/0.659/6.52e-06
43tridia 30243/6517/2942/0.339/7.01e-06244/6428/2942/0.362/8.75e-06229/6007/2989/0.310/9.35e-06F/F/F/F/FF/F/F/F/F
44raydan1 5054/1229/560/0.067/4.85e-0665/1361/679/0.071/1.69e-0691/2358/1158/0.114/5.16e-06F/F/F/F/F685/15740/7582/0.776/9.90e-06
45raydan1 8069/1547/712/0.072/8.33e-0698/2546/1222/0.123/8.71e-06107/2819/1366/0.139/2.74e-06F/F/F/F/F602/13450/6380/0.789/9.59e-06
46raydan2 100011/173/64/0.018/6.93e-0611/173/66/0.014/6.95e-0611/173/65/0.014/6.93e-0611/173/65/0.014/6.95e-0611/173/65/0.027/6.95e-06
47raydan2 400012/204/79/0.055/9.01e-0613/235/109/0.062/4.97e-0615/268/96/0.063/4.95e-0716/301/98/0.067/1.75e-0615/271/92/0.079/7.57e-07
48raydan2 1000013/208/79/0.135/6.37e-0914/236/90/0.139/2.02e-0614/238/92/0.155/5.95e-0715/272/115/0.157/1.62e-0612/204/72/0.126/8.83e-06
49diagonal1 50156/4632/2239/0.252/5.98e-06168/5006/2447/0.314/9.21e-06106/2632/1279/0.134/5.76e-06F/F/F/F/FF/F/F/F/F
50diagonal2 200168/4488/2112/0.294/6.79e-06232/6717/3155/0.423/5.53e-06156/3915/1904/0.264/3.42e-06F/F/F/F/F901/23693/11745/1.642/9.54e-06
51diagonal2 800468/13834/6665/1.469/9.51e-06460/13514/6589/1.395/4.10e-06297/8009/3930/0.846/7.15e-06F/F/F/F/FF/F/F/F/F
52diagonal3 582/2438/1160/0.146/4.57e-0682/2438/1151/0.130/4.58e-0690/2387/1194/0.115/7.49e-0674/1924/927/0.106/6.19e-06322/8250/4083/0.460/9.78e-06
53diagonal3 20107/3017/1401/0.150/8.29e-06117/3398/1613/0.174/7.60e-0691/2344/1142/0.131/8.12e-06F/F/F/F/F542/13718/6653/0.701/9.80e-06
54bv 10001/1/1/0.000/4.99e-061/1/1/0.000/4.99e-061/1/1/0.000/4.99e-061/1/1/0.000/4.99e-061/1/1/0.000/4.99e-06
55bv 100001/1/1/0.000/5.00e-081/1/1/0.000/5.00e-081/1/1/0.000/5.00e-081/1/1/0.000/5.00e-081/1/1/0.000/5.00e-08
56ie 10012/193/85/0.894/9.38e-0611/166/75/0.729/2.85e-0670/1873/917/8.274/4.81e-06F/F/F/F/F71/1838/904/7.827/9.92e-06
57ie 20014/225/89/3.598/2.89e-0611/166/67/2.651/4.02e-06102/2876/1443/48.512/3.88e-0615/207/104/3.495/2.82e-0662/1551/711/25.058/5.32e-06
58singx 10174/4588/2004/0.307/2.52e-06618/18316/7284/1.186/8.99e-06334/8361/4166/0.620/7.59e-06F/F/F/F/F731/19522/9716/1.219/9.55e-06
59singx 150175/4802/2117/0.758/6.53e-06720/21252/8241/3.513/9.69e-06601/15530/7775/2.721/3.25e-06F/F/F/F/F228/5911/2866/1.009/4.27e-06
60beale 2134/3736/1642/0.201/7.28e-06203/5839/2717/0.305/3.43e-06F/F/F/F/FF/F/F/F/FF/F/F/F/F
61froth 2933/26993/11996/1.634/9.76e-06F/F/F/F/FF/F/F/F/FF/F/F/F/F210/6330/3026/0.396/9.06e-06
62lin 1002/2/2/0.007/3.13e-142/2/2/0.004/3.13e-142/2/2/0.003/3.13e-142/2/2/0.002/3.13e-142/2/2/0.002/3.13e-14
63lin 5002/2/2/0.027/9.93e-142/2/2/0.019/9.93e-142/2/2/0.019/9.93e-142/2/2/0.016/9.93e-142/2/2/0.024/9.93e-14
64trid 120963/28247/12176/6.211/6.81e-06F/F/F/F/F461/11845/5899/2.515/6.45e-06F/F/F/F/FF/F/F/F/F



Figure 1   Comparison of the numbers of iterations

Figure 2   Comparison of the numbers of function evaluations

Figure 3   Comparison of the numbers of gradient evaluations

Figure 4   Comparison of CPU times

Figure 5   Comparison of the numbers of iterations

Figure 6   Comparison of the numbers of function evaluations

Figure 7   Comparison of the numbers of gradient evaluations

Figure 8   Comparison of CPU times


Figures 1-8 show even more directly that the IPRP and IHS methods clearly outperform the other eight methods on all four indicators considered. Hence, on the 64 test problems, the proposed methods not only solve more problems but are also the most robust. In summary, the modified algorithms proposed in this paper are effective.

References

[1] Hestenes M R, Stiefel E. Methods of conjugate gradients for solving linear systems. J Res Nat Bur Stand, 1952, 49: 409-436. DOI:10.6028/jres.049.044

[2] Fletcher R, Reeves C M. Function minimization by conjugate gradients. Comput J, 1964, 7: 149-154. DOI:10.1093/comjnl/7.2.149

[3] Polak E, Ribière G. Note sur la convergence de méthodes de directions conjugées. Rev Fr Inform Rech Oper, 1969, 16: 35-43

[4] Polyak B T. The conjugate gradient method in extreme problems. USSR Comput Math Math Phys, 1969, 9: 94-112. DOI:10.1016/0041-5553(69)90035-4

[5] Dai Y H, Yuan Y X. A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim, 1999, 10(1): 177-182. DOI:10.1137/S1052623497318992

[6] Wei Z X, Yao S W, Liu L Y. The convergence properties of some new conjugate gradient methods. Appl Math Comput, 2006, 183: 1341-1350

[7] Huang H, Wei Z X, Yao S W. The proof of the sufficient descent condition of the Wei-Yao-Liu conjugate gradient method under the strong Wolfe-Powell line search. Appl Math Comput, 2007, 189: 1241-1245

[8] Yao S W, Wei Z X, Huang H. A note about WYL's conjugate gradient method and its applications. Appl Math Comput, 2007, 191: 381-388

[9] Zhang L. An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation. Appl Math Comput, 2009, 215: 2269-2274

[10] Dai Y H, Kou C X. A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim, 2013, 23: 296-320. DOI:10.1137/100813026

[11] Li X L, Shi J J, Dong X L. A class of modified non-monotonic spectral conjugate gradient method and applications to non-negative matrix factorization. Acta Math Sci, 2018, 38A(5): 954-962. DOI:10.3969/j.issn.1003-3998.2018.05.012

[12] Jiang X Z, Jian J B, Ma G D. Two conjugate gradient methods with sufficient descent property. Acta Math Sin (Chin Series), 2014, 57(2): 365-372

[13] Jiang X Z, Jian J B. Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search. J Comput Appl Math, 2019, 328: 525-534

[14] Gilbert J C, Nocedal J. Global convergence properties of conjugate gradient methods for optimization. SIAM J Optim, 1992, 2: 21-42. DOI:10.1137/0802003

[15] Zoutendijk G. Nonlinear programming, computational methods//Abadie J, ed. Integer and Nonlinear Programming. Amsterdam: North-Holland, 1970: 37-86

[16] Moré J J, Garbow B S, Hillstrom K E. Testing unconstrained optimization software. ACM Trans Math Softw, 1981, 7: 17-41. DOI:10.1145/355934.355936

[17] Bongartz I, Conn A R, Gould N I M, Toint P L. CUTE: constrained and unconstrained testing environment. ACM Trans Math Softw, 1995, 21: 123-160. DOI:10.1145/200979.201043

[18] Dolan E D, Moré J J. Benchmarking optimization software with performance profiles. Math Program, 2002, 91: 201-213. DOI:10.1007/s101070100263
