Acta Mathematica Scientia, Series B ›› 2023, Vol. 43 ›› Issue (1): 1-24. doi: 10.1007/s10473-023-0101-z


A SUPERLINEARLY CONVERGENT SPLITTING FEASIBLE SEQUENTIAL QUADRATIC OPTIMIZATION METHOD FOR TWO-BLOCK LARGE-SCALE SMOOTH OPTIMIZATION*

Jinbao Jian1,†, Chen Zhang2, Pengjie Liu3   

  1. College of Mathematics and Physics, Guangxi Key Laboratory of Hybrid Computation and IC Design Analysis, Center for Applied Mathematics and Artificial Intelligence, Guangxi Minzu University, Nanning 530006, China;
    2. School of Mechanical Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China;
    3. School of Mathematics, China University of Mining and Technology, Xuzhou 221116, China
  • Received: 2021-01-06  Revised: 2022-05-25  Published: 2023-03-01
  • Contact: †Jinbao Jian. E-mail: jianjb@gxu.edu.cn
  • About authors: Chen Zhang, E-mail: zhangchen_2017@126.com; Pengjie Liu, E-mail: liupengjie2019@163.com
  • Supported by:
    *National Natural Science Foundation of China (12171106), the Natural Science Foundation of Guangxi Province (2020GXNSFDA238017 and 2018GXNSFFA281007) and the Shanghai Sailing Program (21YF1430300).

Abstract: This paper considers the two-block large-scale nonconvex optimization problem with general linear constraints. Based on the ideas of splitting and sequential quadratic optimization (SQO), a new feasible descent method for this problem is proposed. First, we consider the quadratic optimization (QO) approximation associated with the current feasible iterate and split it into two small-scale QO subproblems that can be solved in parallel. Second, a feasible descent direction for the original problem is obtained from the subproblem solutions, yielding a new SQO-type method, namely the splitting feasible SQO (SF-SQO) method. Moreover, under suitable conditions, we analyse the global convergence, strong convergence and superlinear convergence rate of the SF-SQO method. Finally, preliminary numerical experiments on the economic dispatch of a power system are carried out, which show that the SF-SQO method is promising.
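To convey the splitting idea described in the abstract, the following is a minimal illustrative sketch, not the authors' SF-SQO method: at a feasible iterate, a quadratic model is split into two block subproblems (here solved sequentially, though they are independent and could run in parallel), the block directions are concatenated into a feasible descent direction, and an Armijo backtracking step keeps the iterates feasible. The identity Hessian blocks, the assumption of block-separable constraints, the function sf_sqo_sketch and the toy test problem are all hypothetical simplifications introduced for illustration only.

```python
# Illustrative sketch only: simplified two-block splitting with a feasible
# descent step; it is NOT the SF-SQO algorithm of the paper.
import numpy as np
from scipy.optimize import minimize

def sf_sqo_sketch(f, grad, blocks, A, b, x0, max_iter=50, tol=1e-8):
    """blocks: list of two index arrays (the variable blocks).
    Constraints A @ x <= b are assumed block-separable in this sketch."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        d = np.zeros_like(x)
        for idx in blocks:                       # two block QPs (parallelizable)
            Ai = A[:, idx]
            cons = {"type": "ineq",              # keep x + d feasible blockwise
                    "fun": lambda di, Ai=Ai: b - A @ x - Ai @ di}
            # quadratic model with identity Hessian block: g_i^T d_i + 0.5||d_i||^2
            qp = minimize(lambda di, gi=g[idx]: gi @ di + 0.5 * di @ di,
                          np.zeros(len(idx)),
                          jac=lambda di, gi=g[idx]: gi + di,
                          constraints=[cons], method="SLSQP")
            d[idx] = qp.x
        if np.linalg.norm(d) < tol:              # QP model is stationary: stop
            break
        t = 1.0                                  # Armijo backtracking, stay feasible
        while (np.any(A @ (x + t * d) > b + 1e-12)
               or f(x + t * d) > f(x) + 1e-4 * t * (g @ d)):
            t *= 0.5
            if t < 1e-12:
                break
        x = x + t * d
    return x

# Hypothetical two-block test problem: min ||x - c||^2  s.t.  x <= 1 componentwise.
c = np.array([2.0, -1.0, 0.5, 3.0])
f = lambda x: np.sum((x - c) ** 2)
grad = lambda x: 2 * (x - c)
A, b = np.eye(4), np.ones(4)
x_star = sf_sqo_sketch(f, grad, [np.arange(2), np.arange(2, 4)], A, b, np.zeros(4))
print(x_star)   # approaches [1, -1, 0.5, 1]
```

Because the zero direction is feasible for each block QP, every nonzero concatenated direction is a descent direction for the quadratic model, which is why the Armijo search in this sketch terminates; the paper's actual construction and its convergence analysis are considerably more refined.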

Key words: large-scale optimization, two-block smooth optimization, splitting method, feasible sequential quadratic optimization method, superlinear convergence
