Acta mathematica scientia,Series A ›› 2021, Vol. 41 ›› Issue (5): 1574-1584.


On Stochastic Accelerated Gradient with Convergence Rate of Regression Learning

Yiyuan Cheng1,Xingxing Zha1,*(),Yongquan Zhang2   

  1. School of Mathematics and Statistics, Chaohu University, Hefei 238024
  2. School of Data Sciences, Zhejiang University of Finance & Economics, Hangzhou 310018
  • Received:2020-04-21 Online:2021-10-26 Published:2021-10-08
  • Contact: Xingxing Zha
  • Supported by:
    the NSFC(61573324);the Natural Science Research Project in Anhui Province(KJ2018A0455);the Program in the Youth Elite Support Plan in Universities of Anhui Province(gxyq2019082);the Fund of Chaohu University(XLY-201903)


This paper studies the regression learning problem from given sample data by using a stochastic approximation (SA) type algorithm, namely the accelerated SA. We focus on problems without strong convexity, for which all well-known algorithms achieve a convergence rate for function values of $O(1/n)$. We consider and analyze an accelerated SA algorithm that achieves a rate of $O(1/n)$ for the classical least-squares regression and logistic regression problems, respectively. Compared with the well-known results, we need fewer conditions to obtain this tight convergence rate for the least-squares regression and logistic regression problems.
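As a concrete illustration of the kind of method the abstract describes, an accelerated SA (stochastic gradient with Nesterov-type extrapolation) update for least-squares regression can be sketched as follows. This is not the paper's exact algorithm; the step size, momentum schedule, and function name are illustrative assumptions.

```python
import numpy as np

def accelerated_sgd_least_squares(X, y, n_epochs=50, step=0.05, seed=0):
    """Illustrative accelerated stochastic approximation for least squares.

    Minimizes f(w) = (1/2n) * sum_i (x_i^T w - y_i)^2 by sampling one
    example per step and taking a stochastic gradient step from an
    extrapolated (momentum) point. Hyperparameters are illustrative only.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)        # current iterate
    w_prev = np.zeros(d)   # previous iterate, used for extrapolation
    for t in range(1, n_epochs * n + 1):
        # Nesterov-type momentum weight, capped for stability
        # in the stochastic setting (an assumption of this sketch).
        momentum = min((t - 1) / (t + 2), 0.9)
        v = w + momentum * (w - w_prev)   # extrapolated point
        i = rng.integers(n)               # sample one data point uniformly
        grad = (X[i] @ v - y[i]) * X[i]   # stochastic gradient of f at v
        w_prev = w
        w = v - step * grad
    return w
```

On noiseless synthetic data `y = X @ w_true`, the iterate approaches `w_true`; the same loop with the gradient replaced by `(sigmoid(X[i] @ v) - y[i]) * X[i]` gives the logistic-regression variant.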

Key words: Least-square regression, Logistic regression, Convergence rate

CLC Number: 

  • O174.13