Acta Mathematica Scientia ›› 2021, Vol. 41 ›› Issue (5): 1574-1584.

• Articles •

On Stochastic Accelerated Gradient with Convergence Rate of Regression Learning

Yiyuan Cheng1, Xingxing Zha1,*, Yongquan Zhang2

  1. School of Mathematics and Statistics, Chaohu University, Hefei 238024
  2. School of Data Sciences, Zhejiang University of Finance & Economics, Hangzhou 310018
  • Received: 2020-04-21 Online: 2021-10-26 Published: 2021-10-08
  • Contact: Xingxing Zha E-mail: cyymath@163.com
  • Supported by:
    the NSFC (61573324); the Natural Science Research Project in Anhui Province (KJ2018A0455); the Program in the Youth Elite Support Plan in Universities of Anhui Province (gxyq2019082); the Fund of Chaohu University (XLY-201903)

Abstract:

This paper studies the regression learning problem from given sample data using stochastic approximation (SA) type algorithms, namely accelerated SA. We focus on two classical supervised learning problems, least-squares regression and logistic regression, without assuming strong convexity of the loss function; in this setting, well-known algorithms achieve a convergence rate of $O(1/n)$ for function values, where $n$ is the number of samples. Weakening the Lipschitz-continuity condition on the gradient, we propose two accelerated stochastic gradient algorithms and, through a non-asymptotic analysis of the empirical risk (in expectation), show that they also attain the rate $O(1/n)$. Compared with known results, fewer conditions are needed to obtain this tight convergence rate for the least-squares and logistic regression problems.

Key words: Least-squares regression, Logistic regression, Convergence rate
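
The abstract does not spell out the two proposed algorithms, so the following Python sketch is only a generic illustration of the accelerated stochastic approximation template it refers to: a single pass of Nesterov-style accelerated stochastic gradient with iterate averaging, for the two losses studied. The step size gamma, the momentum schedule t/(t+3), and the averaging scheme are assumed textbook choices, not the authors' method.

    import numpy as np

    def stochastic_gradient(w, x, y, loss):
        # Single-sample gradient for the two losses studied in the paper.
        if loss == "least_squares":       # l(w) = (x.w - y)^2 / 2
            return (x @ w - y) * x
        if loss == "logistic":            # l(w) = log(1 + exp(-y * x.w)), y in {-1, +1}
            return -y * x / (1.0 + np.exp(y * (x @ w)))
        raise ValueError(f"unknown loss: {loss}")

    def accelerated_sa(X, Y, loss="least_squares", gamma=0.1):
        # One pass over the n samples; returns the averaged iterate.
        n, d = X.shape
        w = np.zeros(d)        # main iterate
        w_prev = np.zeros(d)   # previous iterate, for the momentum term
        avg = np.zeros(d)      # running average of the iterates
        for t in range(n):
            beta = t / (t + 3.0)            # generic Nesterov momentum schedule (assumed)
            v = w + beta * (w - w_prev)     # extrapolation point
            w_prev = w
            w = v - gamma * stochastic_gradient(v, X[t], Y[t], loss)
            avg += (w - avg) / (t + 1)      # incremental Polyak-Ruppert averaging
        return avg

For instance, with X of shape (n, d) and labels Y in {-1, +1}, w_hat = accelerated_sa(X, Y, loss="logistic", gamma=0.05) returns the averaged iterate after one pass; the averaging step is what the $O(1/n)$ rates in this line of work are typically stated for.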

CLC Number:

  • O174.13