Acta Mathematica Scientia, Series A ›› 2022, Vol. 42 ›› Issue (3): 1173-1190. doi: 10.1007/s10473-022-0321-7



Shuhua WANG1, Baohuai SHENG2,3   

  1. School of Information Engineering, Jingdezhen Ceramic University, Jingdezhen, 333403, China;
    2. Department of Finance, Zhejiang Yuexiu University, Shaoxing, 312030, China;
    3. Department of Applied Statistics, Shaoxing University, Shaoxing, 312000, China
  • Received: 2020-12-24 Revised: 2021-04-20 Online: 2022-06-26 Published: 2022-06-24
  • Contact: Baohuai SHENG
  • Supported by:
    This work is supported by the NSF (61877039), the NSFC/RGC Joint Research Scheme (12061160462 and N CityU 102/20) of China, the NSF (LY19F020013) of Zhejiang Province, the Special Project for Scientific and Technological Cooperation (20212BDH80021) of Jiangxi Province, and the Science and Technology Project of the Jiangxi Province Department of Education (GJJ211334).

Abstract: This paper considers a robust kernel regularized classification algorithm with a non-convex loss function, proposed to alleviate the performance deterioration caused by outliers. A comparison relationship between the excess misclassification error and the excess generalization error is established; from this relationship, together with convex analysis theory, a learning rate is derived. The results show that the performance of the classifier is affected by outliers, and that the extent of this impact can be controlled by choosing the homotopy parameters properly.
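To make the setting concrete, the following is a minimal numerical sketch of kernel regularized classification with a bounded, non-convex margin loss, in the spirit of the framework described above. The Gaussian kernel, the sigmoid-type loss l(u) = σ(-a·u), the plain gradient descent solver, and the scale parameter `a` (used here as a stand-in for a homotopy-type parameter that tunes how strongly outliers are down-weighted) are all illustrative assumptions, not the algorithm analyzed in the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def loss_grad(margin, a=1.0):
    """Derivative of the bounded non-convex sigmoid loss l(u) = sigmoid(-a*u).
    Because l saturates, large negative margins (e.g. from mislabeled
    outliers) contribute only a bounded gradient; `a` controls how fast
    the saturation sets in (an assumed homotopy-like parameter)."""
    s = sigmoid(a * margin)
    return -a * s * (1.0 - s)

def fit(X, y, lam=0.05, sigma=1.0, a=1.0, lr=0.1, n_iter=500):
    """Gradient descent on the kernel expansion f = sum_j alpha_j K(x_j, .),
    minimizing (1/n) sum_i l(y_i f(x_i)) + lam * ||f||_K^2."""
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        f = K @ alpha
        # gradient of the empirical loss term plus 2*lam*K*alpha = 2*lam*f
        g = K @ (loss_grad(y * f, a) * y) / n + 2.0 * lam * f
        alpha -= lr * g
    return alpha, K

# Two well-separated clusters plus one mislabeled point acting as an outlier.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal( 2.0, 0.3, size=(20, 2)),
               rng.normal(-2.0, 0.3, size=(20, 2)),
               [[2.0, 2.0]]])                             # sits in the + cluster
y = np.concatenate([np.ones(20), -np.ones(20), [-1.0]])   # ...but labeled -1

alpha, K = fit(X, y)
acc_clean = (np.sign(K @ alpha)[:40] == y[:40]).mean()    # accuracy on clean points
```

Because the loss is bounded, the mislabeled point contributes only a saturating gradient, so the decision rule on the clean points is largely unaffected; with a convex unbounded loss such as the hinge, a single outlier can exert an arbitrarily large pull on the solution.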

Key words: Support vector machine, robust classification, quasiconvex loss function, learning rate, right-sided directional derivative

CLC Number: 

  • 68T05