Acta Mathematica Scientia (English Series) ›› 2015, Vol. 35 ›› Issue (5): 1122-1136. doi: 10.1016/S0252-9602(15)30044-8


STOCHASTIC STABILITY OF UNCERTAIN RECURRENT NEURAL NETWORKS WITH MARKOVIAN JUMPING PARAMETERS

M. SYED ALI   

  1. Department of Mathematics, Thiruvalluvar University, Vellore, Tamilnadu, India
  • Received: 2013-12-16  Revised: 2014-09-08  Online: 2015-09-01  Published: 2015-09-01
  • About the author: M. SYED ALI, E-mail: syedgru@gmail.com
  • Supported by:

    The work was supported by NBHM project grant No.2/48(10)/2011-RD-II/865.


Abstract:

In this paper, the global robust stability of uncertain stochastic recurrent neural networks with Markovian jumping parameters is considered. A novel linear matrix inequality (LMI)-based stability criterion is obtained that guarantees the asymptotic stability of such networks. The results are derived using the Lyapunov functional technique, the Lipschitz condition, and the S-procedure. Finally, numerical examples are given to demonstrate the correctness of the theoretical results. Our results are also compared with those in [31] and [34] to show their effectiveness and reduced conservatism.
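The LMI approach rests on a classical fact: a linear system is asymptotically stable if and only if a quadratic Lyapunov function exists, which reduces to a matrix feasibility condition. The following is a minimal sketch of that underlying idea for a deterministic linear system without delays, uncertainty, or Markovian jumps; it is not the paper's actual criterion, and the matrices `A` and `Q` below are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative Hurwitz matrix (all eigenvalues in the open left half-plane)
# for the system x' = A x.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
Q = np.eye(2)  # any symmetric positive definite matrix

# Solve the Lyapunov equation A^T P + P A = -Q for P.
P = solve_continuous_lyapunov(A.T, -Q)

# Stability is certified if P is symmetric positive definite: then
# V(x) = x^T P x is a Lyapunov function with dV/dt = -x^T Q x < 0.
eigs = np.linalg.eigvalsh((P + P.T) / 2)
print("P positive definite:", bool(eigs.min() > 0))  # prints "P positive definite: True"
```

For the delayed, uncertain, Markovian-jump setting treated in the paper, the analogous condition becomes a set of coupled LMIs in several unknown matrices, which is why such criteria are stated and checked as LMI feasibility problems rather than closed-form equations.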

Key words: Lyapunov functional, linear matrix inequality, Markovian jumping parameters, recurrent neural networks


CLC Number:

  • 34K20