## Abstract

This paper presents new results on the absolute exponential stability (AEST) of neural networks with a general class of partially Lipschitz continuous and monotone increasing activation functions, under the mild condition that the interconnection matrix T of the network system is additively diagonally stable; that is, for any positive diagonal matrix D_1 there exists a positive diagonal matrix D_2 such that D_2(T - D_1) + (T - D_1)^T D_2 is negative definite. Consequently, neural networks with additively diagonally stable interconnection matrices are guaranteed to be globally exponentially stable for any neuron activation functions in this class, any constant input vectors, and any other network parameters. Additively diagonally stable interconnection matrices include diagonally semistable matrices and H-matrices with nonpositive diagonal elements as special cases, so the obtained AEST result substantially extends the existing results in the literature on absolute stability (ABST) of neural networks. The additive diagonal stability condition is further shown to be necessary and sufficient for AEST of neural networks with two neurons. A summary and discussion of the known ABST and AEST results for neural networks are also given.
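The additive diagonal stability condition above can be checked numerically for small matrices. The following is a minimal sketch, not from the paper: it takes a hypothetical 2x2 interconnection matrix T and one positive diagonal D_1, then searches a grid of positive diagonal candidates D_2 for one making D_2(T - D_1) + (T - D_1)^T D_2 negative definite. (The definition quantifies over *all* D_1; a single D_1 and a finite grid only illustrate the condition, they do not verify it.)

```python
import numpy as np

def is_negative_definite(M, tol=1e-9):
    # A matrix M satisfies x^T M x < 0 for all x != 0 iff the
    # symmetric part of M has only negative eigenvalues.
    S = (M + M.T) / 2.0
    return np.max(np.linalg.eigvalsh(S)) < -tol

def find_d2(A, grid):
    # Grid-search for a positive diagonal D2 with
    # D2 A + A^T D2 negative definite; returns D2 or None.
    for d1 in grid:
        for d2 in grid:
            D2 = np.diag([d1, d2])
            if is_negative_definite(D2 @ A + A.T @ D2):
                return D2
    return None

# Hypothetical example data (not from the paper):
T = np.array([[-1.0, 0.5],
              [0.5, -1.0]])      # interconnection matrix
D1 = np.diag([0.5, 1.0])         # one positive diagonal D1
A = T - D1
D2 = find_d2(A, np.linspace(0.1, 2.0, 20))
print(D2)                        # a certificate D2, if one was found
```

Here T is symmetric negative definite, so T - D_1 is as well and the search succeeds immediately; for nonsymmetric T the choice of D_2 genuinely matters.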

| Original language | English |
|---|---|
| Pages (from-to) | 1308-1317 |
| Number of pages | 10 |
| Journal | IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications |
| Volume | 48 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - Nov 2001 |
| Externally published | Yes |

## Keywords

- Absolute exponential stability
- Additive diagonal stability
- Diagonal semistability
- Global exponential stability
- H-matrix
- Neural networks
- Partial Lipschitz continuity