Global Exponential Stability of a General Class of Recurrent Neural Networks with Time-Varying Delays

Zhigang Zeng, Jun Wang, Xiaoxin Liao

Research output: Contribution to journal › Article › peer-review

267 Citations (Scopus)

Abstract

This brief presents new theoretical results on the global exponential stability of neural networks with time-varying delays and Lipschitz continuous activation functions. These results include several sufficient conditions for the global exponential stability of general neural networks with time-varying delays, without requiring the activation functions to be monotone, bounded, or continuously differentiable. In addition to providing new criteria for neural networks with time-varying delays, these stability conditions also improve upon existing ones for networks with constant time delays and without time delays. Furthermore, the results make it convenient to estimate the exponential convergence rates of the neural networks.
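The abstract does not reproduce the model equations, but papers in this line of work typically study a delayed recurrent (Hopfield-type) network of the following standard form; the notation below is an assumption for illustration, not quoted from the brief:

```latex
% Delayed recurrent network with time-varying delays \tau_{ij}(t):
\dot{x}_i(t) = -d_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\, f_j\big(x_j(t)\big)
  + \sum_{j=1}^{n} b_{ij}\, f_j\big(x_j(t-\tau_{ij}(t))\big) + u_i ,
\qquad i = 1,\dots,n.

% Lipschitz continuity of the activations (no monotonicity,
% boundedness, or differentiability required):
|f_j(u) - f_j(v)| \le L_j\, |u - v| \quad \forall\, u, v \in \mathbb{R}.

% Global exponential stability of an equilibrium x^{*}: there exist
% constants \varepsilon > 0 and M \ge 1 such that, for every initial
% function \varphi,
\|x(t) - x^{*}\| \le M\, \|\varphi - x^{*}\|\, e^{-\varepsilon t},
\qquad t \ge 0,
```

where the decay rate \(\varepsilon\) is the exponential convergence rate that, per the abstract, the derived criteria allow one to estimate.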

Original language: English
Pages (from-to): 1353-1358
Number of pages: 6
Journal: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Volume: 50
Issue number: 10
DOI
State: Published - Oct. 2003
Published externally: Yes
