Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations
Top Cited Papers
- Published: 19 January 2006
- Journal article
- Published by Elsevier BV in Physica D: Nonlinear Phenomena
- Vol. 214 (1), 88-99
- https://doi.org/10.1016/j.physd.2005.12.006
Abstract
No abstract available.
This publication has 25 references indexed in Scilit:
- Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays. Neural Networks, 2004.
- Global convergence of neural networks with discontinuous neuron activations. IEEE Transactions on Circuits and Systems I: Regular Papers, 2003.
- Global robust stability of delayed neural networks. IEEE Transactions on Circuits and Systems I: Regular Papers, 2003.
- Absolute exponential stability of a class of continuous-time recurrent neural networks. IEEE Transactions on Neural Networks, 2003.
- An additive diagonal-stability condition for absolute exponential stability of a general class of neural networks. IEEE Transactions on Circuits and Systems I: Regular Papers, 2001.
- Global exponential stability of neural networks with globally Lipschitz continuous activations and its application to linear variational inequality problem. IEEE Transactions on Neural Networks, 2001.
- New conditions for global stability of neural networks with application to linear and quadratic programming problems. IEEE Transactions on Circuits and Systems I: Regular Papers, 1995.
- On a class of globally stable neural circuits. IEEE Transactions on Circuits and Systems I: Regular Papers, 1994.
- Convergent activation dynamics in continuous time networks. Neural Networks, 1989.
- Stability of analog neural networks with delay. Physical Review A, 1989.