Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays
- 30 April 2004
- Research article
- Published by Elsevier BV in Neural Networks
- Vol. 17 (3), 379-390
- https://doi.org/10.1016/j.neunet.2003.08.007
Abstract
No abstract available.