A Fixed Size Storage O(n³) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks
- 1 March 1992
- journal article
- Published by MIT Press in Neural Computation
- Vol. 4 (2), 243-248
- https://doi.org/10.1162/neco.1992.4.2.243
Abstract
The real-time recurrent learning (RTRL) algorithm (Robinson and Fallside 1987; Williams and Zipser 1989) requires O(n⁴) computations per time step, where n is the number of noninput units. I describe a method suited for on-line learning that computes exactly the same gradient and requires fixed-size storage of the same order but has an average time complexity per time step of O(n³).
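To make the complexity claim concrete, the following is a minimal NumPy sketch of standard RTRL (the Williams–Zipser baseline, not the paper's improved O(n³) method) for a fully recurrent net h_t = tanh(W h_{t-1}). The sensitivity tensor P[i,j,k] = ∂h_t[k]/∂W[i,j] gives the fixed O(n³) storage mentioned in the abstract, while the einsum contraction in the update costs O(n⁴) per step, which is exactly the bottleneck the paper reduces. All names and the squared-error loss are illustrative assumptions, not from the paper.

```python
import numpy as np

def rtrl_gradient(W, h0, y, T):
    """Illustrative RTRL baseline (not the paper's O(n^3) method).

    Runs h_t = tanh(W h_{t-1}) for T steps and returns (h_T, dL/dW)
    for the assumed loss L = 0.5 * ||h_T - y||^2, computed on-line via
    the sensitivity tensor P[i,j,k] = d h_t[k] / d W[i,j].
    Storage: O(n^3) for P.  Time: O(n^4) per step (the einsum below).
    """
    n = W.shape[0]
    h = h0.copy()
    P = np.zeros((n, n, n))  # sensitivities: fixed-size O(n^3) storage
    for _ in range(T):
        a = W @ h
        h_new = np.tanh(a)
        fp = 1.0 - h_new ** 2                      # tanh'(a)
        # Recurrent part of the chain rule: O(n^4) contraction per step.
        WP = np.einsum('kl,ijl->ijk', W, P)
        # Direct part: d a[k] / d W[i,j] = delta_{k,i} * h_{t-1}[j].
        direct = np.zeros((n, n, n))
        direct[np.arange(n), :, np.arange(n)] = h  # direct[i, j, i] = h[j]
        P = fp[None, None, :] * (direct + WP)
        h = h_new
    # Chain rule through the final error: grad[i,j] = sum_k e[k] * P[i,j,k].
    grad = np.einsum('k,ijk->ij', h - y, P)
    return h, grad
```

Because the sensitivities are carried forward step by step, the gradient is available on-line at every time step, which is the setting the abstract targets; the improved algorithm keeps this storage profile while lowering the average per-step time to O(n³).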