
A FIXED SIZE STORAGE $O(n^3)$ TIME COMPLEXITY LEARNING ALGORITHM FOR FULLY RECURRENT CONTINUALLY RUNNING NETWORKS
(Neural Computation, 4(2):243-248, 1992)

Jürgen Schmidhuber
Technische Universität München

Abstract:

The RTRL algorithm for fully recurrent continually running networks [Robinson and Fallside, 1987; Williams and Zipser, 1989] requires $O(n^4)$ computations per time step, where $n$ is the number of non-input units. I describe a method suited for on-line learning that computes exactly the same gradient and requires fixed-size storage of the same order, but has an average time complexity per time step of $O(n^3)$.
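To make the complexity claim concrete, here is a minimal sketch of the standard RTRL sensitivity update (Williams and Zipser, 1989) that the abstract refers to. The sensitivity tensor `p[k, i, j]` = $\partial y_k / \partial w_{ij}$ occupies the fixed $O(n^3)$ storage, and the recurrence over it costs $O(n^4)$ multiply-adds per time step; this sketch is illustrative only (names, shapes, and the tanh nonlinearity are assumptions, not from the paper) and does not implement the paper's $O(n^3)$ average-time method.

```python
import numpy as np

# Sketch of one RTRL step for a fully recurrent net with n non-input units.
# Illustrative only: variable names and the tanh nonlinearity are assumptions.
rng = np.random.default_rng(0)
n = 5
W = rng.normal(scale=0.1, size=(n, n))  # recurrent weight matrix
y = np.tanh(rng.normal(size=n))         # current unit activations
# p[k, i, j] = d y_k / d w_ij : the fixed-size O(n^3) sensitivity storage
p = np.zeros((n, n, n))

def rtrl_step(W, y, p):
    s = W @ y                    # net inputs
    y_new = np.tanh(s)
    fprime = 1.0 - y_new**2      # derivative of tanh
    # p'[k,i,j] = f'(s_k) * ( sum_l W[k,l] * p[l,i,j] + delta_{k,i} * y_j )
    # The einsum below is the O(n^4) bottleneck: n^3 entries, O(n) work each.
    p_new = np.einsum('kl,lij->kij', W, p)
    p_new[np.arange(n), np.arange(n), :] += y   # delta_{k,i} * y_j term
    p_new *= fprime[:, None, None]
    return y_new, p_new

y, p = rtrl_step(W, y, p)
print(p.shape)   # storage stays (n, n, n), i.e. O(n^3)
```

The paper's contribution is to compute this same gradient with the same storage while reducing the average per-step time from $O(n^4)$ to $O(n^3)$.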





Juergen Schmidhuber 2003-02-13
