[1]

K. Möller and S. Thrun.
Task modularization by network modulation.
In J. Rault, editor, Proceedings of Neuro-Nîmes '90, pages
419-432, November 1990.
[2]

A. J. Robinson and F. Fallside.
The utility driven dynamic error propagation network.
Technical Report CUED/F-INFENG/TR.1, Cambridge University Engineering
Department, 1987.
[3]

J. Schmidhuber.
A fixed size storage O(n^3) time complexity learning algorithm
for fully recurrent continually running networks.
Neural Computation, 4(2):243-248, 1992.
[4]

J. Schmidhuber.
Learning to control fast-weight memories: An alternative to recurrent
nets.
Neural Computation, 4(1):131-139, 1992.
[5]

J. Schmidhuber.
An introspective network that can learn to run its own weight change
algorithm.
In Proc. of the Intl. Conf. on Artificial Neural Networks,
Brighton, pages 191-195. IEE, 1993.
[6]

J. Schmidhuber.
A neural network that embeds its own meta-levels.
In Proc. of the International Conference on Neural Networks '93,
San Francisco. IEEE, 1993.
[7]

R. J. Williams.
Complexity of exact gradient computation algorithms for recurrent
neural networks.
Technical Report NU-CCS-89-27, Northeastern University,
College of Computer Science, Boston, 1989.
[8]

R. J. Williams and D. Zipser.
A learning algorithm for continually running fully recurrent
networks.
Neural Computation, 1(2):270-280, 1989.
Juergen Schmidhuber
2003-02-21