[1] K. Möller and S. Thrun. Task modularization by network modulation. In J. Rault, editor, Proceedings of Neuro-Nimes '90, pages 419-432, November 1990.
[2] A. J. Robinson and F. Fallside. The utility driven dynamic error propagation network. Technical Report CUED/F-INFENG/TR.1, Cambridge University Engineering Department, 1987.
[3] J. Schmidhuber. A fixed size storage O(n^3) time complexity learning algorithm for fully recurrent continually running networks. Neural Computation, 4(2):243-248, 1992.
[4] J. Schmidhuber. Learning to control fast-weight memories: An alternative to recurrent nets. Neural Computation, 4(1):131-139, 1992.
[5] J. Schmidhuber. An introspective network that can learn to run its own weight change algorithm. In Proc. of the Intl. Conf. on Artificial Neural Networks, Brighton, pages 191-195. IEE, 1993.
[6] J. Schmidhuber. A neural network that embeds its own meta-levels. In Proc. of the International Conference on Neural Networks '93, San Francisco. IEEE, 1993.
[7] R. J. Williams. Complexity of exact gradient computation algorithms for recurrent neural networks. Technical Report NU-CCS-89-27, Northeastern University, College of Computer Science, Boston, 1989.
[8] R. J. Williams and D. Zipser. A learning algorithm for continually running fully recurrent networks. Neural Computation, 1(2):270-280, 1989.
Juergen Schmidhuber
2003-02-21