[1]
I. Guyon, V. Vapnik, B. Boser, L. Bottou, and S. A. Solla.
Structural risk minimization for character recognition.
In J. E. Moody, S. J. Hanson, and R. P. Lippmann, editors, Advances in Neural Information Processing Systems 4, pages 471-479. Morgan Kaufmann, 1992.
[2]
B. Hassibi and D. G. Stork.
Second order derivatives for network pruning: Optimal brain surgeon.
In S. J. Hanson, J. D. Cowan, and C. L. Giles, editors, Advances in Neural Information Processing Systems 5, pages 164-171. Morgan Kaufmann, 1993.
[3]
G. E. Hinton and D. van Camp.
Keeping neural networks simple.
In Proceedings of the International Conference on Artificial Neural Networks, Amsterdam, pages 11-18. Springer, 1993.
[4]
S. Hochreiter and J. Schmidhuber.
Flat minimum search finds simple nets.
Technical Report FKI-200-94, Fakultät für Informatik, Technische Universität München, 1994.
[5]
S. B. Holden.
On the Theory of Generalization and Self-Structuring in Linearly Weighted Connectionist Networks.
PhD thesis, Cambridge University, Engineering Department, 1994.
[6]
D. J. C. MacKay.
A practical Bayesian framework for backpropagation networks.
Neural Computation, 4:448-472, 1992.
[7]
M. F. Møller.
Exact calculation of the product of the Hessian matrix of feed-forward network error functions and a vector in O(N) time.
Technical Report PB-432, Computer Science Department, Aarhus University, Denmark, 1993.
[8]
S. J. Nowlan and G. E. Hinton.
Simplifying neural networks by soft weight sharing.
Neural Computation, 4:473-493, 1992.
[9]
B. A. Pearlmutter and R. Rosenfeld.
Chaitin-Kolmogorov complexity and generalization in neural networks.
In R. P. Lippmann, J. E. Moody, and D. S. Touretzky, editors, Advances in Neural Information Processing Systems 3, pages 925-931. Morgan Kaufmann, 1991.
[10]
J. Schmidhuber.
Discovering problem solutions with low Kolmogorov complexity and high generalization capability.
Technical Report FKI-194-94, Fakultät für Informatik, Technische Universität München, 1994.
Short version in A. Prieditis and S. Russell, eds., Machine Learning: Proceedings of the Twelfth International Conference, Morgan Kaufmann Publishers, pages 488-496, San Francisco, CA, 1995.
[11]
V. Vapnik.
Principles of risk minimization for learning theory.
In J. E. Moody, S. J. Hanson, and R. P. Lippmann, editors, Advances in Neural Information Processing Systems 4, pages 831-838. Morgan Kaufmann, 1992.
[12]
C. Wang, S. S. Venkatesh, and J. S. Judd.
Optimal stopping and effective machine complexity in learning.
In Advances in Neural Information Processing Systems 6. Morgan Kaufmann, 1994.
To appear.
[13]
P. M. Williams.
Bayesian regularisation and pruning using a Laplace prior.
Technical report, School of Cognitive and Computing Sciences, University of Sussex, Falmer, Brighton, 1994.
[14]
D. H. Wolpert.
Bayesian backpropagation over I-O functions rather than weights.
In J. D. Cowan, G. Tesauro, and J. Alspector, editors, Advances in Neural Information Processing Systems 6, pages 200-207. Morgan Kaufmann, 1994.
Juergen Schmidhuber
2003-02-25