NEURAL NETS FOR FINANCE
LOW-COMPLEXITY NEURAL NETWORKS FOR STOCK MARKET PREDICTION
Many of our machine learning algorithms, in one way or another, discover and exploit initially unknown environmental regularities. Regularity implies algorithmic compressibility: inductive learning and generalization are closely related to data compression.
Our most lucrative application is the prediction of financial data. The optimal (but not necessarily practically feasible) universal way of predicting financial data is discussed here.
An approach that is more feasible on current computers is based on a "minimum description length" argument: it shows that flat minima of typical neural network error functions correspond to low expected overfitting and high generalization.
In stock market prediction benchmarks, Flat Minimum Search outperformed other widely used competitors.
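The connection between flatness and generalization can be illustrated with a simple proxy measure (this is a hedged sketch, not the Flat Minimum Search algorithm of the papers below): at a minimum of the error function, randomly perturb the weights within a small radius and record the average loss increase. A flat minimum tolerates larger weight perturbations with little loss increase, i.e. the weights can be specified with lower precision, which is the minimum-description-length intuition. The toy data, the linear model, and the `flatness_score` function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (a hypothetical stand-in for financial return data).
X = rng.normal(size=(200, 4))
true_w = np.array([0.5, -0.3, 0.0, 0.2])
y = X @ true_w + 0.1 * rng.normal(size=200)

def mse(w):
    """Mean squared error of a linear model with weights w."""
    return float(np.mean((X @ w - y) ** 2))

def flatness_score(w, radius=0.05, n_samples=100):
    """Average loss increase under random weight perturbations of a
    fixed norm. Smaller values indicate a flatter minimum, which the
    MDL argument links to lower expected overfitting."""
    base = mse(w)
    increases = []
    for _ in range(n_samples):
        d = rng.normal(size=w.shape)
        d *= radius / np.linalg.norm(d)  # project onto sphere of given radius
        increases.append(mse(w + d) - base)
    return float(np.mean(increases))

# Least-squares solution: a minimum of this (convex) error function.
w_star, *_ = np.linalg.lstsq(X, y, rcond=None)
print("loss at minimum:", mse(w_star))
print("flatness (mean loss increase):", flatness_score(w_star))
```

Because the toy loss is a convex quadratic, every perturbation increases the loss, and the average increase grows with the perturbation radius; in a nonconvex network loss landscape, comparing this score across different minima of similar training error is what distinguishes flat from sharp ones.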
- S. Hochreiter and J. Schmidhuber. Flat Minima. Neural Computation, 9(1):1-42, 1997.
- S. Hochreiter and J. Schmidhuber. Simplifying neural nets by discovering flat minima. In G. Tesauro, D. S. Touretzky and T. K. Leen, eds., Advances in Neural Information Processing Systems 7, NIPS'7, pages 529-536. MIT Press, Cambridge MA, 1995.