EXPERIMENT 2 - recurrent nets.
Time-varying inputs.
The method also works for continually running, fully recurrent nets.
At every time step, a recurrent net with sigmoid activations
sees an input vector drawn at random from a fixed set of input vectors.
The task is to switch on the first output unit whenever
a particular input occurred two time steps earlier,
and to switch on the second output unit without delay
in response to another designated input.
The task can be solved by a single hidden unit.
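The task above is a simple delayed-response problem. As a minimal sketch of the data generation — the specific trigger vectors are not reproduced in this excerpt, so `(1, 0)` and `(0, 1)` below are placeholder assumptions — one could write:

```python
import random

# Hypothetical trigger patterns: the excerpt does not reproduce the
# actual input vectors, so these two are illustrative placeholders.
TRIGGER_DELAYED = (1, 0)    # should switch on output 1 two steps later
TRIGGER_IMMEDIATE = (0, 1)  # should switch on output 2 at the same step

def make_stream(n, seed=0):
    """Random stream of 2-bit input vectors plus the two target signals."""
    rng = random.Random(seed)
    inputs = [(rng.randint(0, 1), rng.randint(0, 1)) for _ in range(n)]
    targets = []
    for t, x in enumerate(inputs):
        # Output 1: delayed detection of TRIGGER_DELAYED (two steps ago).
        out1 = 1 if t >= 2 and inputs[t - 2] == TRIGGER_DELAYED else 0
        # Output 2: immediate detection of TRIGGER_IMMEDIATE.
        out2 = 1 if x == TRIGGER_IMMEDIATE else 0
        targets.append((out1, out2))
    return inputs, targets
```

Because output 1 depends on the input two steps back, a solving network must carry information across time — which is why a recurrent hidden unit suffices, and why a single such unit is enough.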
Non-weight-decay-like results.
With conventional recurrent net algorithms,
both hidden units were used to store the input vector after training.
Not so with our new approach.
We trained 20 networks; all of them learned perfect solutions.
As with weight decay,
most weights to the output layer decayed to zero.
Unlike with weight decay, however,
strong inhibitory connections (-30.0) switched off
one of the hidden units, effectively
pruning it away.
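Why a -30.0 connection amounts to pruning: a strongly negative net input saturates a sigmoid unit at an activation of essentially zero, so the unit no longer influences anything downstream. A quick numerical check:

```python
import math

def sigmoid(x):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

# With a net input of -30.0 the activation is about 9.4e-14 --
# numerically indistinguishable from zero, so the unit is
# effectively removed from the network.
print(sigmoid(-30.0))
```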
Parameters:
Learning rate: 0.1.
Architecture: (2-2-2).
Number of training examples: 1,500.
See section 5.6 for parameters common to all experiments.
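For concreteness, the (2-2-2) architecture — 2 input units, 2 recurrent hidden units, 2 output units — can be sketched as below. The weight layout and small random initialization are assumptions for illustration, not the paper's exact setup:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyRecurrentNet:
    """Sketch of a fully recurrent (2-2-2) net. Hidden units feed back
    into themselves across time steps; initialization is assumed."""

    def __init__(self, seed=0):
        rng = random.Random(seed)
        w = lambda: rng.uniform(-0.1, 0.1)
        self.w_in = [[w() for _ in range(2)] for _ in range(2)]   # input -> hidden
        self.w_rec = [[w() for _ in range(2)] for _ in range(2)]  # hidden -> hidden
        self.w_out = [[w() for _ in range(2)] for _ in range(2)]  # hidden -> output
        self.h = [0.0, 0.0]  # hidden state carried across time steps

    def step(self, x):
        """One time step: update hidden state from input + previous
        hidden state, then compute the two outputs."""
        new_h = [
            sigmoid(sum(self.w_in[i][j] * x[j] for j in range(2)) +
                    sum(self.w_rec[i][j] * self.h[j] for j in range(2)))
            for i in range(2)
        ]
        self.h = new_h
        return [
            sigmoid(sum(self.w_out[k][i] * self.h[i] for i in range(2)))
            for k in range(2)
        ]
```

The recurrent `w_rec` connections are what let the net remember an input for the required two time steps; pruning one hidden unit (as in the results above) leaves a single unit carrying that memory.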
Juergen Schmidhuber
2003-02-13