... inputs ^1
Recently I became aware that Don Mathis had some related ideas (personal communication). A hierarchical approach to sequence generation was pursued by [8].
... well ^2
For instance, we might employ the more limited feedforward networks together with a `time window' approach, in which the number of previous inputs considered as a basis for the next prediction remains fixed.
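
The sketch below (Python; the window size W and the toy sequence are illustrative assumptions, not from the text) shows how such a fixed window is built: each prediction target is paired with exactly the W most recent inputs, so dependencies older than W steps are invisible to the predictor.

    import numpy as np

    def make_windows(sequence, W):
        # Build (window, next input) training pairs: each prediction is
        # based on exactly the W most recent inputs, never more.
        X, y = [], []
        for t in range(W, len(sequence)):
            X.append(sequence[t - W:t])   # the fixed-size time window
            y.append(sequence[t])         # the input to be predicted
        return np.array(X), np.array(y)

    # With W = 3, the predictor never sees context older than 3 steps,
    # however long the true dependency in the sequence may be.
    seq = [0, 1, 0, 0, 1, 0, 0, 0, 1]
    X, y = make_windows(seq, W=3)   # X[0] = [0, 1, 0] -> y[0] = 0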
... step ^3
A unique time representation is theoretically necessary to provide $P_{s+1}$ with unambiguous information about when the failure occurred (see also the last paragraph of section 2). A unique representation of the time elapsed since the last unpredicted input will do as well.
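
As an illustration, here is a minimal sketch (Python; the toy inputs and predictions are assumptions, not from the text) of the second option: a counter reset at every unpredicted input, whose value uniquely encodes how much time has passed since the last failure.

    def elapsed_time_codes(inputs, predictions):
        # Emit, for each step, the number of steps since the prediction
        # last failed; distinct counts give the higher-level predictor
        # unambiguous timing information about the last failure.
        codes, elapsed = [], 0
        for x, x_hat in zip(inputs, predictions):
            if x != x_hat:
                elapsed = 0       # unpredicted input: reset the clock
            else:
                elapsed += 1      # predicted input: clock keeps running
            codes.append(elapsed)
        return codes

    # Failures at steps 0 and 3 yield the codes [0, 1, 2, 0, 1].
    print(elapsed_time_codes([1, 0, 0, 1, 0], [0, 0, 0, 0, 0]))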
... description ^4
In contrast, the reduced descriptions referred to by [10] are not unambiguous.