1990s: Time Lags!
1990: RNNs great in principle but don’t work?
Standard RNNs: the error flow, summed over all backward paths through time, decays exponentially! (first rigorous analysis by Schmidhuber's former PhD student Sepp Hochreiter, 1991; compare Bengio et al., 1994, and Hochreiter, Bengio, Frasconi & Schmidhuber, 2001)
Forward: y_k(t) = f_k(net_k(t))
Error: e_k(t) = f_k'(net_k(t)) Σ_i w_ik e_i(t+1)
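The exponential decay can be seen directly by iterating the error recursion above. A minimal sketch (not from the original page; the single-unit setup, the weight w, and the choice of a sigmoid f are illustrative assumptions): backpropagating an error signal through one recurrent sigmoid unit, where each step multiplies by f'(net) * w.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_error(w, net, T, e_final=1.0):
    """Propagate an error back T steps via e(t) = f'(net) * w * e(t+1).

    For the logistic sigmoid, f'(net) = s(net) * (1 - s(net)) <= 0.25,
    so with |f'(net) * w| < 1 the error shrinks geometrically.
    """
    s = sigmoid(net)
    fprime = s * (1.0 - s)  # local derivative; at most 0.25
    errors = [e_final]
    for _ in range(T):
        errors.append(fprime * w * errors[-1])  # one backward step
    return errors

# At net = 0, f'(net) = 0.25; with w = 1 the error decays by 0.25 per step.
errs = backprop_error(w=1.0, net=0.0, T=30)
print(errs[1])   # 0.25
print(errs[30])  # 0.25**30 — vanishingly small after 30 time steps
```

With |f'(net) * w| > 1 the same recursion blows up instead; both regimes make learning over long time lags hard for standard RNNs.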