
Introduction

Most conventional supervised learning algorithms for multi-layer neural nets are not local in space and time. Backprop, for instance, requires a global control mechanism that first propagates activation signals forward through all layers, then waits for the error signals to travel back, and only then changes the weights. Many suspect, however, that the brain uses an entirely local algorithm. One advantage of truly local algorithms is that their parallel implementation is trivial. The method described below is designed to be entirely local while still being able to deal with hidden units and non-linearities [5]. See [4] for another local alternative.
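To make the non-locality concrete, here is a minimal NumPy sketch (not from the paper) of one backprop step for a two-layer sigmoid net with squared error; the layer sizes, learning rate, and data are hypothetical. The comments mark the three globally sequenced phases mentioned above, which is what a local algorithm would avoid.

import numpy as np

# Illustrative two-layer net; all sizes are arbitrary choices.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(4, 3))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(3, 2))   # hidden -> output weights
x = rng.normal(size=(1, 4))               # one input pattern
t = np.array([[0.0, 1.0]])                # target pattern
lr = 0.1                                  # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Phase 1: global forward sweep -- activations must pass through
# every layer before anything else can happen.
h = sigmoid(x @ W1)
y = sigmoid(h @ W2)

# Phase 2: global backward sweep -- the output error is propagated
# back through the same layers in reverse order.
delta2 = (y - t) * y * (1.0 - y)
delta1 = (delta2 @ W2.T) * h * (1.0 - h)

# Phase 3: only now may the weights change; each layer's update
# depends on signals that travelled through distant layers.
W2 -= lr * h.T @ delta2
W1 -= lr * x.T @ delta1

Each weight update thus requires information that is not available at the weight's own location and time, which is exactly the global control structure the paragraph above criticises.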



Juergen Schmidhuber 2003-02-28