... agree1
The final section provides some remarks for those who don't agree.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
... etc.2
In principle, it is possible to run a variant of universal search on a neural net architecture instead of a conventional digital machine. In earlier work it was shown (in a different context) how neural nets may ``talk about their own weights in terms of activations'' and modify their own weight matrix (Schmidhuber, 1993a, 1993b). Such self-modification capabilities can serve as the basis of a universal set of primitives.
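For concreteness, here is a minimal Python sketch of such a variant of universal (Levin) search. It assumes only an abstract interface: a finite set of primitives (which in the neural net setting would include the self-modifying instructions acting on the weight matrix), a routine run_program that executes a primitive sequence for a bounded number of steps, and a predicate solved that tests the result. The names, the interface, and the toy demonstration are illustrative assumptions, not the procedure of the paper; only the time allocation (roughly proportional to $2^{-l(p)}$ for a program $p$ of length $l(p)$, with doubling phase budgets) follows the standard universal search schedule.

\begin{verbatim}
import itertools

def levin_search(primitives, run_program, solved, max_phase=20):
    # Variant of universal (Levin) search over programs built from a
    # primitive set.  In phase i, every program p with l(p) <= i is run
    # for at most 2**(i - l(p)) primitive steps, so each program's share
    # of the phase is roughly proportional to 2**(-l(p)); the phase
    # budget doubles with i.
    for i in range(1, max_phase + 1):
        for length in range(1, i + 1):
            budget = 2 ** (i - length)
            for p in itertools.product(primitives, repeat=length):
                state = run_program(p, max_steps=budget)
                if solved(state):
                    return p, state   # first solver found under the schedule
    return None

# Toy demonstration (purely illustrative): find a primitive sequence
# turning the state 0 into 5, using "+1" and "*2" as the primitive set.
prims = [lambda x: x + 1, lambda x: x * 2]

def run(p, max_steps):
    x = 0
    for op in p[:max_steps]:   # the time budget cuts off over-long runs
        x = op(x)
    return x

print(levin_search(prims, run, solved=lambda x: x == 5))
\end{verbatim}

In the neural net setting, run_program would let the network execute the primitive sequence, including instructions that overwrite entries of its own weight matrix, before the resulting net is evaluated on the task.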
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
...address3
To allow for real-valued weights, set $w_{WeightPointer}$ equal to the contents of $address$, divided by 1000, say.
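A minimal sketch of this weight-setting step, under assumed data structures (a flat weight vector and an integer storage array; the function and variable names are illustrative, not the paper's instruction set):

\begin{verbatim}
import numpy as np

def set_weight(weights, storage, weight_pointer, address):
    # Read the integer stored at `address` and scale it by 1/1000 to
    # obtain a real-valued weight, written to position `weight_pointer`
    # of the weight vector.
    weights[weight_pointer] = storage[address] / 1000.0

# Example: a storage cell holding 2718 yields the weight 2.718.
w = np.zeros(10)
mem = np.array([0, 2718, -500])
set_weight(w, mem, weight_pointer=3, address=1)   # w[3] is now 2.718
\end{verbatim}

The division by 1000 is just one possible scaling; any fixed factor mapping the integer contents onto a useful range of real-valued weights would serve the same purpose.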
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.

