
Are we Run by a Short Algorithm?

Since our universe's history so far is regular, it could, by itself, have been computed by a relatively short algorithm. Essentially, this algorithm embodies the physical laws plus the information about the historical noise. But there are many algorithms whose output sequences start with our universe's history so far, and most of them are very long. How likely is it, then, that our universe is indeed run by a short algorithm? To attempt an answer, we need a prior probability on the possible algorithms. The obvious candidate is the ``universal prior''.

Universal prior. Define $P_U(s)$, the a priori probability of a finite symbol string s (such as the one representing our universe's history so far), as the probability of guessing a halting program that computes s on a universal Turing machine U. Here, the way of guessing is defined by the following procedure: initially, the input tape consists of a single square. Whenever the scanning head of the program tape shifts to the right, do: (1) Append a new square. (2) With probability $\frac{1}{3}$ fill it with a ``0''; with probability $\frac{1}{3}$ fill it with a ``1''; with probability $\frac{1}{3}$ fill it with a ``,''. Programs are ``self-delimiting'' [Levin 1974, Chaitin 1987] -- once U halts due to computations based on the randomly chosen symbols (the program) on its input tape, there won't be any additional program symbols. We obtain

\begin{displaymath}
P_U(s) = \sum_{p:~U~computes~s~from~p~and~halts} \left(\frac{1}{3}\right)^{\mid p \mid}.
\end{displaymath}
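To make the guessing procedure concrete, here is a minimal Monte Carlo sketch in Python. It is an illustration under simplifying assumptions, not an implementation of the definition: instead of a universal Turing machine it uses a toy prefix machine (the names run_toy_machine, random_tape and estimate_prior are ours) whose halting programs are exactly the bit strings terminated by a ``,'' and whose output is the bits preceding that ``,''. Since no such program is a prefix of another, the self-delimiting property holds.

\begin{verbatim}
import random

SYMBOLS = ['0', '1', ',']   # each guessed with probability 1/3

def random_tape(rng):
    """Append a fresh square filled with '0', '1' or ','."""
    while True:
        yield rng.choice(SYMBOLS)

def run_toy_machine(tape):
    """Toy prefix machine: read symbols until the first ',';
    output the bits read before it.  Halting programs are exactly
    the strings w',' with w in {0,1}*, so none is a prefix of
    another (self-delimiting)."""
    out = []
    while True:
        sym = next(tape)
        if sym == ',':           # end of program; machine halts
            return ''.join(out)
        out.append(sym)

def estimate_prior(s, trials=200000, seed=0):
    """Monte Carlo estimate of P(s) under the toy machine."""
    rng = random.Random(seed)
    hits = sum(run_toy_machine(random_tape(rng)) == s
               for _ in range(trials))
    return hits / trials

s = '101'
print('estimate:', estimate_prior(s))
# The only halting program computing s is s + ',', so the exact
# prior under this toy machine is (1/3)^(len(s)+1) = 1/81.
print('exact:   ', (1/3) ** (len(s) + 1))
\end{verbatim}

For a string $s$ of $n$ bits the only halting program is $s$ followed by ``,'', so the toy prior is exactly $(\frac{1}{3})^{n+1}$; the Monte Carlo estimate converges to this value as the number of trials grows.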

Clearly, the probabilities of all halting programs sum to at most 1: programs are self-delimiting, so no halting program can be the prefix of another one, and the Kraft inequality applies. The sum may fall short of 1 because certain programs lead to non-halting computations.
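As an illustration (for the toy prefix machine sketched above, not for a genuine universal machine, where non-halting programs make the sum strictly smaller than 1): the halting programs are exactly the $2^n$ bit strings of each length $n \geq 0$ followed by a ``,'', so

\begin{displaymath}
\sum_{p:~p~halts} \left(\frac{1}{3}\right)^{\mid p \mid}
= \sum_{n=0}^{\infty} 2^n \left(\frac{1}{3}\right)^{n+1}
= \frac{1}{3} \cdot \frac{1}{1-\frac{2}{3}} = 1.
\end{displaymath}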

Under different universal priors (based on different universal machines), the probabilities of a given string differ by no more than a constant factor independent of the string's size, due to the compiler theorem (the constant factor corresponds to the probability of guessing a compiler). This justifies the name ``universal prior,'' also known as the Solomonoff-Levin distribution.
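To spell out the compiler argument in the notation above (a sketch; here $q$ is assumed to be a self-delimiting program that makes $U$ simulate $V$ on the remainder of the tape): every halting $V$-program $p$ computing $s$ becomes a halting $U$-program $qp$ computing $s$, hence

\begin{displaymath}
P_U(s) \geq \sum_{p:~V~computes~s~from~p~and~halts} \left(\frac{1}{3}\right)^{\mid q \mid + \mid p \mid}
= \left(\frac{1}{3}\right)^{\mid q \mid} P_V(s),
\end{displaymath}

and symmetrically with $U$ and $V$ exchanged. The constant factor $(\frac{1}{3})^{\mid q \mid}$ is precisely the probability of guessing the compiler $q$.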

Dominance of shortest programs. It can be shown (the proof is non-trivial) that the probability of guessing any of the programs computing some string and the probability of guessing one of its shortest programs are essentially equal (they differ by no more than a constant factor depending on the particular Turing machine). The probability of a string is dominated by the probabilities of its shortest programs. This is known as the ``coding theorem'' [Levin 1974]. Similar coding theorems exist for the case of non-halting programs which cease requesting additional input symbols at a certain point.
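In the notation above, writing $K_U(s)$ for the length (in tape symbols) of a shortest halting program computing $s$ on $U$, the coding theorem can be stated as

\begin{displaymath}
\left(\frac{1}{3}\right)^{K_U(s)} \leq P_U(s) \leq c \left(\frac{1}{3}\right)^{K_U(s)}
\end{displaymath}

for some constant $c$ depending only on $U$. The left inequality is immediate, since a shortest program is one of the summands defining $P_U(s)$; the right inequality is the non-trivial direction.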

Now back to our question: are we run by a relatively compact algorithm? So far our universe could have been run by one -- its history could have been much noisier and thus much less compressible. Hence the universal prior and the coding theorem suggest that the algorithm is indeed short. If it is, then there will be less than maximal randomness in our future, and more than vanishing predictability. We may hope that our universe will remain regular, rather than drifting off into irregularity.



