As long as there is no compelling contrary evidence, however, a
reasonable guess would be that our universe is indeed among the fastest
ones with *O*(1) output bits per constant time interval consumed by
algorithm **FAST**. It may even be "locally" computable by
simple simulated processors, each interacting with only a few neighbouring
processors, assuming that the pseudorandom aspects of our universe
do not require more global communication between spatio-temporally
separated parts than the well-known physical laws do. Note that the fastest
universe evolutions include those representable as sequences of substrings
of constant length *l*, where each substring stands for the universe's
discretized state at a certain discrete time step and is computable from
the previous substring in *O*(*l*) time (compare Example 1.1).
However, the fastest universes also include those whose representations of
successive discrete time steps grow over time, with more and more
time spent on computing them. The expansion of certain computable
universes actually requires this.
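The first kind of fast, locally computable evolution above can be sketched as a one-dimensional cellular automaton: each discretized state is a substring of constant length *l*, and the successor state is computed in *O*(*l*) time, with every cell reading only a few neighbouring cells. The choice of rule 110 and the toroidal boundary are illustrative assumptions, not details taken from the text.

```python
def step(state: str) -> str:
    """Compute the next state in O(l) time; cell i reads only cells i-1, i, i+1."""
    l = len(state)
    # Rule 110 lookup table: neighbourhood pattern -> next cell value.
    rule110 = {"111": "0", "110": "1", "101": "1", "100": "0",
               "011": "1", "010": "1", "001": "1", "000": "0"}
    # Toroidal (wrap-around) boundary, an assumption made for simplicity.
    return "".join(rule110[state[(i - 1) % l] + state[i] + state[(i + 1) % l]]
                   for i in range(l))

def evolve(initial: str, steps: int) -> list:
    """Return the sequence of constant-length substrings, one per time step."""
    history = [initial]
    for _ in range(steps):
        history.append(step(history[-1]))
    return history

if __name__ == "__main__":
    for s in evolve("0000000100000000", 8):
        print(s)
```

Each call to `step` touches every cell exactly once, so the total cost per discrete time step is linear in *l*, matching the *O*(*l*)-per-step regime described above.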

In any case, the probability that ours will last 2^n times longer
than it has lasted so far is at most 2^-n (except, of course, when
its early states are for some reason much harder to compute than later
ones *and* we are still in an early state). This prediction also
differs from those of current mainstream physics (compare [Gott 93]
though), but obviously is not verifiable.
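To make the bound concrete, here is a small numeric illustration (a sketch only; the function name is our own, and the bound is simply the 2^-n figure stated above):

```python
def survival_bound(n: int) -> float:
    """Upper bound 2^-n on the probability of lasting 2^n times longer."""
    return 2.0 ** -n

if __name__ == "__main__":
    for n in (1, 10, 20):
        # E.g. n = 10: lasting about a thousand times longer than so far
        # has probability at most about one in a thousand.
        print(f"lasting 2^{n} times longer: probability <= {survival_bound(n):.2e}")
```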
