For *x* ∈ *B*^{♯} with *P*^{E}(*x*) > 0:

-*lg* *P*^{E}(*x*) ≤ *KP*^{E}(*x*) ≤ -*lg* *P*^{E}(*x*) + *O*(1)   (39)

That is, objects that are hard to describe (in the sense that they have only long enumerating descriptions) have low probability.
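The left-hand inequality of (39), which the proof below calls definitional, can be spelled out. Assuming (as an illustration of the standard argument, not a quotation of the paper) that *P*^{E}(*x*) sums 2^{-*l*(*p*)} over all self-delimiting programs *p* whose output on the universal EOM converges to *x*, the shortest such program alone contributes 2^{-*KP*^{E}(*x*)}:

```latex
P^{E}(x) \;=\; \sum_{p\,:\,U^{E}(p)\,\rightsquigarrow\, x} 2^{-l(p)}
\;\;\ge\;\; 2^{-KP^{E}(x)},
\qquad\text{hence}\qquad
-\lg P^{E}(x) \;\le\; KP^{E}(x).
```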

**Proof.**
The left-hand inequality follows by definition.
To show the right-hand side, one can build an EOM *T* that computes *x* using not more than
*KP*^{E}(*x*) + *K*_{T}(*KP*^{E}(*x*)) + *O*(1) input
bits, in a way inspired by Huffman coding [Huffman, 1952].
The claim then follows from the Invariance Theorem.
The trick is to arrange *T*'s computation such that *T*'s output converges
yet never needs to decrease lexicographically.
*T* works as follows:

(A) Emulate *U*^{E} to construct a real enumerable number 0.*s* encoded as a self-delimiting input program *r*; simultaneously run all (possibly forever running) programs on *U*^{E} dovetail style; whenever the output of a prefix *q* of any running program starts with some *x* ∈ *B** for the first time, set variable *V*(*x*) := *V*(*x*) + 2^{-*l*(*q*)} (if no program has ever created output starting with *x*, then first create *V*(*x*) initialized by 0); whenever the output of some extension *q*' of *q* (obtained by possibly reading additional input bits; *q*' = *q* if none are read) lexicographically increases such that it does not equal *x* any more, set *V*(*x*) := *V*(*x*) - 2^{-*l*(*q*')}.

(B) Simultaneously, starting at the right end of the unit interval [0,1), as the *V*(*x*) are being updated, keep updating a chain of disjoint, consecutive, adjacent, half-open (at the right end) intervals *IV*(*x*) = [*LV*(*x*), *RV*(*x*)) of size *V*(*x*) = *RV*(*x*) - *LV*(*x*), in alphabetic order on *x*, such that the right end of the *IV*(*x*) of the largest *x* coincides with the right end of [0,1), and *IV*(*y*) is to the right of *IV*(*x*) if *y* ≻ *x*. After every variable update and each change of *s*, replace the output of *T* by the *x* of the *IV*(*x*) with 0.*s* ∈ *IV*(*x*).

This will never violate the EOM constraints: the enumerable 0.*s* cannot decrease, and whenever probability mass moves from some *V*(*x*) to a *V*(*x*'') with *x*'' ≻ *x*, the right boundaries *RV*(*y*) can only decrease; so the label *x* of the interval containing 0.*s* — that is, *T*'s output — never decreases lexicographically.
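The interval bookkeeping of step (B) can be sketched in Python. This is an illustrative reconstruction, not part of the formal proof: `interval_chain` and `output_for` are hypothetical helper names, exact rational arithmetic stands in for the enumeration process, and the sketch shows a single static snapshot of the *V*(*x*), whereas in the proof the chain is rebuilt after every variable update.

```python
from fractions import Fraction

def interval_chain(V):
    """Given a dict V mapping strings x to their current sizes V(x), build the
    chain of half-open intervals [LV(x), RV(x)) in alphabetic order on x,
    anchored so that the rightmost interval ends at 1."""
    right = Fraction(1)
    chain = {}
    for x in sorted(V, reverse=True):   # largest x gets the rightmost interval
        left = right - V[x]
        chain[x] = (left, right)
        right = left
    return chain

def output_for(s, chain):
    """Return the x whose interval contains the point 0.s (s a binary string),
    i.e. T's current output; None if 0.s lies left of the whole chain."""
    point = Fraction(int(s, 2), 2 ** len(s)) if s else Fraction(0)
    for x, (lo, hi) in chain.items():
        if lo <= point < hi:
            return x
    return None
```

For example, with *V*('0') = 1/4 and *V*('1') = 1/2 the chain is '1' ↦ [1/2, 1) and '0' ↦ [1/4, 1/2), so the point 0.011 = 3/8 yields output '0'.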

For *x* ∈ *B**, the *IV*(*x*) converge towards an interval *I*(*x*) of size *P*^{E}(*x*). For *x* ∈ *B*^{∞} with *P*^{E}(*x*) > 0, we have: for any ε > 0 there is a time *t*_{0} such that for all time steps *t* > *t*_{0} in *T*'s computation, an interval of size *P*^{E}(*x*) - ε will be completely covered by certain *IV*(*y*) whose finite strings *y* agree with *x* on prefixes that grow as the computation proceeds. So for *x* ∈ *B*^{∞} these *IV*(*y*) also converge towards an interval *I*(*x*) of size *P*^{E}(*x*). Hence *T* will output larger and larger *y* approximating *x* from below, provided 0.*s* ∈ *I*(*x*).

Since any interval of size *c* within [0,1) contains a number 0.*z*
with *l*(*z*) = ⌈-*lg* *c*⌉, in both cases there is a number 0.*s* (encodable
by some self-delimiting program *r*)
with *l*(*s*) = -*lg* *P*^{E}(*x*) + *O*(1), such that 0.*s* ∈ *I*(*x*),
and therefore *KP*_{T}(*x*) ≤ *l*(*r*); the claim follows by the Invariance Theorem.
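The dyadic-density fact used above — any interval of size *c* within [0,1) contains a number 0.*z* with about -*lg* *c* bits — can be checked with a short sketch (the helper name `dyadic_inside` is hypothetical, not from the paper; the floating-point -*lg* *c* is exact for dyadic *c*):

```python
from fractions import Fraction
import math

def dyadic_inside(lo, c):
    """Given an interval [lo, lo+c) inside [0, 1), return a binary string z
    with 0.z in the interval and l(z) = ceil(-lg c): the dyadic grid of
    spacing 2**-l(z) is at least as fine as the interval, so the first grid
    point at or above lo already falls inside it."""
    n = max(1, math.ceil(-math.log2(c)))   # l(z)
    k = math.ceil(lo * Fraction(2) ** n)   # first grid point >= lo
    assert Fraction(k, 2 ** n) < lo + c    # grid spacing <= c, so it is inside
    return format(k, f'0{n}b')
```

For instance, the interval [1/3, 1/3 + 1/8) contains 0.011 = 3/8, a point with ⌈-*lg* (1/8)⌉ = 3 bits.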

Less symmetric statements can also be derived in very similar fashion:

**Proof.** Modify the proof of Theorem 5.3 for approximable
as opposed to enumerable interval boundaries and approximable 0.*s*.

A similar proof, but without the complication for the case *x* ∈ *B*^{∞}, yields:

(41) |

(42) |

**Proof.**
Initialize the required variables. Dovetailing over all *x* ∈ *B**,
approximate the GTM-computable probabilities in
variables *V*_{x} initialized by zero, and create a chain of adjacent
intervals *IV*_{x} analogously to the proof of Theorem 5.3.

The *IV*_{x} converge against intervals *I*_{x} whose sizes equal these probabilities.
Hence *x*
is GTM-encodable by any program *r* producing an output *s* with 0.*s* ∈ *I*_{x}:
after every update, replace the GTM's output by the *x* of
the *IV*_{x} with 0.*s* ∈ *IV*_{x}.
Similarly, if 0.*s*
is in the union of adjacent intervals *I*_{y} of strings *y* starting with *x*,
then the GTM's output
will converge towards some string starting with *x*.
The rest follows in a way similar
to the one described in the
final paragraph of the proof of Theorem 5.3.

Using the basic ideas in the proofs of Theorems 5.3 and 5.5 in conjunction with Corollary 4.3 and Lemma 4.2, one can also obtain statements such as:

While

**Proof.**
For the cases
and *P*^{E}, apply Theorems 5.2, 5.6 and
the unboundedness of (12).
For the case *P*^{G}, apply Theorems 3.3 and 5.3.
