Universal CEM vs EOM with Random Input

Corollary 4.3 and Lemma 4.2 below imply that $\mu^E$ and $\mu_0$ are essentially the same thing: randomly selecting the inputs of a universal EOM yields output prefixes whose probabilities are determined by the universal CEM.
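More explicitly, spelling out the $O(\cdot)$ notation in the two statements below: there are positive constants $c_1, c_2$ (independent of x) such that

\begin{displaymath}c_1 \mu_0(x) \leq \mu^E(x) \leq c_2 \mu_0(x) \mbox{ for all } x \in B^*.
\end{displaymath}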

Corollary 4.3   Let $\mu_0$ denote the universal CEM of Theorem 4.1. For $x \in B^*$,

\begin{displaymath}\mu^E(x) = O(\mu_0(x)).
\end{displaymath}

Lemma 4.2   For $x \in B^*$,

\begin{displaymath}\mu_0(x)= O(\mu^E(x)).
\end{displaymath}

Proof. In the enumeration of EOMs in the proof of Theorem 4.1, let EOM0 denote an EOM representing $\mu_0$. We construct an EOM T such that $\mu P_T(x) = \mu_0(x)$; the claim then follows from the Invariance Theorem (compare Def. 3.3).

T applies EOM0 to all $x \in B^*$ in dovetail fashion, while simultaneously reading randomly selected input bits forever. At any given time, let the string variable z denote T's input read so far. Starting at the right end of the unit interval [0,1), as the values $V\bar{\mu}_0(x)$ are updated by the algorithm of Theorem 4.1, T keeps updating a chain of finitely many variable, disjoint, adjacent, half-open intervals VI(x) of size $V\bar{\mu}_0(x)$, arranged in alphabetic order on x such that VI(y) lies to the right of VI(x) whenever $y \succ x$. After every variable update and after each growth of z, T replaces its output by the x satisfying $0.z \in VI(x)$. Since neither 0.z nor the $VC\mu_0(x)$ in the algorithm of Theorem 4.1 can decrease (appending bits to z can only increase 0.z, and all interval boundaries can only shift left), T's output cannot decrease either, and is therefore indeed EOM-computable. Obviously the following holds:

\begin{displaymath}C\mu P_T(x) = CP_T(x) = C\mu_0(x)
\end{displaymath}

and

\begin{displaymath}\mu P_T(x) = \sum_{z \in B^{\sharp}} P_T(xz) = \mu_0(x).
\end{displaymath}

$\Box$
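To make the interval bookkeeping above concrete, here is a small illustrative sketch (not part of the paper) of how T's current output can be read off from the current variable values and the input bits read so far. The function name current_output, the dictionary approx holding the current values $V\bar{\mu}_0(x)$, and the toy numbers at the end are all hypothetical; the sketch assumes exact rational arithmetic and finitely many nonzero values, as in the algorithm of Theorem 4.1.

from fractions import Fraction

def current_output(approx, z_bits):
    # Illustrative only: 'approx' maps finite bit strings x to the current
    # (nondecreasing) values V\bar{\mu}_0(x) of the algorithm of Theorem 4.1,
    # taken here as exact Fractions summing to at most 1; 'z_bits' is the
    # input string z read so far. Returns the x with 0.z in VI(x), i.e. T's
    # current output, or '' if 0.z still lies to the left of every interval.
    point = sum(Fraction(int(b), 2 ** (i + 1)) for i, b in enumerate(z_bits))
    # Lay out the intervals from the right end of [0,1): VI(y) lies to the
    # right of VI(x) whenever y comes after x alphabetically, so the sizes
    # are subtracted in reverse alphabetic order of x.
    right = Fraction(1)
    for x in sorted(approx, reverse=True):
        left = right - approx[x]      # growing V-values only shift boundaries left
        if left <= point < right:     # half-open interval [left, right)
            return x
        right = left
    return ''

# Toy run (the numbers are arbitrary): growing V-values and a lengthening z
# can only move the output forward in alphabetic order, never backward.
approx = {'0': Fraction(1, 8), '1': Fraction(1, 4)}
print(current_output(approx, '01'))    # ''  : 0.01 = 1/4 lies left of both intervals
approx = {'0': Fraction(1, 4), '1': Fraction(1, 2)}
print(current_output(approx, '01'))    # '0' : the boundaries have shifted left
approx = {'0': Fraction(1, 4), '1': Fraction(5, 8)}
print(current_output(approx, '011'))   # '1' : the output advanced, as required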

