
Tighter Bounds?

Is it possible to get rid of the small correction terms such as $K^E(KP^E(x)) \leq O(\log(-\log P^E(x)))$ in Theorem 5.3? Note that the construction in the proof shows that $K^E(x)$ is actually bounded by $K^E(s)$, the complexity of the enumerable number $0.s \in I(x)$ with minimal $K^T(s)$. The facts $\sum_x P^M(x) = 1$, $\sum_x P^E(x) = 1$, $\sum_x P^G(x) < 1$, as well as intuition and wishful thinking inspired by the Shannon-Fano Theorem [Shannon 1948] and Coding Theorem 5.1, suggest there might indeed be tighter bounds:

Conjecture 5.1   For $x \in B^{\sharp}$ with $P^M(x) > 0$: $K^M(x) \leq KP^M(x) + O(1)$.

Conjecture 5.2   For $x \in B^{\sharp}$ with $P^E(x) > 0$: $K^E(x) \leq KP^E(x) + O(1)$.

Conjecture 5.3   For $x \in B^{\sharp}$ with $P^G(x) > 0$: $K^G(x) \leq KP^G(x) + O(1)$.

The work of Gács has already shown, however, that analogous conjectures for semimeasures such as $\mu^M$ (as opposed to distributions) are false [Gács 1983].
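The Shannon-Fano intuition invoked above can be made concrete for ordinary finite distributions: given a probability distribution $P$, assigning each outcome $x$ a codeword of length $\lceil -\log_2 P(x) \rceil$ satisfies Kraft's inequality, so a prefix-free code with those lengths exists, i.e., description length exceeds $-\log_2 P(x)$ by less than one bit. The following is a minimal sketch of that classical construction (the function names and the toy dyadic distribution are illustrative, not from the text; the conjectures themselves concern the much subtler complexity measures $K^M$, $K^E$, $K^G$):

```python
import math

def shannon_fano_lengths(probs):
    """Codeword length l(x) = ceil(-log2 P(x)). Then
    sum_x 2**-l(x) <= sum_x P(x) <= 1, so Kraft's inequality
    guarantees a prefix-free code with these lengths exists."""
    return {x: math.ceil(-math.log2(p)) for x, p in probs.items() if p > 0}

def canonical_prefix_code(lengths):
    """Build a prefix-free code from valid lengths, canonical-Huffman
    style: process outcomes by increasing length, counting upward and
    left-shifting the counter whenever the length grows."""
    code, codeword, prev_len = {}, 0, 0
    for x, l in sorted(lengths.items(), key=lambda kv: kv[1]):
        codeword <<= (l - prev_len)
        code[x] = format(codeword, 'b').zfill(l)
        codeword += 1
        prev_len = l
    return code

probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
code = canonical_prefix_code(shannon_fano_lengths(probs))
# e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}:
# every codeword length equals -log2 P(x) exactly for this dyadic P.
```

For non-dyadic probabilities the same construction costs at most one extra bit per outcome, which is the $O(1)$ slack the conjectures hope survives the passage from computable distributions to the priors $P^M$, $P^E$, $P^G$.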

Juergen Schmidhuber
