A resource-oriented point of view
suggests the following postulate.
Postulate 6.1: The cumulative prior probability measure of all x
incomputable within time t
by the most efficient way of computing everything
should be inversely proportional to t.
Since the most efficient way of computing all x is embodied by
FAST, and since each phase of FAST consumes roughly twice the
time and space resources of the previous phase, the cumulative prior
probability of each finite phase should be roughly half the one of
the previous phase; zero probability should be assigned to infinitely
resource-consuming phases. Postulate 6.1 therefore suggests the
following definition.

Definition 6.3 (Speed Prior S)
Define the speed prior S on B* as

S(x) := \sum_{i \ge 1} 2^{-i} S_i(x), where S_i(\lambda) := 1, and for
x \succ \lambda, S_i(x) := \sum_{p \to_i x} 2^{-l(p)}; here p \to_i x
means that program p computes an output starting with x in PHASE i of FAST.
We observe that S(x) is indeed a semimeasure (compare Def. 4.1):
S(\lambda) \le 1, and S(x) \ge S(x0) + S(x1) for all x. Since each x
is first computed in PHASE Kt(x) within at most 2^{Kt(x)+1} steps,
and hence S_i(x) = 0 for all i < Kt(x), we may rewrite

S(x) = \sum_{i \ge Kt(x)} 2^{-i} S_i(x).
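The halving of cumulative phase probabilities described above can be checked numerically. The sketch below (illustrative only; the function name is hypothetical) sums the geometric phase weights 2^{-i} and confirms that no finite truncation exceeds total probability 1:

```python
def phase_weight_total(n):
    """Partial sum of the geometric phase weights 2^-1 + ... + 2^-n.

    Each finite phase of FAST is assumed to carry half the prior
    weight of the previous one; the total over all phases tends to 1.
    """
    return sum(2.0 ** -i for i in range(1, n + 1))
```

Since the partial sum equals 1 - 2^{-n}, the mixture over phases stays bounded by 1, consistent with S being a semimeasure.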
S can be implemented by the following
probabilistic algorithm for a universal MTM.

Algorithm GUESS:
1. Toss an unbiased coin until heads is up;
let i denote the number of required trials; set t := 2^i.
2. If the number of steps executed so far exceeds t, then exit.
Execute one step; if it is a request for an input bit,
toss the coin to determine the bit, and set t := t/2.
3. Go to 2.
In the spirit of FAST,
algorithm GUESS makes twice the computation time half as likely,
and splits remaining time in half whenever a new bit is requested,
to assign equal runtime to the two resulting sets of possible
continuations.
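The budgeting scheme of GUESS can be illustrated with a toy simulation. This is a sketch, not the actual universal MTM: the "machine" below is a hypothetical stand-in that requests one input bit every few steps, and all names and parameters are illustrative assumptions.

```python
import random

def guess_run(bit_request_period=3, rng=random):
    """One run of a toy GUESS.

    Step 1: toss a fair coin until heads; i = number of trials, so
    index i occurs with probability 2^-i; the step budget is t = 2^i,
    making twice the computation time half as likely.
    Step 2: execute steps; whenever the stand-in machine requests an
    input bit, toss the coin for the bit and halve t.
    """
    i = 1
    while rng.random() < 0.5:      # tails with probability 1/2
        i += 1
    t = 2 ** i                     # step budget t := 2^i
    steps = 0
    input_bits = []
    while steps < t:               # exit once the budget is exhausted
        steps += 1                 # "execute one step"
        if steps % bit_request_period == 0:   # machine asks for a bit
            input_bits.append(rng.randint(0, 1))
            t //= 2                # set t := t/2
    return i, input_bits, steps
```

Averaged over many runs, a budget of 2^i occurs with probability 2^{-i}, and every requested input bit halves the remaining allowance, mirroring the equal split of runtime between the two possible continuations.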
Note that the expected runtime of GUESS is unbounded, since
\sum_i 2^{-i} 2^i does not converge. Expected runtime is countable,
however, and expected space is of the order of expected time,
due to numerous short algorithms producing a constant
number of output bits per constant time interval.
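The divergence is elementary and can be checked directly (a sketch; the function name is illustrative): each index i contributes probability 2^{-i} times a runtime of order 2^i, i.e. a constant, so the partial sums of the expected-runtime series grow without bound:

```python
def expected_runtime_partial_sum(n):
    """Partial sum of sum_i 2^-i * 2^i for i = 1..n.

    Every term equals 1 (probability 2^-i of drawing index i, times a
    step budget of order 2^i), so the partial sum is n and the full
    series diverges: GUESS has unbounded expected runtime.
    """
    return sum((2.0 ** -i) * (2.0 ** i) for i in range(1, n + 1))
```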
Assuming our universe is sampled according to GUESS implemented on some
machine, note that the true distribution is not essentially different
from the estimated one based on our own, possibly different machine.