Relation to Previous Work on All Possible Universes

A previous paper on computable universes [Schmidhuber:97brauer] already pointed out that computing all universes with all possible types of physical laws tends to be much cheaper, in terms of information requirements, than computing just one particular, arbitrarily chosen one, because there is an extremely short algorithm that systematically enumerates and runs all computable universes, while most individual universes have very long shortest descriptions. The subset embodied by the many worlds of Everett III's "many worlds hypothesis" [Everett:57] was considered a by-product of this more general set-up.
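To make the enumeration argument concrete, here is a minimal Python sketch of the underlying dovetailing idea (an illustration of the general technique, not the cited paper's own construction; the one-step interpreter `step` is a hypothetical stand-in for whatever machine model computes the universes):

    from itertools import count

    def all_bitstrings():
        """Yield every finite bitstring in length-lexicographic order."""
        yield ""
        for n in count(1):
            for i in range(2 ** n):
                yield format(i, "0{}b".format(n))

    def dovetail(step, phases):
        """Interleave all programs: in phase n, add the n-th program and
        advance every program started so far by one step, so each program
        eventually receives arbitrarily many steps."""
        programs, states = [], []
        gen = all_bitstrings()
        for n in range(1, phases + 1):
            programs.append(next(gen))
            states.append(None)              # None = not yet started
            for i in range(n):
                states[i] = step(programs[i], states[i])
        return states

The whole loop is a few lines of code, whereas a typical individual universe has a very long shortest description; this is the information-theoretic point of the argument.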

The previous paper apparently also was the first to apply the theory of inductive inference to entire universe histories [#!Schmidhuber:97brauer!#, Section: Are we Run by a Short Algorithm?], using the Solomonoff-Levin distribution to predict that simple universes are more likely; that is, the most probable universe is the simplest one compatible with our existence, where simplicity is defined in terms of traditional Kolmogorov complexity -- compare everything mailing list archive: http://www.escribe.com/science/theory/m1284.html and m1312.html, as well as recent papers by Standish and Soklakov [#!Standish:00!#,#!Soklakov:00!#], and see Calude and Meyerstein [#!Calude:99!#] for a somewhat contrarian view.
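For reference, the Solomonoff-Levin measure invoked here has the standard textbook form (notation as in, e.g., Li and Vitanyi [LiVitanyi:97]; the formula is the classical definition, not a new result of either paper):

    M(x) = \sum_{p \,:\, U(p) = x\ast} 2^{-l(p)},

where U is a universal monotone Turing machine, p ranges over its programs, l(p) is the length of p, and U(p) = x* means that the output of U on p starts with x. Histories x with short programs thus receive exponentially more weight than incompressible histories of the same length, which is what licenses the prediction that simple universes are more likely.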

The current paper introduces simplicity measures more dominant than the traditional ones [Kolmogorov:65, Solomonoff:64, Solomonoff:78, Chaitin:69, Zvonkin:70, Levin:73a, Levin:74, Gacs:74, Chaitin:75, Gacs:83, Schnorr:73, Chaitin:87, Gellmann:95, LiVitanyi:97], and provides a more general, more technical, and more detailed account, incorporating several novel theoretical results based on generalizations of Kolmogorov complexity and algorithmic probability. In particular, it stretches the notions of computability and constructivism to the limits, by considering not only MTM-based traditional computability but also less restrictive GTM-based and EOM-based describability, and proves several relevant "Occam's razor theorems." Unlike the previous paper [Schmidhuber:97brauer], it also analyzes fundamental time constraints on the computation of everything and derives predictions based on these restrictions.
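For comparison, the traditional measure against which these generalizations are compared is the Kolmogorov complexity of a finite bitstring x with respect to a fixed universal MTM U (standard definition, see, e.g., [LiVitanyi:97]):

    K(x) = \min \{\, l(p) : U(p) = x \text{ and } U \text{ halts} \,\}.

Roughly speaking, the less restrictive GTM-based and EOM-based measures studied in the present paper relax the halting requirement, admitting programs whose output merely converges to x.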

Rather than pursuing the computability-oriented path laid out in [Schmidhuber:97brauer], Tegmark recently suggested what at first glance seems to be an alternative ensemble of possible universes based on a (somewhat vaguely defined) set of "self-consistent mathematical structures" [Tegmark:98], thus going beyond his earlier, less general work [Tegmark:96] on physical constants and Everett's many-worlds variants [Everett:57] of our own particular universe; compare also Marchal's and Bostrom's theses [Marchal:98, Bostrom:00]. It is not quite clear whether Tegmark would like to include universes that are not formally describable according to Def. 2.5. It is well known, however, that for any set of mathematical axioms there is a program that lists all provable theorems in order of the lengths of their shortest proofs encoded as bitstrings. Since the TM that computes all bitstrings outputs all these proofs for all possible sets of axioms, Tegmark's view [Tegmark:98] seems in a certain sense encompassed by the algorithmic approach [Schmidhuber:97brauer]. On the other hand, there are many formal axiomatic systems powerful enough to encode all computations of all possible TMs, e.g., number theory. In this sense the algorithmic approach is encompassed by number theory.
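The theorem-listing claim can be made concrete with another short sketch. The proof checker is_valid_proof below is hypothetical (a stand-in for the decidable proof-checking predicate of the chosen axiomatic system), and the bitstring encoding is illustrative only:

    def enumerate_theorems(axioms, is_valid_proof, max_len):
        """Yield (theorem, proof) pairs ordered by the length of the
        encoded proof.  is_valid_proof(axioms, bits) is assumed to return
        the proven theorem for a well-formed proof encoding, else None."""
        for n in range(max_len + 1):
            candidates = [""] if n == 0 else (
                format(i, "0{}b".format(n)) for i in range(2 ** n))
            for bits in candidates:
                theorem = is_valid_proof(axioms, bits)
                if theorem is not None:
                    yield theorem, bits

Running such an enumeration for every finite axiom set inside the dovetailed computation of all bitstrings is what makes Tegmark's ensemble appear, in this sense, as a by-product of the algorithmic one.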

The algorithmic approach, however, offers several conceptual advantages: (1) It provides the appropriate framework for issues of information-theoretic complexity traditionally ignored in pure mathematics, and imposes natural complexity-based orderings on the possible universes and subsets thereof. (2) It taps into a rich source of theoretical insights on computable probability distributions relevant for establishing priors on possible universes. Such priors are needed for making probabilistic predictions concerning our own particular universe. Although Tegmark suggests that "... all mathematical structures are a priori given equal statistical weight" [Tegmark:98, p. 27], there is no way of assigning equal nonvanishing probability to all (infinitely many) mathematical structures. Hence we really need something like the complexity-based weightings discussed in [Schmidhuber:97brauer] and especially the paper at hand. (3) The algorithmic approach is the obvious framework for questions of temporal complexity such as those discussed in this paper, e.g., "what is the most efficient way of simulating all universes?"
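The non-existence of a uniform prior is elementary but worth spelling out (a standard argument, not specific to this paper): giving each of countably infinitely many structures x_1, x_2, ... the same probability c > 0 yields

    \sum_{i} P(x_i) = \sum_{i} c = \infty,

so no such normalized distribution exists, while c = 0 assigns no predictive weight to anything. A complexity-based weighting avoids both problems: with prefix Kolmogorov complexity K, the Kraft inequality gives \sum_i 2^{-K(x_i)} \le 1, so taking P(x_i) proportional to 2^{-K(x_i)} is well defined and automatically favors simple structures.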

