Last update: November 2013
Jürgen Schmidhuber's
Generalized Algorithmic Information
Generalized Algorithmic Probability
Super Omegas
Non-enumerable but limit-computable! No oracles!
Talk slides
The algorithmic information or Kolmogorov complexity of a
bitstring x is the length of
the shortest program that computes x and halts.
This is one of the fundamental concepts of theoretical
computer science.
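In symbols, for a fixed universal Turing machine U (the standard formulation; the choice of U matters only up to an additive constant):
K(x) = \min \{ \, l(p) : U(p) \mbox{ halts with output } x \, \},
where l(p) denotes the length of program p in bits.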
Follow this link to the
Kolmogorov mailing list home page at IDSIA.
What if we allow for nonhalting computations on nonstandard
Turing machines?
It turns out that some things then become much more
compactly describable. This leads to generalizations
of the concept of Kolmogorov complexity, and
has consequences for
Solomonoff's
theory of algorithmic probability and
universal prediction. It also leads to "Super Omegas"
that are computable in the limit -
generalizations of Chaitin's halting probability
Omega of a Turing machine with random input.
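For reference, Chaitin's Omega of a universal prefix machine U is
\Omega_U = \sum_{p \, : \, U(p) \mbox{ halts}} 2^{-l(p)},
which is enumerable (approximable from below by dovetailing over all programs and run times) but not computable. A real number is computable in the limit if some total recursive function f satisfies \lim_{t \to \infty} f(t) = that number, with no requirement of monotone convergence or of a computable convergence rate; the Super Omegas of the paper below are of this more general kind.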
J. Schmidhuber.
Hierarchies of generalized Kolmogorov complexities and
nonenumerable universal measures computable in the limit.
International Journal of Foundations of Computer Science
13(4):587-612 (2002).
PS.
PDF.
Flawed HTML (knowledge of LaTeX helps).
Based on sections 2-5 of
TR IDSIA-20-00, Version 2.0: Algorithmic Theories of Everything,
Dec 20, 2000
(515K, 50 pages, 10 theorems, 100 refs).
PDF. HTML.
Also in the physics
arXiv: quant-ph/0011122.
Overview.
Letter
on what's random in physics
(Nature 439, 392, 26 Jan 2006)
Related machine learning work led to procedures
for finding low-complexity neural networks
that match given training data (a toy illustration follows the references below):
J. Schmidhuber.
Discovering neural nets with low Kolmogorov complexity
and high generalization capability.
Neural Networks, 10(5):857-873, 1997.
PS.GZ.
PDF.
HTML.
J. Schmidhuber.
Discovering solutions with low Kolmogorov complexity
and high generalization capability.
In Proc. ICML'95, pages 488-496. Morgan
Kaufmann, San Francisco, CA, 1995.
PS.GZ.
PDF.
HTML.
More recent work of 2013:
Compressed Network Search Finds Complex Neural Controllers with a Million Weights,
which learns to drive without a teacher from raw high-dimensional video input.
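To illustrate the basic idea behind the first two references above, here is a toy sketch only: it enumerates weight-generating bitstring "programs" in order of increasing length and keeps the shortest one whose decoded network reproduces the training data. The 2-bit weight encoding, the fixed 2-2-1 threshold architecture, and the XOR task are invented for this illustration and are not taken from the papers.

import itertools

# XOR training set: (inputs, target).
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def decode(bits):
    # Pad to 18 bits and read 9 weights of 2 bits each (two's complement),
    # so a shorter program leaves the remaining weights at zero.
    bits = list(bits) + [0] * (18 - len(bits))
    weights = []
    for i in range(0, 18, 2):
        v = 2 * bits[i] + bits[i + 1]
        weights.append(v - 4 if v >= 2 else v)
    return weights

def net(w, x):
    # Fixed 2-2-1 network of binary threshold units.
    h1 = int(w[0] * x[0] + w[1] * x[1] + w[2] > 0)
    h2 = int(w[3] * x[0] + w[4] * x[1] + w[5] > 0)
    return int(w[6] * h1 + w[7] * h2 + w[8] > 0)

def shortest_fit(max_len=18):
    # Enumerate programs in order of increasing length; return the first
    # (hence shortest) one whose decoded network reproduces DATA exactly.
    for length in range(max_len + 1):
        for bits in itertools.product((0, 1), repeat=length):
            w = decode(bits)
            if all(net(w, x) == y for x, y in DATA):
                return length, w
    return None

print(shortest_fit())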
Abstract of IJFCS paper:
The traditional theory of Kolmogorov complexity and algorithmic
probability focuses on monotone Turing machines with one-way
write-only output tape. This naturally leads to the universal enumerable
Solomonoff-Levin measure. Here we introduce more general, nonenumerable
but cumulatively enumerable measures (CEMs) derived from Turing machines
with lexicographically nondecreasing output and random input, and
even more general approximable measures and distributions computable
in the limit. We obtain a natural hierarchy of generalizations of
algorithmic probability and Kolmogorov complexity, suggesting that the
``true'' information content of some (possibly infinite) bitstring x
is the size of the shortest nonhalting program that converges to x and
nothing but x on a Turing machine that can edit its previous outputs.
Among other things we show that there are `Super Omegas' computable in the
limit yet more random than Chaitin's ``number of wisdom'' Omega, that
any approximable measure of x is small for any x lacking a short
description, that there is no universal approximable distribution, that
there is a universal CEM, and that any nonenumerable CEM of x is small
for any x lacking a short enumerating program. We briefly mention
consequences for
universes sampled from such priors.
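Rendering the abstract's central definition in symbols (notation approximate; see the paper for the precise machine models):
K^G(x) = \min \{ \, l(p) : \mbox{the GTM's output on program } p \mbox{ converges to } x \, \}.
Since GTMs may edit previous output bits while enumerable output machines (EOMs) may only increase their output lexicographically, and halting machines are a special case of both, one obtains, up to additive constants, K^G(x) \leq K^E(x) \leq K(x).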
More recent paper on this topic (2006):
A. Chernov, J. Schmidhuber.
Prefix-like Complexities and Computability in the Limit.
Proc. of Second Conference on Computability in Europe, CiE 2006, LNCS 3988, pp. 85-93.
Based on TR IDSIA-11-05: PDF.
Abstract:
Generalized Turing machines (GTMs) are a variant of non-halting Turing machines, similar in
computational power to machines with an oracle for the halting problem. GTMs allow
a definition of a kind of descriptive (Kolmogorov) complexity that is uniform for finite and
infinite sequences. There are several natural modifications of the definition (as there are
several monotone complexities). This paper studies these definitions and compares complexities
defined with the help of GTMs and complexities defined with the help of oracle machines.
Summer 2004:
Jan Poland
(IDSIA) proves a perfect coding
theorem for enumerable output machines by
redefining EOM complexity and adapting Schmidhuber's proof (IJFCS 2002).
Abstract:
Recently, Schmidhuber proposed a new concept of generalized algorithmic
complexity. It allows for the description of both finite and infinite sequences.
The resulting distributions are true probabilities rather than semimeasures.
We clarify some points for this setting, concentrating on Enumerable Output
Machines. As our main result, we prove a strong coding theorem (without
logarithmic correction terms), which was left as an open problem. To this
purpose, we introduce a more natural definition of generalized complexity.
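For comparison, the classical coding theorem relates prefix complexity K to the universal enumerable semimeasure m via
K(x) = -\log_2 m(x) + O(1);
the result announced here establishes an analogue for (suitably redefined) EOM complexity and the universal cumulatively enumerable measure, with an additive constant in place of logarithmic correction terms.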
J. Poland. A coding theorem for enumerable output machines.
Information Processing Letters 91(4):157-161, 2004.
Communicated by P. M. B. Vitanyi;
available online as preprint IDSIA-05-04:
PDF
There is even an art form based on short programs:
Low-Complexity Art.