OPTIMAL UNIVERSAL SEARCH

Unbeknownst to many computer scientists (including some who wrote books on search algorithms), there exists a very simple search method with an astonishing theoretical property: for a broad class of non-incremental search problems, universal search or Levin Search (LS) has the optimal order of computational complexity (L. Levin, Problems of Information Transmission, 9(3):265-266, 1973). For example, suppose there is an unknown algorithm that solves some inversion problem in O(f(n)) steps, where f is a total recursive function and n a positive integer representing problem size. Then LS will solve the same problem in at most O(f(n)) steps. Unfortunately, the constant factor hidden in the O() notation may be huge.
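To make the idea concrete, here is a minimal sketch of the procedure, intended as an illustration only: the run and verify interfaces, the binary program alphabet, and the toy inversion problem in the usage lines are assumptions made here, not part of the cited paper. In phase i, every program p of length l(p) <= i is granted 2^(i - l(p)) interpreter steps; if some program of length l solves the problem within t steps, LS finds it after roughly 2^l * t steps in total, i.e. O(t) with a constant factor 2^l that is independent of problem size but may be astronomical.

    import itertools

    def levin_search(verify, run, alphabet="01", max_phase=24):
        # Minimal Levin-search sketch (toy interfaces, illustration only).
        # Phase i gives each program p with len(p) <= i a budget of
        # 2**(i - len(p)) interpreter steps, then checks its output.
        for phase in range(1, max_phase + 1):
            for length in range(1, phase + 1):
                budget = 2 ** (phase - length)        # time slice for this program length
                for symbols in itertools.product(alphabet, repeat=length):
                    program = "".join(symbols)
                    output, _ = run(program, budget)
                    if output is not None and verify(output):
                        return program, output        # first verified solution wins
        return None, None

    # Toy usage: invert x -> x*x, i.e. find a bit string whose value squares to 144.
    def run(program, budget):
        steps = len(program)                          # pretend each bit costs one step
        if steps > budget:
            return None, steps                        # exceeded this phase's time slice
        return int(program, 2), steps

    print(levin_search(lambda x: x * x == 144, run))  # -> ('1100', 12)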

Recently, Marcus Hutter (funded through Schmidhuber's SNF research grant "Unification of Universal Induction and Sequential Decision Theory") derived an even more general optimal search algorithm that reduces the constant factor to less than 5 (at the expense of introducing an unknown, problem-class-specific additive constant):

1. M. Hutter. The Fastest and Shortest Algorithm for All Well-Defined Problems. International Journal of Foundations of Computer Science, 13(3):431-443, 2002.
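Hedging on the exact form of the constants, the paper's main bound can be stated roughly as follows: if p is any algorithm that solves problem instance x in time t_p(x), and the time bound t_p itself is provable and computable in time time_{t_p}(x), then Hutter's algorithm M needs at most

    time_M(x) \le 5 \cdot t_p(x) + d_p \cdot time_{t_p}(x) + c_p

steps, where c_p and d_p are constants that depend on p (and thus on the problem class) but not on x.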
2. Levin's and Hutter's asymptotically optimal methods are non-incremental and ignore the huge unknown constants. To overcome this drawback, Schmidhuber extended the principles of non-incremental universal search to build a novel, optimally fast, incremental learner that is able to improve itself through experience. The Optimal Ordered Problem Solver (OOPS, J. Schmidhuber, July 2002, MLJ 2004) searches for a universal algorithm that solves each task in a sequence of tasks. It continually organizes and exploits previously found solutions to earlier tasks, efficiently searching not only the space of domain-specific algorithms, but also the space of search algorithms. It can be used for efficient proof search by the even more general Gödel machine.
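The actual OOPS performs a bias-optimal backtracking search over self-delimiting programs and is considerably more involved than what fits here; the toy sketch below only illustrates the incremental core idea under assumptions made for this illustration (the run and verify interfaces and the prefix-reuse scheme are simplifications, not Schmidhuber's algorithm): programs that solved earlier tasks are frozen and reused as starting points for later candidates, while fresh programs are still enumerated in parallel.

    import itertools

    def incremental_search(tasks, run, verify, alphabet="01", max_phase=20):
        # Toy incremental variant of Levin search (illustration only, not OOPS).
        # Frozen solutions to earlier tasks serve as prefixes of later candidates,
        # so experience can shortcut later searches; the empty prefix keeps plain
        # Levin search running in parallel for tasks where old code does not help.
        frozen, solutions = [""], []
        for task in tasks:
            solution = solve_one(task, frozen, run, verify, alphabet, max_phase)
            if solution:
                frozen.append(solution)               # freeze it, never overwrite it
            solutions.append(solution)
        return solutions

    def solve_one(task, frozen, run, verify, alphabet, max_phase):
        for phase in range(1, max_phase + 1):
            for prefix, length in itertools.product(frozen, range(phase + 1)):
                # Share each phase's time budget among all frozen prefixes.
                budget = max(1, 2 ** (phase - length) // len(frozen))
                for suffix in itertools.product(alphabet, repeat=length):
                    program = prefix + "".join(suffix)
                    output, _ = run(program, task, budget)
                    if output is not None and verify(task, output):
                        return program
        return None

Roughly speaking, the real OOPS instead allocates search time in a bias-optimal fashion and lets new programs call or copy any previously frozen program, rather than merely prepending them as done here.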
First applications of LS and related methods were described in:

3. J. Schmidhuber. Discovering neural nets with low Kolmogorov complexity and high generalization capability. Neural Networks, 10(5):857-873, 1997.

4. J. Schmidhuber. Discovering solutions with low Kolmogorov complexity and high generalization capability. In A. Prieditis and S. Russell, editors, Machine Learning: Proceedings of the Twelfth International Conference, pages 488-496. Morgan Kaufmann Publishers, San Francisco, CA, 1995.
To automatically generate computer programs solving complex problems in incremental settings, Schmidhuber also introduced Adaptive Levin Search (ALS), sketched informally after the two references below:

5. J. Schmidhuber, J. Zhao, and M. Wiering. Shifting inductive bias with success-story algorithm, adaptive Levin search, and incremental self-improvement. Machine Learning, 28:105-130, 1997.

6. M. Wiering and J. Schmidhuber. Solving POMDPs using Levin search and EIRA. In L. Saitta, editor, Machine Learning: Proceedings of the 13th International Conference, pages 534-542. Morgan Kaufmann Publishers, San Francisco, CA, 1996.
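As announced above, here is a rough sketch of the adaptive idea, again an illustration only rather than the exact ALS of references 5 and 6; the run and verify interfaces, the binary instruction alphabet, and the particular probability update are assumptions made for this sketch. Levin search's uniform time allocation 2^(phase - length) generalizes to 2^phase * P(program) under an adjustable distribution P over instructions; after each solved task, the instructions occurring in the solution receive more probability, shifting the inductive bias of subsequent searches toward similar programs.

    import itertools, math

    def adaptive_levin_search(tasks, run, verify, alphabet="01",
                              max_phase=20, learning_rate=0.1):
        # Toy adaptive variant of Levin search (illustration only, not the exact ALS).
        # Each program's time budget is proportional to its probability under an
        # adjustable distribution over instructions; solved tasks shift that bias.
        prob = {s: 1.0 / len(alphabet) for s in alphabet}
        solutions = []
        for task in tasks:
            found = solve_with_bias(task, prob, run, verify, alphabet, max_phase)
            if found:
                for s in found:                       # reward instructions that were used
                    prob[s] += learning_rate
                total = sum(prob.values())
                prob = {s: v / total for s, v in prob.items()}
            solutions.append(found)
        return solutions

    def solve_with_bias(task, prob, run, verify, alphabet, max_phase):
        for phase in range(1, max_phase + 1):
            for length in range(1, phase + 1):
                for symbols in itertools.product(alphabet, repeat=length):
                    p_prog = math.prod(prob[s] for s in symbols)
                    budget = int(2 ** phase * p_prog) # generalizes 2**(phase - length)
                    if budget < 1:
                        continue                      # too improbable for this phase
                    program = "".join(symbols)
                    output, _ = run(program, task, budget)
                    if output is not None and verify(task, output):
                        return program
        return None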