Greatest moments of computer history are speeding up
Omega Point expected around 2040
Schmidhuber's law states that the delays between successive radical breakthroughs in computer science decrease exponentially: each new one arrives roughly twice as fast as the previous one. Compare the original article arxiv:cs.AI/0302012 or the relevant page of the local copy. Also compare the concept of an approaching historic singularity (Stanislaw Ulam, 1958), which apparently inspired Vernor Vinge's writings on the technological singularity.

Do not confuse this with Moore's law (1965), which roughly states that every year or so we can pack twice as many transistors on a microchip, so that computers get roughly twice as fast per unit cost. Also do not confuse Schmidhuber's law with Schmidhuber's hypothesis (1997), which essentially says that the universe is just a by-product of a simple computational process that computes all computable universe histories, not just ours.

The law has been holding for almost four centuries. Surprisingly, the greatest breakthroughs even match a century-based logarithmic scale, as illustrated by the following table.
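The halving of delays has a simple arithmetic consequence worth spelling out: the breakthrough dates converge to a finite limit, because the delays form a geometric series with sum first_delay / (1 - ratio). A minimal sketch of this convergence follows; the start year and first delay used below are hypothetical numbers chosen only to illustrate the arithmetic, not data from the article.

```python
def omega_point(start_year, first_delay, ratio=0.5, steps=60):
    """Accumulate delays that shrink by `ratio` each step; the running
    date converges because the delays form a geometric series."""
    year, delay = start_year, first_delay
    for _ in range(steps):
        year += delay
        delay *= ratio
    return year

def omega_point_closed_form(start_year, first_delay, ratio=0.5):
    """Limit of the series: start_year + first_delay / (1 - ratio)."""
    return start_year + first_delay / (1 - ratio)

# Hypothetical illustration: a first delay of 208 years beginning in
# 1623 would converge to 1623 + 2 * 208 = 2039, i.e. around 2040.
print(round(omega_point(1623, 208)))       # 2039
print(omega_point_closed_form(1623, 208))  # 2039.0
```

With a halving ratio, the limit is always the start year plus twice the first delay, which is why a finite "Omega point" date falls out of the law at all.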
[Table: the greatest breakthroughs in computer history on a century-based logarithmic scale]