Greatest moments of computer history are speeding up

Omega Point expected around 2040

Schmidhuber's law states that the delays between successive radical breakthroughs in computer science decrease exponentially: each new breakthrough arrives in roughly half the time it took for the previous one. Compare the original article arxiv:cs.AI/0302012 or the relevant page of the local copy. Also compare the concept of an approaching historic singularity (Stanislaw Ulam, 1958), which apparently inspired Vernor Vinge's writings on the technological singularity.

Do not confuse this with Moore's law (1965), which roughly states that every year or so we can pack twice as many transistors onto a microchip, such that computers get roughly twice as fast per unit cost. Also do not confuse Schmidhuber's law with Schmidhuber's hypothesis (1997), which essentially says that the universe is just a by-product of a simple computational process that computes all computable universe histories, not just ours.

The law has been holding for almost four centuries. Surprisingly, the greatest breakthroughs even line up on a century-based logarithmic scale, as illustrated by the following table.

1623: Wilhelm Schickard starts the computing age by building the first calculators (no evidence for claims that Leonardo da Vinci built such machines even earlier).
~2 centuries later (1834-1840): Charles Babbage (UK) envisions programmable computers. This is a major conceptual breakthrough: all previous machines were "hardwired." Babbage cannot get his decimal designs to work, though.
1 century later (1935-41): The modern era begins: Konrad Zuse builds the first general purpose computer.
Still 1 century later: A bit earlier, Kurt Goedel (Austria) publishes his fundamental work on universal formal languages and the limits of proof and computation (1931). His results are reformulated by Alan Turing (UK, 1936) who later helps to break the Nazi code (1943).
1/2 century later: The next 50 years bring advances in theory and switching speed: relays are replaced by tubes, tubes by transistors, transistors by chips. Arguably rather predictable progress! But in 1990 the UK's Tim Berners-Lee at CERN (Switzerland) shakes up everything again by creating the WWW.
1/4 century later (?): Extrapolating the trend, optimists should expect another radical change by 2015, which happens to coincide with the date when the fastest computers will match brains in terms of raw computing power, according to frequent estimates.

. . .

Omega point (~2040): The remaining series of faster and faster additional revolutions should culminate in an "Omega point" (Teilhard de Chardin, 1916) expected around 2030-2040.
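Why do the ever-shorter delays converge on a finite date rather than receding indefinitely? A back-of-the-envelope sketch (assuming the roughly 25-year gap following the 1990 breakthrough keeps halving, and writing T for the limiting date, a symbol introduced here only for illustration):

\[
T \approx 1990 + 25 + 12.5 + 6.25 + \dots = 1990 + 25 \sum_{k=0}^{\infty} 2^{-k} = 1990 + 50 = 2040.
\]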




Zuse was listed among the 20th century's 30 most important persons in Peters' synchronoptic atlas of world history. Goedel, Turing, and Berners-Lee at least made it into TIME magazine's list of the century's 20 most influential scientists, together with this man.

Other great pioneers include Pascal (1640 calculator), Leibniz (binary system, automatic multiplier in 1670), Jacquard (punch cards, 1801), Babbage's friend Lady Ada (the first programmer?), Hollerith (more punch cards, 1889), Aiken (MARK I, 1944), von Neumann (architecture, 1945), Atanasoff / Eckert / Mauchly (tubes replace relays, e.g., ENIAC 1945), Lilienfeld, Heil, Shockley / Brattain / Bardeen, Matare / Walker (transistors replace tubes, 1928-1948), Wilkes and Renwick (EDSAC, program and data both modifiable in storage, 1949), Kilby (integrated circuits replace single transistors, 1958), Jobs and Wozniak (personal computers, 1976), and many others. Outstanding contributors to theoretical computer science also include Post, Church, Kolmogorov, Shannon, Wiener, Hoare, Dijkstra, Chomsky, and numerous others.
