1941: Konrad Zuse completes the first working general-purpose computer, based on his 1936 patent application
In 2021, we are celebrating the 80th anniversary of Konrad Zuse's crowning achievement: Z3, the world's first functional program-controlled general-purpose computer, based on his patent application from 1936. Today, computers are ubiquitous. Billions of people depend on them. Only 20 years to go until the Z3 centennial in 2041!
Between 1935 and 1941, Konrad Zuse (1910-1995; pronounce: "Conrud Tsoosay") created the world's first working programmable general-purpose computer: the Z3. The corresponding patent application of the "father of the computer" dates back to 1936.[ZU36-38][Z36][RO98] In 1946, he also founded the world's first computer startup company: the Zuse-Ingenieurbüro Hopferau (IBM provided some of the venture capital for an option on Zuse's patents).
As if that was not enough to cement Zuse's legacy in computing, in the early 1940s, Zuse also designed Plankalkül, the first high-level programming language[BAU][KNU] (compare the first formal language by Gottlob Frege[FRE]). He applied it to chess in 1945[KNU] and to theorem proving in 1948.[ZU48]
Where does Zuse's Z3 fit in the history of computing? The first known gear-based computational device was the Antikythera mechanism (a kind of astronomical clock) in Ancient Greece over 2000 years ago. 1.5 millennia later, Peter Henlein still made conceptually similar machines—albeit smaller—namely, the first miniaturized pocket watches (1505). But these devices always calculated the same thing, e.g., divide minutes by 60 to get hours. The 1600s brought more flexible machines that computed answers in response to input data. The first data-processing gear-based special purpose calculator for simple arithmetic was built in 1623 by Wilhelm Schickard, one of the candidates for the title of "father of automatic computing," followed by the superior Pascaline of Blaise Pascal (1642).
In 1673, Gottfried Wilhelm Leibniz designed the first machine (the step reckoner) that could perform all four arithmetic operations, and the first with a memory.[BL16] He also described the principles of binary computers governed by punch cards (1679),[L79][L03][LA14][HO66] and defined the formal Algebra of Thought (1686)[L86][WI48][LEI21,a,b] which is deductively equivalent[LE18] to the later Boolean Algebra[BOO] (1847). Leibniz, one of the candidates for the title of "father of computer science," has been called "the world's first computer scientist"[LA14] and even "the smartest man who ever lived."[SMO13] He was not only the first to publish infinitesimal calculus,[L84] but also pursued an ambitious project to answer all possible questions through computation. His ideas (inspired by Ramon Llull[LL7]) on a universal language and a general calculus for reasoning (Characteristica Universalis & Calculus Ratiocinator[WI48][RU58]) were highly influential.[GOD21,a,b][WI48]
In the early 1930s, however, Kurt Gödel dealt a blow to Leibniz' project. He created a universal language for encoding arbitrary formalizable processes,[GOD][GOD34] and used his so-called Gödel Numbering to show that there are fundamental limitations to what can be decided or computed,[GOD] thus laying the foundations of the modern version of what's now known as theoretical computer science.[GOD21,a,b]
The pragmatic Konrad Zuse was apparently not particularly interested in such theoretical work. In 1936, five years after Gödel's famous publication,[GOD] he filed his patent application on a very practical real computer.[ZU36-38][Z36][RO98] It described the digital circuits required by programmable physical hardware, extending Leibniz' principles of binary computers governed by punch cards (1679),[L79][LA14][HO66][L86][WI48][LEI21,a,b] and predating Claude Shannon's thesis on digital circuit design (1937).[SHA37]
Zuse's Z3 of 1941 lacked an explicit conditional jump instruction "IF ... THEN GOTO ADDRESS ..." (added with little effort to a later variant for ETHZ called Z4). Of course, this did not prevent Z3 from being a universal computer.[RO98] For example, simple arithmetic tricks (e.g., multiplication by 0) can be used to temporarily make a no-op out of every instruction that should not be executed because some condition is not fulfilled.[RO98] Ignoring the inevitable storage limitations of any physical computer, the physical hardware of Z3 was indeed universal in the modern sense of the purely theoretical but impractical constructs of Gödel[GOD][GOD34,21,21a] (1931-34), Church[CHU] (1935), Turing[TUR] (1936), and Post[POS] (1936), which also did not allow for "modern" conditional jumps (they did not even have numbered memory addresses to which an instruction pointer could have jumped).
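To make this trick concrete, here is a minimal sketch (in Python, purely illustrative; it does not reproduce Z3's real instruction set or Rojas' full construction[RO98]): both branches of a conditional are always evaluated, and the unwanted one is turned into a no-op by multiplying its contribution by 0.

```python
# Illustrative sketch: emulating "IF cond THEN x = a ELSE x = b" without any
# conditional jump, by multiplying the untaken branch's contribution by 0.
# This only mirrors the spirit of the arithmetic tricks described in [RO98];
# it is not Z3's actual instruction set.

def select(cond_bit, a, b):
    # cond_bit is 1 if the condition holds, 0 otherwise.
    # Both "branches" are always computed; the unwanted one contributes nothing.
    return cond_bit * a + (1 - cond_bit) * b

# Example: x = max(u, v), computed without branching on the result.
u, v = 3.0, 7.0
cond_bit = int(u >= v)  # on real hardware this bit could come from the sign of u - v
x = select(cond_bit, u, v)
print(x)  # 7.0
```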
Zuse's Z3 used electromagnetic relays with visibly moving switches. The first electronic special purpose calculator (whose moving parts were electrons too small to see) was the binary ABC (US, 1942) by John Atanasoff (the "father of tube-based computing"[NASC6a]). Unlike the gear-based machines of the 1600s, ABC used vacuum tubes—today's machines use the transistor principle patented by Julius E. Lilienfeld in 1925.[LIL1-2] But unlike Zuse's Z3, ABC was not freely programmable. Neither was the electronic Colossus machine by Tommy Flowers (UK, 1943-45) used to break the Nazi code.[NASC6]
On the other hand, the concept of programs was well-known by then. Perhaps the world's first practical programmable machine was an automatic theatre made in the 1st century[SHA7a][RAU1] by Heron of Alexandria (who apparently also had the first known working steam engine—the Aeolipile). The energy source of his programmable automaton was a falling weight pulling a string wrapped around pins of a revolving cylinder. Complex instruction sequences controlling doors and puppets for several minutes were encoded by complex wrappings.
The 9th century music automaton by the Banu Musa brothers in Baghdad was perhaps the first machine with a stored program.[BAN][KOE1] It used pins on a revolving cylinder to store programs controlling a steam-driven flute—compare Al-Jazari's programmable drum machine of 1206.[SHA7b]
The first commercial program-controlled machines (punch card-based looms) were built in France around 1800 by Joseph-Marie Jacquard and others—perhaps the first "modern" programmers who wrote the world's first industrial software. They inspired Ada Lovelace and her mentor Charles Babbage (UK, circa 1840). He planned but was unable to build a programmable, general purpose computer (only his non-universal special purpose calculator led to a working 20th century replica). Unlike Babbage, Zuse (1936-41) used Leibniz' principles of binary computation (1679)[L79][LA14][HO66][L03] instead of traditional decimal computation. This greatly simplified the hardware.[LEI21,a,b] Today, most computers are binary like Z3.
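One way to see why binary simplified the hardware (a sketch of the general principle, not of Z3's actual circuits): a binary digit maps directly onto a two-state relay, and addition reduces to a handful of Boolean operations per digit, whereas a decimal digit needs ten distinguishable states or a redundant encoding.

```python
# Minimal sketch of binary addition built only from Boolean operations,
# the kind of two-state logic a relay implements directly.
# Illustrates the general principle; this is not Z3's actual adder.

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def add_binary(x_bits, y_bits):
    # x_bits, y_bits: lists of 0/1, least significant bit first, equal length.
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

print(add_binary([1, 0, 1], [1, 1, 0]))  # 5 + 3 = 8 -> [0, 0, 0, 1]
```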
In this context it seems worth pointing out the difference between programs and the more limited user-given input data of the 1600s mentioned above. Programs are instruction sequences stored on some medium, e.g., on punch cards, and can be run again and again, without human intervention. Over time the physical objects required to store programs have become lighter. Ancient machines stored them on rotating cylinders; Jacquard stored them on cardboard; Zuse stored them on 35mm film; today we often store them using electrons and magnetizable material.
The first general working programmable machine built by someone other than Zuse (1941)[RO98] was Howard Aiken's decimal MARK I (US, 1944). The much faster decimal ENIAC by Eckert and Mauchly (1945/46) was programmed by rewiring it. Both data and programs were stored in electronic memory by the "Manchester baby" (Williams, Kilburn & Tootill, UK, 1948) and the 1948 upgrade of ENIAC, which was reprogrammed by entering numerical instruction codes into read-only memory.[HAI14b] Already in 1936-38, however, Zuse may have been the first to suggest to put both program instructions and data into memory.[ZU36-38]
While Zuse's work on automatic chess players (1945)[KNU] and theorem provers (1948)[ZU48] (predating Newell & Simon's work[NS56]) was groundbreaking, it was not the first work on Artificial Intelligence (AI). Already in 1914, the Spaniard Leonardo Torres y Quevedo became the 20th century's first AI pioneer when he built the first working chess end game player (back then, chess was considered an activity restricted to the realm of intelligent creatures). The machine was still considered impressive decades later when the AI pioneer Norbert Wiener played against it at the 1951 Paris conference,[AI51][BRO21][BRU1-4] which is now often viewed as the first conference on AI, though the expression "AI" was coined only later, in 1956, at another conference in Dartmouth by John McCarthy. In fact, in 1951, much of what's now called AI was still called Cybernetics, with a focus very much in line with modern AI based on deep neural networks.[DL1-2][DEC]
In 1941, Zuse's Z3 could perform roughly one elementary operation (e.g., an addition) per second. Since then, compute has become 10 times cheaper every 5 years (note that this law is much older than Moore's Law, which states that the number of transistors[LIL1-2] per chip doubles every 18 months). As of 2021, 80 years after Z3, modern computers can execute about 10 million billion instructions per second for the same (inflation-adjusted) price. The naive extrapolation of this exponential trend predicts that the 21st century will see cheap computers with a thousand times the raw computational power of all human brains combined.[RAW] Where are the physical limits? According to Bremermann (1982),[BRE] a computer of 1 kg of mass and 1 liter of volume can execute at most 10^51 operations per second on at most 10^32 bits. The trend above will hit the Bremermann limit roughly 25 decades after Z3, around 2200. However, since there are only 2 x 10^30 kg of mass in the solar system, the trend is bound to break within a few centuries, since the speed of light will greatly limit the acquisition of additional mass, e.g., in the form of other solar systems, through a function polynomial in time, as previously noted back in 2004.[OOPS2]
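The back-of-the-envelope extrapolation can be made explicit (a sketch under the assumptions stated above: roughly 1 operation per second per unit price in 1941, a tenfold improvement in price-performance every 5 years, and Bremermann's bound of about 10^51 operations per second for 1 kg of matter):

```python
import math

# Back-of-the-envelope extrapolation of the price-performance trend above.
# Assumptions (from the text): ~1 op/s per unit price in 1941, a 10x
# improvement every 5 years, and Bremermann's bound of ~1e51 op/s per kg.

ops_1941 = 1.0
factor_per_5_years = 10.0
bremermann_ops = 1e51

def ops_per_second(year):
    return ops_1941 * factor_per_5_years ** ((year - 1941) / 5.0)

print(f"{ops_per_second(2021):.1e}")  # ~1.0e+16 op/s, i.e. 10 million billion

# Year at which the naive trend would hit the Bremermann limit:
years_needed = 5.0 * math.log10(bremermann_ops / ops_1941)
print(1941 + years_needed)  # ~2196, i.e. roughly 25 decades after Z3
```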
In 1970, long before computers had become ubiquitous, Peters' renowned Atlas of World History already listed Zuse among the 20th century's 30 most important figures, along with Einstein, Gandhi, Hitler, Lenin, Roosevelt, Mao, Picasso, etc. Zuse's historical importance has only grown with the exponential growth of computing since then. By the turn of the millennium, more than 80 streets and squares carried the name of Zuse. A collection of his writings and pictures of his machines can be found in this online archive.
In 2021, we are not only celebrating the 80th anniversary of Zuse's 1941 computer, but also the 90th anniversary of Kurt Gödel's groundbreaking 1931 paper[GOD][GOD21,a,b] which laid the foundations of theoretical computer science and the theory of AI. Gödel identified the fundamental limits of theorem proving, computing, AI, logic, and mathematics itself.[GOD][GOD34,21][BIB3] This had enormous impact on science and philosophy of the 20th century. It seems incredible that, within less than a century, something that once lived only in the minds of titans has become so inseparable from modern society. The world owes these scientists a great debt. Ten years to go until the Gödel centennial in 2031, and twenty years until the Zuse centennial in 2041! Enough time to plan appropriate celebrations.