I recently attended an event here in DC where Woz was the guest of honor. I was unaware of his book at the time. So leading up to it, I thought it would be fun to re-watch a couple of old documentaries in which he appears. I forewent his foray into reality TV. I chose "The Machine That Changed the World" (YouTube), a PBS documentary from 1992 that is farsighted enough not to feel completely dated, and whose chronicling of the early history of computing is nicely done. The other was Robert X. Cringely's 1996 "Triumph of the Nerds" (YouTube), a fun account of the rise of the personal computer.
Watching these rekindled an idea I had to do a history of events, from ancient to modern, that got us to where we are now. So I set forth to collect and list notable events that propelled both the science and industry of computing to its present state. My goal was to include the people and events that shaped not only the technology but also the industry and the language we use to describe things. One example is the term bug, which many people may take for granted; it derives from a problem caused by an actual insect, a moth stuck in a relay. The other and perhaps more profound case is that the word "computer", a name we also take for granted, was a historical accident. Prior to 1940 a computer was a person who did computations or calculations. Many of the first machines, with the exception of the decryption work done at Bletchley Park, were developed for doing computations, specifically ENIAC, which was initially designed to calculate artillery firing tables. ENIAC became the first well-known computer and captured the public's imagination when the "Giant Brain" was displayed to the American press in 1946. Not long after that, the word computer went from describing a person to describing a machine. Meanwhile the Colossus machine from Bletchley Park, designed by Tommy Flowers at the code-breaking site where Alan Turing also worked, remained a secret until the 1970s. Had that work been made public immediately following the war, we might be calling these machines decoders or decrypters, and the history of the computer might have taken some different turns. Who knows, maybe even Silicon on Avon; okay, that's probably a stretch.
As I was chronicling these events, trying to note when certain ideas and inventions occurred, I noticed some interesting trends. For example some ideas, like the algorithm and robotics (or mechanical automata), are quite ancient, going back over a thousand years. Many of the technologies we use today date back about 30 to 50 years, and some are even older. Magnetic recording dates back to 1878, making it 19th century technology, and if you think about it, it really is fairly primitive; I've heard hard disks described as "spinning rust". Admittedly today's hard disks contain very sophisticated electronics and electromechanics, but they are still technologically primitive (effectively sophisticated 19th century technology) compared to SSD technology, which is 20th century technology, with DRAM dating from 1966 and flash memory being invented around 1980. The real advances, not surprisingly, are the advances in materials science, fabrication, and manufacturing that take 1970s technological developments like active matrix display technology and make them small, cheap, and ubiquitous today.
There are many interesting people who played very important and pivotal roles yet seem to be largely forgotten. One of my biggest pet peeves is the cult of personality effect that shows up in our industry, and the worst case of it is probably Steve Jobs. I remember working with one young fanboy who was telling me how great Steve Jobs was and that Steve Jobs thought of the iPad in the 1970s, to which I replied that it was depicted in the movie 2001: A Space Odyssey from the late 1960s, which I would wager Jobs had seen. I include that movie in my list of events because it does depict advanced computer technology. While Steve Jobs seems to garner more attention than he deserves, I do credit Apple with certain events that shaped the industry as it is today. Still, I think there are several more interesting people who have made much more substantial contributions. One is Robert Noyce, who cofounded Intel and co-invented the practical version of the integrated circuit that we use today; he was kind of a Jobs and Wozniak combo in terms of technical brilliance and business acumen. Another interesting story surrounds ARPANET, the proto-internet. While Al Gore may have tried to take credit for its invention, J. C. R. Licklider might have actually been the one who effectively invented it. He convinced Bob Taylor of the idea that became ARPANET. Taylor is another of those people who deserves recognition: after he created the initial proposal for ARPANET, he went on to lead computer science research at Xerox PARC for about 15 years from its inception. Also, Xerox PARC often seems to get the credit for much of our modern computing interface, but Douglas Engelbart really deserves that credit, as he demonstrated most of those ideas in 1968, two years before Xerox established PARC. He of course was a big influence on that work at PARC.
A big driving force for the creation of practical computer technology was the need for machines to do tedious, laborious mental work, like large quantities of calculations, especially to create mathematical tables giving the values of various computations such as logarithms, which were used extensively prior to the 1970s and the advent of affordable desktop and pocket scientific calculators. Babbage's work was aimed at solving this problem. During World War II the need for artillery tables exceeded the available human labor, which drove the military to fund the creation of ENIAC, which was not delivered until after the war. The war also drove the Bletchley Park work, again an attempt to automate the tedious work of manually deciphering the coded messages of the Enigma machine.
Big data is a common term today, but in many respects it has a history going back over a hundred years and could also be considered a driver of the development of practical computer technology. One continual source of big data was, and probably still is, the collection of census data. In 1889 Herman Hollerith patented an electromechanical tabulating machine that was used to process data for the 1890 US census. His 1889 patent had the title "Art of Compiling Statistics", and he later established a company to sell the machines, which were used worldwide for census and other applications. He is credited with laying the foundations of the modern information processing industry. His company was eventually merged with three others to become IBM. In 1948 the US Census Bureau had a big data problem that was outgrowing the existing electromechanical technology. John Mauchly, one of the inventors of ENIAC and cofounder of the Eckert–Mauchly Computer Corporation, sold the Census Bureau on its successor, named UNIVAC, to handle census data. The company survives today as Unisys.
A huge problem with software projects today is that of project failure, and the history of computing seems to have its share as well. Software projects sometimes outright fail and all too often are late and over budget. Babbage's Difference Engine project was cancelled after nearly two decades and an expenditure roughly ten times the original grant. The failure was due to bad management on his part and what might be called scope creep, in that he got distracted by designing the more general Analytical Engine. UNIVAC is another notable early project that was delivered late and over budget.
The real challenge of a post like this is what to include. The Machine That Changed the World starts with writing, which was our first precise technique for recording information. I went back farther to include the recording of numerical information. A big influence on the way I view history comes from a PBS show that aired in the eighties called Connections, done by James Burke. In it he traces, in a very engaging way, various technological developments as cause-and-effect events. In a sense he demonstrated that history was an interesting living topic, not the banal languid mimeographs that I experienced in school. I tried to choose events that were either the inception of something or the point at which it caused a significant change. In the spirit of Connections there are definite threads here: the thread of calculation and computation, including number representation for those purposes; the thread of encryption; and the thread of dealing with information in terms of storage, replication, and processing. Some of the threads are obvious and some are more subtle.
History of Computing from Ancient to Modern
c. 3200 BCE: Mesopotamian Cuneiform appears. Writing may have been independently developed in other places as well. Although protowriting systems previously existed, society now had a more precise mechanism to record information.
c. 500 BCE: Pāṇini describes the grammar and morphology of Sanskrit.
c. 100 BCE: The Greeks develop the Antikythera mechanism, the first known mechanical computation device.
c. 1 - 300 CE: Hindu numeral system, later to become the Hindu–Arabic numeral system, is invented by Indian mathematicians.
c. 801 - 873 CE: Al-Kindi develops cryptanalysis by frequency analysis.
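As an aside, the core of frequency analysis is simple enough to sketch in a few lines of Python (a toy illustration of the general idea, not Al-Kindi's method verbatim): tally the letters of a ciphertext and compare the ranking against the known letter frequencies of the target language.

```python
from collections import Counter

def letter_frequencies(ciphertext):
    """Count letters in a ciphertext, most common first.

    In a simple substitution cipher the most frequent ciphertext
    letters likely stand for the most frequent plaintext letters
    (e.g., E, T, A in English).
    """
    counts = Counter(c for c in ciphertext.upper() if c.isalpha())
    return counts.most_common()

# "THIS IS A SECRET MESSAGE" under a Caesar shift of 3.
sample = "WKLV LV D VHFUHW PHVVDJH"
for letter, count in letter_frequencies(sample):
    print(letter, count)
```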
c. 825 CE: Muḥammad ibn Mūsā al-Khwārizmī writes On the Calculation with Hindu Numerals, which helps spread the Indian system of numeration throughout the Middle East and Europe. It is translated into Latin as Algoritmi de numero Indorum; al-Khwārizmī's name, rendered in Latin as Algoritmi, gives us the term "algorithm".
c. 1283: The verge escapement is developed in Europe, allowing the development of all-mechanical clocks and other mechanical automata.
1450: Johannes Gutenberg introduces the movable-type printing press in Europe. While printing was actually invented earlier in Asia, Gutenberg's press mechanizes the duplication of information and creates an industrial boom around the selling and dissemination of information.
c. 1624: The slide rule, based on logarithmic scales, is developed to mechanize multiplication and division.
1763: Two years after Thomas Bayes dies, his "An Essay towards solving a Problem in the Doctrine of Chances" is published in the Philosophical Transactions of the Royal Society of London.
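The essay contains what we now call Bayes' theorem, which in modern notation (not Bayes' original presentation) reads:

```latex
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
```

In words: the probability of a hypothesis given the evidence is proportional to how well the hypothesis predicts the evidence, weighted by the hypothesis's prior probability.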
1786: J. H. Müller, a Hessian army engineer, publishes a description of a difference engine.
1805/1809: The least squares method is published by Legendre in 1805 and by Gauss in 1809. They both apply it to the problem of determining, from astronomical observations, the orbits of bodies around the Sun. This is the earliest form of linear regression.
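For the one-variable case the least-squares fit has a simple closed form; here is a minimal Python sketch of the method (an illustration, not Legendre's or Gauss's original formulation):

```python
def least_squares_fit(xs, ys):
    """Fit y = a + b*x by ordinary least squares (closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by the variance of x.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Points scattered near the line y = 2x + 1.
a, b = least_squares_fit([0, 1, 2, 3], [1.1, 2.9, 5.2, 6.8])
print(a, b)  # approximately 1 and 2
```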
1823: The British government gives Charles Babbage £1700 to start work on the Difference Engine proposed the previous year. In 1842, after a total expenditure of £17,000 without a deliverable, the project is canceled.
1832: Évariste Galois is killed in a duel, leaving behind a legacy in algebra, the foundations of group theory and Galois theory, to be appreciated decades later.
1837: Charles Babbage describes a successor to the Difference Engine, the Analytical Engine, a proposed mechanical general-purpose computer and the first design for a general-purpose computer that could be described in modern terms as Turing-complete. Babbage receives assistance on this project from Ada Byron, Countess of Lovelace. It also serves to distract Babbage from completing the Difference Engine. Neither project is ever completed.
1907: The thermionic triode, a vacuum tube, propels the electronics age forward, enabling amplified radio technology and long-distance telephony.
1914: Spanish engineer Leonardo Torres y Quevedo designs an electro-mechanical version of Charles Babbage's Analytical Engine, which includes floating-point arithmetic.
1918: German engineer Arthur Scherbius applies for a patent on a cipher machine using rotors, which gives rise to the Enigma machine, a family of related electro-mechanical rotor cipher machines used for enciphering and deciphering secret messages.
1925: Andrey Kolmogorov publishes "On the principle of the excluded middle". He supports most of Brouwer's results but disputes a few, and proves that under a certain interpretation all statements of classical formal logic can be formulated as those of intuitionistic logic.
1931: Kurt Gödel publishes "On Formally Undecidable Propositions of "Principia Mathematica" and Related Systems".
1936: Alonzo Church and Alan Turing publish independent papers showing that a general solution to the Entscheidungsproblem is impossible. Church's paper introduces the lambda calculus and Turing's paper introduces ideas that become known as Turing machines.
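To give a flavor of the lambda calculus, here is a sketch of Church numerals using Python lambdas (a modern toy rendering, not Church's notation): a number n is encoded as a function that applies another function n times.

```python
# Church numerals: n is a function applying f to x a total of n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral into a Python int for display."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # 4
```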
1943: Warren McCulloch and Walter Pitts publish "A Logical Calculus of the Ideas Immanent in Nervous Activity" (PDF), making significant contributions to the study of neural network theory, the theory of automata, the theory of computation, and cybernetics.
1944: The first Colossus computer, designed by British engineer Tommy Flowers, is operational at Bletchley Park. It was designed to break the complex Lorenz ciphers used by the Nazis during WWII and reduced the time to break Lorenz messages from weeks to hours.
1945: John von Neumann publishes the "First Draft of a Report on the EDVAC", the first published description of the logical design of a computer using the stored-program concept, which comes to be known as the von Neumann architecture.
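The essence of the stored-program concept is that instructions and data live in the same memory, so programs can be loaded and manipulated like any other data. A toy Python sketch (using a hypothetical four-instruction machine invented for illustration, not the EDVAC's actual design):

```python
def run(memory):
    """Execute a program held in the same memory as its data."""
    acc, pc = 0, 0  # accumulator and program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]       # load a data cell
        elif op == "ADD":
            acc += memory[arg]      # add a data cell
        elif op == "STORE":
            memory[arg] = acc       # write the accumulator back
        elif op == "HALT":
            return acc

program = [
    ("LOAD", 5),    # cell 0: acc = memory[5]
    ("ADD", 6),     # cell 1: acc += memory[6]
    ("STORE", 7),   # cell 2: memory[7] = acc
    ("HALT", 0),    # cell 3: stop
    None,           # cell 4: unused
    2, 3, 0,        # cells 5-7: data operands and result
]
print(run(program))  # 5
```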
1946: J. Presper Eckert and John Mauchly found the Eckert–Mauchly Computer Corporation, the first commercial computer company.
1947: John Bardeen, Walter Brattain, and William Shockley invent the first point-contact transistor. Their Bell Labs colleague John R. Pierce coins the term transistor, a portmanteau of "transfer resistor".
1948: Claude Shannon publishes "A Mathematical Theory of Communication" as a two-part series in the Bell System Technical Journal, describing information entropy as a measure of the uncertainty in a message and essentially inventing the field of information theory.
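Shannon's entropy measures the average uncertainty of a source in bits, H = -Σ p log₂ p; a minimal Python sketch of the formula:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin flip
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely symbols
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin is less uncertain
```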
1948: Bernard M. Oliver, Claude Shannon, and John R. Pierce publish "The Philosophy of PCM", following patent applications by Oliver and Shannon in 1946 and by Pierce in 1945, both with the same name: "Communication System Employing Pulse Code Modulation". All three are credited as the inventors of PCM.
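The idea of PCM is to sample an analog signal at regular intervals and quantize each sample to one of a fixed set of discrete levels; a toy Python sketch (the sample rate and bit depth here are illustrative, not from the patents):

```python
import math

SAMPLE_RATE = 8   # samples per second (far too low for audio; for illustration)
LEVELS = 16       # 4-bit quantization: 16 discrete levels

def pcm_encode(duration_s):
    """Sample a 1 Hz sine wave and quantize each sample to 4 bits."""
    codes = []
    for i in range(int(duration_s * SAMPLE_RATE)):
        t = i / SAMPLE_RATE
        value = math.sin(2 * math.pi * t)             # analog value in [-1, 1]
        code = round((value + 1) / 2 * (LEVELS - 1))  # map to integer 0..15
        codes.append(code)
    return codes

print(pcm_encode(1))  # one second of 4-bit PCM codes
```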
1948: The world's first stored-program computer, Manchester Small-Scale Experimental Machine (SSEM), nicknamed Baby, runs its first program.
1952: Tom Watson, Jr. becomes president of IBM and shifts the company’s direction away from electromechanical punched card systems towards electronic computers.
1956: Wen Tsing Chow invents programmable read-only memory (PROM) at the American Bosch Arma Corporation, at the request of the United States Air Force, to provide a more flexible and secure way of storing the targeting constants in the Atlas E/F ICBM's airborne digital computer.
1956: IBM introduces the IBM 305 RAMAC, the first commercial computer to use a moving-head hard disk drive (magnetic disk storage) for secondary storage.
1956: William Shockley moves back to California to be close to his aging mother in Palo Alto and starts Shockley Semiconductor Laboratory in a small commercial lot in nearby Mountain View, setting the agrarian area, with its cheap land and fruit orchards, on a path to later become known as Silicon Valley.
1957: Noam Chomsky publishes Syntactic Structures, which includes the idea of phrase structure grammar and contrasts it with a finite-state, communication-theoretic model based on a conception of language as a Markov process.
1957: Frank Rosenblatt invents the perceptron at the Cornell Aeronautical Laboratory with naval research funding. He publishes "The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain" in 1958. It is initially implemented in software for the IBM 704 and subsequently implemented in custom-built hardware as the "Mark 1 Perceptron".
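The perceptron learning rule itself is only a few lines; here is a minimal Python sketch that learns the logical AND function (an illustration of the rule, not Rosenblatt's Mark 1 implementation):

```python
def train_perceptron(data, epochs=10, lr=1):
    """Classic perceptron rule: nudge weights by the prediction error."""
    w = [0, 0]
    b = 0
    for _ in range(epochs):
        for (x1, x2), target in data:
            output = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - output
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Logical AND is linearly separable, so the perceptron can learn it.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
for (x1, x2), _ in and_data:
    print(x1, x2, "->", 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)
```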
1959: John Backus publishes "The Syntax and Semantics of the Proposed International Algebraic Language of the Zurich ACM-GAMM Conference" in the UNESCO proceedings of the International Conference on Information Processing. It introduces a notation for describing the syntax of programming languages, specifically context-free grammars; the notation later becomes known as Backus–Naur Form (BNF).
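To show what the notation looks like, here is a textbook-style BNF grammar for simple arithmetic expressions (my own example, not one from Backus's paper):

```bnf
<expr>   ::= <expr> "+" <term> | <term>
<term>   ::= <term> "*" <factor> | <factor>
<factor> ::= "(" <expr> ")" | <digit>
<digit>  ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
```

Each rule says the construct on the left can be formed from any of the alternatives on the right, which is exactly what makes these grammars context-free.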
1959: Arthur Samuel defines machine learning as a "Field of study that gives computers the ability to learn without being explicitly programmed".
1962: The Radio Sector of the Electronic Industries Association (EIA) introduces RS-232, a standard for serial data communication.
1964: MIT, General Electric, and Bell Labs form a consortium to develop the Multics operating system.
1968: Douglas Engelbart gives "The Mother of All Demos", a demonstration that introduces a complete computer hardware/software system called the oN-Line System (NLS). It demonstrates many fundamental elements of modern personal computing: multiple windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a collaborative real-time editor (collaborative work).
1968: ARPA approves Bob Taylor's plan for a computer network. Taylor, along with Ivan Sutherland, was originally convinced of the idea by J. C. R. Licklider, who initially referred to the concept as an "Intergalactic Computer Network". Sutherland and Taylor envisioned a computer communications network that would, in part, allow researchers to use computers provided to various corporate and academic sites by ARPA and make new software and other computer science results quickly and widely available. The project becomes known as ARPANET.
1968: Arthur C. Clarke and Stanley Kubrick release 2001: A Space Odyssey, a science fiction movie that features advanced computing technology, including an AI computer named HAL 9000 with a voice interface that manages ship operations. The astronauts use multimedia tablet technology (the NewsPad) to catch up on family and worldwide events from Earth.
1970: Ken Thompson and Dennis Ritchie create a new lighter-weight "programmer's workbench" operating system with many of the robust features of Multics. The new operating system runs its first application: Space Travel. The name Unics, playing on the Multics name and usually credited to Brian Kernighan, later becomes Unix.
1970: Xerox opens the Palo Alto Research Center (PARC), which attracts some of the United States' top computer scientists. Under Bob Taylor's guidance as associate manager, it produces many groundbreaking inventions that later transform computing, including laser printers, computer-generated bitmap graphics, the WYSIWYG text editor, Interpress (a resolution-independent graphical page-description language and the precursor to PostScript), and the model–view–controller software architecture.
1971: IBM introduces the first commercially available 8-inch floppy diskette.
1971: Raymond Tomlinson implements the first email system able to send mail between users on different hosts connected to the ARPANET. To achieve this, he uses the @ sign to separate the user from their machine, which has been used in email addresses ever since.
1973: The Eckert and Mauchly ENIAC patent is invalidated.
1974: Vint Cerf and Bob Kahn publish a paper titled "A Protocol for Packet Network Intercommunication", describing an internetworking protocol for sharing resources using packet switching among the nodes. This work becomes the Internet protocol suite, informally known as TCP/IP.
1975: Frederick Brooks publishes The Mythical Man-Month: Essays on Software Engineering, a book on software engineering and project management.
1978: Brian Kernighan and Dennis Ritchie publish The C Programming Language ("K&R"), establishing a de facto standard for the C programming language. Later, in 1989, the American National Standards Institute publishes a standard for C (generally called "ANSI C" or "C89"). Development of C was started by Dennis Ritchie and Ken Thompson in 1969 in conjunction with the development of the UNIX operating system.
1978: University of North Carolina at Chapel Hill and Duke University publicly establish Usenet.
1981: IBM introduces the IBM PC.
1994: Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides, known as the "Gang of Four", publish Design Patterns: Elements of Reusable Object-Oriented Software. Subsequently, in 1999, they are found guilty by a two-thirds majority in a mock show trial held by The International Tribunal for the Prosecution of Crimes against Computer Science.
1995: NSFNET is decommissioned, removing the last restrictions on the use of the Internet to carry commercial traffic.
1999: Napster is released as an unstructured centralized peer-to-peer system.
1999: Darcy DiNucci coins the term Web 2.0 to describe World Wide Web sites that use technology beyond the static pages of earlier Web sites, including social networking sites, blogs, wikis, folksonomies, video sharing sites, hosted services, Web applications, and mashups. The term is later popularized by Tim O'Reilly at the O'Reilly Media Web 2.0 conference in late 2004.
1999: The Wi-Fi Alliance is formed as a trade association to hold the Wi-Fi trademark, under which most products are sold.
2012: The Raspberry Pi Foundation ships the first batch of Raspberry Pis, a credit-card-sized single-board computer.
2013: The Univalent Foundations Program at the Institute for Advanced Study publishes Homotopy Type Theory: Univalent Foundations of Mathematics, a collaboratively written book proposing a new foundation for mathematics with deep connections to type theory and programming languages.
If you made it this far you've survived my brief, relatively speaking, trip through five millennia of the human history of computing. It was hard to pick more recent things since we don't know which events will prove important for the future. I chose to end with Homotopy Type Theory since it is supposed to be the next big thing. Also you might have noticed that my title is similar to, perhaps an homage to, or just a rip-off of "A Brief, Incomplete, and Mostly Wrong History of Programming Languages". I had this idea prior to seeing that post, and for a long time it deterred me from writing this, but I finally got over the perception that he did it first since my original idea was quite different.
If you are still hungry for more, here are various references and influences on this post:
Columbia University Computing History
List of IEEE Milestones
Computer History Museum Timeline of Computer History
Wikipedia History of Computing
Wikipedia Timeline of Computing
PBS American Experience: Silicon Valley
PBS Nova’s Decoding Nazi Secrets
James Burke’s The Day the Universe Changed