14 August 2014

A Brief and Incomplete History of Computing from Ancient to Modern


I recently attended an event here in DC where Woz was the guest of honor. I was unaware of his book at the time. So leading up to it, I thought it would be fun to re-watch a couple of old documentaries in which he appears. I skipped his foray into reality TV. I chose to watch "The Machine That Changed the World" (youtube), a PBS documentary from 1992 that is farsighted enough not to feel completely dated, and its chronicle of the early history of computing is nicely done. The other was Robert X. Cringely's 1996 "Triumph of the Nerds" (youtube), a fun account of the rise of the personal computer.


ENIAC took up 1800 square feet


Watching these rekindled an idea I had to do a history of events from ancient to modern that got us to where we are now. So I set forth to collect and list notable events that propelled both the science and the industry of computing to their present state. My goal was to include the people and events that shaped not only the technology but also the industry and the language we use to describe things. One example is the term bug, which many people may take for granted. It is derived from a problem caused by an actual insect: a moth stuck in a relay. The other and perhaps more profound case is that the word "computer", a name we also take for granted, was a historical accident. Prior to 1940 a computer was a person who did computations or calculations. Many of the first machines, with the exception of the decryption work done at Bletchley Park, were developed for doing computations, specifically ENIAC, which was initially designed to calculate artillery firing tables. ENIAC became the first well-known computer and captured the public's imagination when the "Giant Brain" was displayed to the American press in 1946. Not long after that the word computer went from describing a person to describing a machine. Meanwhile the Colossus machine from Bletchley Park, where Alan Turing also did his codebreaking work, remained a secret until the 1970s. Had that work been made public immediately following the war, we might be calling these machines decoders or decrypters, and the history of the computer might have taken some different turns. Who knows, maybe even Silicon on Avon; okay, that's probably a stretch.


As I was chronicling these events, for which I tried to note when certain ideas and inventions occurred, I noticed some interesting trends. For example, some ideas, like the algorithm and robotics (or mechanical automata), are quite ancient, going back over a thousand years. Many of the technologies we use today date back about 30 to 50 years, and some are even older. Magnetic storage was patented in 1878, making it 19th century technology, and if you think about it, it really is fairly primitive; I've heard hard disks described as "spinning rust". Admittedly today's hard disks contain very sophisticated electronics and electromechanics, but they are still technologically primitive (effectively sophisticated 19th century technology) compared to SSD technology, which is 20th century technology, with DRAM dating from 1966 and NAND flash being invented around 1980. The real advances, not surprisingly, are the advances in materials science, fabrication, and manufacturing that take 1970s technological developments like active matrix display technology and make them small, cheap, and ubiquitous today.


There are many interesting people who played very important and pivotal roles but who seem to be largely forgotten. One of my biggest pet peeves is the cult of personality effect that shows up in our industry, and the worst case of it is probably Steve Jobs. I remember working with one young fanboy who was telling me how great Steve Jobs was and that Steve Jobs thought of the iPad in the 1970s, to which I replied that it was depicted in a movie called 2001 from the late 1960s, which I would wager Jobs had seen. I include that movie in my list of events because it does depict advanced computer technology. While Steve Jobs seems to garner more attention than he deserves, and I do credit Apple with certain events that shaped the industry as it is today, I think there are several more interesting people who have made much more substantial contributions. One is Robert Noyce, who cofounded Intel and invented the practical version of the integrated circuit that we use today; he was kind of a Jobs and Wozniak combo in terms of technical brilliance and business acumen. Another interesting story surrounds ARPANET, the proto-internet. While Al Gore may have tried to take credit for its invention, J. C. R. Licklider might have been the one who effectively did invent it. He convinced Bob Taylor of the idea that became ARPANET. Taylor is another of those people who deserves recognition: after he created the initial proposal for ARPANET he went on to guide computer science research at Xerox PARC for well over a decade from its inception. Also, Xerox PARC often seems to get the credit for much of our modern computing interfaces, but Douglas Engelbart really deserves that credit, as he demonstrated most of those ideas in 1968, two years before Xerox established PARC. He of course was a big influence on that work at PARC.


A big driving force for the creation of practical computer technology was the need for machines to do tedious, laborious mental work like large quantities of calculations, especially to create mathematical tables that gave the values of various computations such as logarithms, which were used extensively prior to the advent of affordable desktop and pocket scientific calculators in the 1970s. Babbage's work was aimed at solving this problem. During World War II the need for the creation of artillery tables exceeded the available human labor, which drove the military to fund the creation of ENIAC, which was not delivered until after the war. The war also drove the Bletchley Park work, again an attempt to automate the tedious work of manually deciphering the coded messages of the Enigma machine.



Big data circa 1950s. ~ 4GB of punched cards.

Big data is a common term today, but in many respects it has a history going back over a hundred years, and it could also be considered a driver of the development of practical computer technology. One continual source of big data was, and probably still is, the collection of census data. In 1889 Herman Hollerith developed an electromechanical tabulating machine that was used to process data for the 1890 US census. His 1889 patent had the title "Art of Compiling Statistics", and he later established a company to sell the machines, which were used worldwide for census and other applications. He is credited with laying the foundations of the modern information processing industry. His company was eventually merged with several others into what became IBM. In 1948 the US Census Bureau had a big data problem that was outgrowing the existing electromechanical technology. John Mauchly, one of the inventors of ENIAC and cofounder of the Eckert–Mauchly Computer Corporation, sold the Census Bureau on its successor, named UNIVAC, to handle census data. The company survives today as Unisys.


Grace Murray Hopper at the UNIVAC keyboard

A huge problem with software projects today is that of project failure, and the history of computing seems to have its share as well. Software projects sometimes fail outright, and all too often they are late and over budget. Babbage's Difference Engine project was canceled after nearly two decades and an expenditure of ten times the original grant. The failure was due to bad management on his part and what might be called scope creep, in that he got distracted by designing the more general Analytical Engine. UNIVAC is another notable early project that was delivered late and over budget.


The real challenge of a post like this is what to include. The Machine That Changed the World documentary starts with writing, which was our first precise technique for recording information. I went back farther to include the recording of numerical information. A big influence on the way I view history comes from a PBS show that aired in the eighties called Connections, which was done by James Burke. In it he traces, in a very engaging way, various technological developments as cause-and-effect events. In a sense he demonstrated that history was an interesting, living topic, not the banal languid mimeographs that I experienced in school. I tried to choose events that were either the inception of something or the point at which it caused a significant change. In the spirit of Connections there are definite threads here: the thread of calculation and computation, including number representation for those purposes; the thread of encryption also plays an important role, as does dealing with information in terms of storage, replication, and processing. Some of the threads are obvious and some are more subtle.


History of Computing from Ancient to Modern

Pre 3000 BCE: Tally Marks, a unary numeral system, are used to keep track of numerical quantities such as animal herd sizes.

c. 3200 BCE: Mesopotamian cuneiform appears. Writing may have been independently developed in other places as well. Although protowriting systems previously existed, society now had a more precise mechanism for recording information.


c. 3000 BCE: The development of the Egyptian hieroglyphic number system and the Sumerian/Babylonian base-60 (sexagesimal) number system marks the start of non-unary number systems.


2800 BCE: The appearance of the I Ching, which includes the Bagua or eight trigrams, is possibly the first occurrence of binary numbers.

2700–2300 BCE: The Sumerian sexagesimal abacus gives humans a mechanical aid for calculation.


c. 500 BCE: Pāṇini describes the grammar and morphology of Sanskrit.


c. 300 BCE: Euclid's Elements describes, among other things, an algorithm to compute the greatest common divisor.
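
Euclid's procedure survives essentially unchanged today: repeatedly replace the pair of numbers with the smaller number and the remainder of dividing by it, until the remainder is zero. A minimal sketch in Python, using the modern remainder-based form rather than Euclid's original subtraction-based description:

def gcd(a, b):
    # Replace (a, b) with (b, a mod b) until the remainder is zero;
    # the last non-zero value is the greatest common divisor.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21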

100 BCE: The Greeks develop the Antikythera mechanism, the first known mechanical computation device.


c. 5 BCE: The scytale transposition cipher is used by the Spartan military, although ciphers probably existed earlier.


c. 1 - 300 CE:  Hindu numeral system, later to become the Hindu–Arabic numeral system, is invented by Indian mathematicians.


c. 201 - 299 CE: Diophantus authors a series of books called Arithmetica, which deal with solving algebraic equations.


c. 801 - 873 CE: Al-Kindi develops cryptanalysis by frequency analysis.
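
Frequency analysis works because letters occur with uneven frequencies in natural language, so a simple substitution cipher leaks statistics. A minimal sketch of the counting step, written here in Python purely for illustration (the function and example text are my own):

from collections import Counter

def letter_frequencies(ciphertext):
    # Count only alphabetic characters, ignoring case.
    letters = [c.lower() for c in ciphertext if c.isalpha()]
    counts = Counter(letters)
    total = sum(counts.values())
    # Letters sorted from most to least frequent, with relative frequencies.
    return [(letter, count / total) for letter, count in counts.most_common()]

# The most frequent ciphertext letters are candidate matches for the most
# frequent letters of the target language (e.g. 'e' in English).
print(letter_frequencies("Wkh txlfn eurzq ira mxpsv ryhu wkh odcb grj"))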

c. 825 CE: Muḥammad ibn Mūsā al-Khwārizmī writes On the Calculation with Hindu Numerals, which helps spread the Indian system of numeration throughout the Middle East and Europe. It is translated into Latin as Algoritmi de numero Indorum; the Latin rendering of al-Khwārizmī's name, Algoritmi, gives us the term "algorithm".


850 CE: The Banu Musa publish The Book of Ingenious Devices, a large illustrated work on mechanical devices, including automata.


1202: Leonardo of Pisa (Fibonacci) introduces the Hindu–Arabic numbers to Europe in his book Liber Abaci.


c. 1283: The verge escapement is developed in Europe, allowing the development of all-mechanical clocks and other mechanical automata.

1450: Johannes Gutenberg introduces the movable-type printing press to Europe. Although printing was invented earlier in Asia, Gutenberg's press mechanizes the duplication of information and creates an industrial boom around the selling and dissemination of information.


c. 1624: The slide rule is developed to calculate logarithms.


1642: Blaise Pascal invents a mechanical calculator known as Pascal's calculator. It is later discovered that Wilhelm Schickard had independently described a similar calculating machine to Johannes Kepler in 1623.


1654: Pierre de Fermat’s and Blaise Pascal’s correspondence leads to mathematical methods of probability.


1679: Gottfried Leibniz invents the modern binary number system.

1694: Gottfried Wilhelm Leibniz completes the Stepped Reckoner, a digital mechanical calculator that can perform all four arithmetic operations: addition, subtraction, multiplication, and division.


1725: Basile Bouchon first uses a perforated paper tape to control a loom, a forerunner of punched cards.


1736: Leonhard Euler lays the foundation of graph theory and topology in the Seven Bridges of Königsberg problem.


1763: Two years after Thomas Bayes dies, his "An Essay towards solving a Problem in the Doctrine of Chances" is published in the Philosophical Transactions of the Royal Society of London.


1786: J. H. Müller, a Hessian army engineer, publishes a description of a difference engine.

1801: Joseph Marie Jacquard demonstrates a working mechanical loom, the Jacquard Loom, that uses punch cards based on previous work by Jacques Vaucanson and Basile Bouchon.


1805/1809: The least squares method is published by Legendre in 1805 and by Gauss in 1809. Both apply it to the problem of determining, from astronomical observations, the orbits of bodies about the Sun; it is the earliest form of linear regression.
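
In modern notation, for observations (x_i, y_i) the method fits a line $y = \beta_0 + \beta_1 x$ by minimizing the sum of squared residuals $\sum_i (y_i - \beta_0 - \beta_1 x_i)^2$; in the single-variable case the solution has the closed form (a standard statement, not the notation Legendre or Gauss used):

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}.$$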


1823: The British government gives Charles Babbage £1700 to start work on the Difference Engine proposed the previous year. In 1842, after a total expenditure of £17,000 without a deliverable, the project is canceled.


1832: Évariste Galois is killed in a duel, leaving behind a legacy in algebra (the beginnings of group theory) that is only recognized decades later.


1837: Charles Babbage describes a successor to the Difference Engine, the Analytical Engine, a proposed mechanical general-purpose computer and the first design for a general-purpose computer that could be described in modern terms as Turing-complete. Babbage receives assistance on this project from Ada Byron, Countess of Lovelace. It also serves to distract Babbage from completing the Difference Engine. Neither project is ever completed.

1844: Samuel Morse demonstrates the telegraph to Congress by transmitting the famous message "What hath God wrought" in Morse code over a wire from Washington to Baltimore.


1854: English mathematician George Boole publishes An Investigation of the Laws of Thought, laying the groundwork for what becomes Boolean algebra.


1878: Oberlin Smith patents magnetic storage in the form of wire recording.

1889: Herman Hollerith develops an electromechanical tabulating machine using punched cards for the United States 1890 census.


1889: Giuseppe Peano presents "The Principles of Arithmetic", based on the work of Dedekind; the formal notion of primitive recursion originates with Peano's axioms.


1897: Karl Ferdinand Braun builds the first cathode-ray tube (CRT) and cathode ray tube oscilloscope.


1900: German mathematician David Hilbert publishes a list of 23 unsolved problems.


1906: Andrey Markov publishes his first paper on Markov chains.


1907: The thermionic triode, a vacuum tube invented by Lee de Forest, propels the electronics age forward, enabling amplified radio technology and long-distance telephony.


1910: Alfred North Whitehead and Bertrand Russell publish Principia Mathematica, which introduces the idea of types.


1914: Spanish engineer Leonardo Torres y Quevedo designs an electro-mechanical version of the Analytical Engine of Charles Babbage which includes floating-point arithmetic.

1918: German engineer Arthur Scherbius applies for a patent on a rotor-based cipher machine, which gives rise to the Enigma machine, a family of related electro-mechanical rotor cipher machines used for enciphering and deciphering secret messages.


1925: Andrey Kolmogorov publishes "On the principle of the excluded middle". He supports most of Brouwer's results but disputes a few, and proves that under a certain interpretation all statements of classical formal logic can be formulated as statements of intuitionistic logic.


1928: David Hilbert formulates the Entscheidungsproblem, German for 'decision problem', building on his 2nd and 10th problems.


1931: Kurt Gödel publishes "On Formally Undecidable Propositions of "Principia Mathematica" and Related Systems".


1933: Andrey Kolmogorov publishes Foundations of the Theory of Probability, laying the modern axiomatic foundations of probability theory.


1936: Alonzo Church and Alan Turing publish independent papers showing that a general solution to the Entscheidungsproblem is impossible. Church's paper introduces the lambda calculus and Turing's paper introduces the ideas that become known as Turing machines.


1938: Konrad Zuse builds the Z1, a binary, electrically driven mechanical calculator. It is the first freely programmable computer, using Boolean logic and binary floating-point numbers.


1943: Warren McCulloch and Walter Pitts publish "A Logical Calculus of the Ideas Immanent in Nervous Activity" (pdf), making significant contributions to neural network theory, automata theory, the theory of computation, and cybernetics.


1944: The first Colossus computer, designed by British engineer Tommy Flowers, is operational at Bletchley Park. It was designed to break the complex Lorenz ciphers used by the Nazis during WWII and reduced the time to break Lorenz messages from weeks to hours.


1945: Samuel Eilenberg and Saunders Mac Lane introduce categories, functors, and natural transformations as part of their work in algebraic topology.


1945: John von Neumann publishes the "First Draft of a Report on the EDVAC", the first published description of the logical design of a computer using the stored-program concept, which comes to be known as the von Neumann architecture.

1946: J. Presper Eckert and John Mauchly found the Eckert–Mauchly Computer Corporation, the first commercial computer company.


1947: John Bardeen, Walter Brattain, and William Shockley invent the first point-contact transistor. Their Bell Labs colleague John R. Pierce coins the term transistor, a portmanteau of "transfer resistor".


1947: Freddie Williams and Tom Kilburn develop the Williams–Kilburn tube, the first random access memory, a cathode ray tube used as a computer memory to electronically store binary data.


1947: Grace Hopper's team records the first actual computer "bug", a moth found stuck in a relay of the Harvard Mark II.


1948: Claude Shannon publishes "A Mathematical Theory of Communication" as a two-part series in the Bell System Technical Journal, describing information entropy as a measure of the uncertainty in a message and essentially founding the field of information theory.
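
In modern notation, the entropy of a source that emits symbol i with probability $p_i$ is

$$H = -\sum_i p_i \log_2 p_i \quad \text{bits per symbol},$$

so a fair coin flip carries one bit of uncertainty while a heavily biased coin carries less.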


1948: Bernard M. Oliver, Claude Shannon, and John R. Pierce publish "The Philosophy of PCM", following patent applications by Oliver and Shannon in 1946 and by Pierce in 1945, both titled "Communication System Employing Pulse Code Modulation". All three are credited as inventors of PCM.


1948: The world's first stored-program computer, Manchester Small-Scale Experimental Machine (SSEM), nicknamed Baby, runs its first program.


1951: David A. Huffman devises Huffman coding for a term paper assigned by his MIT information theory professor Robert Fano: find the most efficient binary code.
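
The algorithm repeatedly merges the two least frequent symbols into a single node until one tree remains, then reads codes off the merge tree, so frequent symbols get short codes. A minimal sketch in Python (the function name and output format are my own):

import heapq
from collections import Counter

def huffman_codes(text):
    # One weighted leaf per symbol: [frequency, tiebreaker, {symbol: code}].
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # least frequent subtree
        hi = heapq.heappop(heap)   # next least frequent subtree
        # Prefix codes in the lower subtree with 0 and the higher with 1.
        merged = {sym: "0" + code for sym, code in lo[2].items()}
        merged.update({sym: "1" + code for sym, code in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], count, merged])
        count += 1
    return heap[0][2]

print(huffman_codes("abracadabra"))  # 'a' gets the shortest code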


1952: Tom Watson, Jr. becomes president of IBM and shifts the company’s direction away from electromechanical punched card systems towards electronic computers.

1953: Jay Forrester installs magnetic core memory on the Whirlwind computer.


1956: Stephen Cole Kleene describes regular languages using his mathematical notation called regular sets, thereby inventing regular expressions.
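
Kleene's regular sets are built from symbols using union, concatenation, and the star (repetition) operator, and today's regular expression engines are their direct descendants. A tiny illustrative example in Python (my own, far removed from Kleene's original notation):

import re

# (ab)* -- zero or more repetitions of "ab"; concatenation and the Kleene
# star are all that is needed to define this regular language.
pattern = re.compile(r"(ab)*")

for s in ["", "ab", "abab", "aba", "ba"]:
    print(repr(s), bool(pattern.fullmatch(s)))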


1956: Wen Tsing Chow invents Programmable Read Only Memory (PROM) at American Bosch Arma Corporation at the request of the United States Air Force to come up with a more flexible and secure way of storing the targeting constants in the Atlas E/F ICBM's airborne digital computer.


1956: IBM introduces the IBM 305 RAMAC, the first commercial computer using a moving-head hard disk drive (magnetic disk storage) for secondary storage.


1956: William Shockley moves back to California to be close to his aging mother in Palo Alto and starts Shockley Semiconductor Laboratory in a small commercial lot in nearby Mountain View, setting the agrarian area, with its cheap land and fruit orchards, on the path to becoming known as Silicon Valley.


1957: Robert Noyce, Gordon Moore and six others known as the "traitorous eight" leave Shockley Semiconductor Laboratory to start Fairchild Semiconductor.


1957: Noam Chomsky publishes Syntactic Structures, which includes the ideas of phrase structure grammar and of a finite-state, communication-theoretic model based on a conception of language as a Markov process.


1957: John Backus creates the first FORTRAN compiler at IBM.


1957: Frank Rosenblatt invents the Perceptron at the Cornell Aeronautical Laboratory with naval research funding. He publishes "The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain" in 1958. It is initially implemented in software for the IBM 704 and subsequently implemented in custom-built hardware as the "Mark 1 Perceptron".


1958: John McCarthy develops Lisp, which is based on Alonzo Church's lambda calculus. It also includes the new idea of a conditional (if) expression.

1958: Jack Kilby develops the integrated circuit made from germanium and gold wires at Texas Instruments.


1959: Jean Hoerni develops the planar transistor and the planar manufacturing process at Fairchild Semiconductor.


1959: Robert Noyce develops a more practical silicon version of the integrated circuit which has the "wires" between components built directly into the substrate.


1959: Noam Chomsky publishes "On Certain Formal Properties of Grammars", which describes the Chomsky hierarchy and its relation to automata theory.


1959: John Backus publishes "The Syntax and Semantics of the Proposed International Algebraic Language of the Zurich ACM-GAMM Conference" in the UNESCO Proceedings of the International Conference on Information Processing. It introduces a notation for describing the syntax of programming languages, specifically context-free grammars; the notation later becomes known as Backus–Naur Form (BNF).


1959: Arthur Samuel defines machine learning as a "Field of study that gives computers the ability to learn without being explicitly programmed".

1962: The Radio Sector of the EIA introduces RS-232.


1963: The American Standards Association publishes the first edition of ASCII, the American Standard Code for Information Interchange.


1964: The CDC 6600 is released. Designed by Seymour Cray, it is dubbed a supercomputer.


1964: MIT, General Electric, and Bell Labs form a consortium to develop the Multics Operating System.


1964: John G. Kemeny and Thomas E. Kurtz create the BASIC language to enable students in fields other than science and mathematics to use computers.


1965: Gordon Moore describes a trend in the paper "Cramming more components onto integrated circuits" which becomes known as Moore’s Law.


1966: Robert Dennard invents DRAM at the IBM Thomas J. Watson Research Center.


1968: Douglas Engelbart gives "The Mother of All Demos", a demonstration that introduces a complete computer hardware/software system called the oN-Line System (NLS). It demonstrates many fundamental elements of modern personal computing: multiple windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a collaborative real-time editor (collaborative work).


1968: Edsger Dijkstra's letter "Go To Statement Considered Harmful" is published in Communications of the ACM, helping pave the way for structured programming.

1968: The Apollo Guidance Computer makes its debut orbiting the Earth on Apollo 7. The Cold War and the Apollo program drive the demand for lighter, more reliable computers using integrated circuits.


1968: ARPA approves Bob Taylor's plan for a computer network. Taylor, along with Ivan Sutherland, was originally convinced of the idea by J. C. R. Licklider, who initially referred to the concept as an "Intergalactic Computer Network". Sutherland and Taylor envisioned a computer communications network that would, in part, allow researchers using computers provided to various corporate and academic sites by ARPA to make new software and other computer science results quickly and widely available. The project becomes known as ARPANET.


1968: Arthur C. Clarke and Stanley Kubrick release 2001: A Space Odyssey a science fiction movie that features advanced computing technology including an AI computer named the HAL 9000 with a voice interface that manages ship operations. The astronauts use multimedia tablet technology (NewsPad) to catch up on family and worldwide events from Earth.


1968: Gordon E. Moore and Robert Noyce found Intel.


1969: Willard Boyle and George E. Smith invent the charge-coupled device (CCD) at AT&T Bell Labs.


1969: Ken Thompson creates his game Space Travel on the Multics Operating System.


1969: Bell Labs pulls out of the Multics consortium.

1970: Ken Thompson and Dennis Ritchie create a new, lighter-weight "programmers' workbench" operating system with many of the robust features of Multics. The new operating system runs its first application: Space Travel. It is named Unics, a play on the Multics name, and later becomes known as Unix.


1970: Edgar F. Codd publishes "A Relational Model of Data for Large Shared Data Banks" which establishes a relational model for databases.


1970: Xerox opens the Palo Alto Research Center (PARC), which attracts some of the United States' top computer scientists. Under Bob Taylor's guidance as associate manager it produces many groundbreaking inventions that later transform computing, including the laser printer, computer-generated bitmap graphics, the WYSIWYG text editor, Interpress (a resolution-independent graphical page-description language and the precursor to PostScript), and the Model–view–controller software architecture.

1971: IBM introduces the first commercially available 8-inch floppy diskette.


1971: Intel introduces the first commercially available microprocessor, the 4-bit Intel 4004.


1971: Raymond Tomlinson implements the first email system able to send mail between users on different hosts connected to the ARPANET. To achieve this, he uses the @ sign to separate the user from their machine, a convention used in email addresses ever since.


1972: Peter Brody's team at Westinghouse produces the first active-matrix liquid-crystal display panel.

1972: Hewlett-Packard introduces the HP-35, the world's first scientific pocket calculator.


1973: The Eckert–Mauchly ENIAC patent is invalidated.


1973: Robert Metcalfe devises the Ethernet method of network connection at Xerox PARC.


1973: Gary Kildall develops CP/M which stands for "Control Program/Monitor".


1974: Vint Cerf and Bob Kahn publish a paper titled "A Protocol for Packet Network Intercommunication" describing an internetworking protocol for sharing resources using packet switching among the nodes; it becomes known as the Internet Protocol Suite, informally TCP/IP.


1975: The January edition of Popular Electronics features the MITS Altair 8800 computer kit, based on Intel's 8080 microprocessor, on its cover.


1975: Bill Gates and Paul Allen found Microsoft to develop and sell BASIC interpreters for the Altair 8800.


1975: Softlab Munich releases Maestro I the world's first integrated development environment (IDE).


1975: Frederick Brooks publishes The Mythical Man-Month: Essays on Software Engineering, a book on software engineering and project management.


1976: Kenneth Appel and Wolfgang Haken prove the Four Color Theorem, the first major mathematical theorem to be proved with the aid of a computer.


1977: Ron Rivest, Adi Shamir, and Leonard Adleman publicly describe the RSA cryptosystem, the first practicable public-key cryptosystem.


1977: Apple introduces the Apple II. Designed primarily by Steve Wozniak, it is one of the first highly successful mass-produced microcomputer products.

1977: Atari, Inc. releases the Atari 2600, a microprocessor-based video game console with ROM cartridges containing game code.


1978: Brian Kernighan and Dennis Ritchie publish The C Programming Language ("K&R"), establishing a de facto standard for the C programming language; in 1989 the American National Standards Institute publishes a standard for C (generally called "ANSI C" or "C89"). C grew out of work by Dennis Ritchie and Ken Thompson that began in 1969 in conjunction with the development of the UNIX operating system.


1978: University of North Carolina at Chapel Hill and Duke University publicly establish Usenet.


1979: Dan Bricklin and Bob Frankston found Software Arts, Inc., and begin selling VisiCalc.


1980: Smalltalk (Smalltalk-80), a Xerox PARC project led by Alan Kay since the early 1970s, is released to the public.


c. 1980: Fujio Masuoka, while working for Toshiba, invents both NOR and NAND types of flash memory.


1981: IBM introduces the IBM PC.

1984: Apple Computer launches the Macintosh. Based on the Motorola 68000 microprocessor, it is the first commercially successful mouse-driven computer with a graphical user interface.


1984: Terry Welch publishes the Lempel–Ziv–Welch (LZW) compression algorithm, building on the earlier LZ78 work of Abraham Lempel and Jacob Ziv.
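
LZW builds a dictionary of previously seen phrases on the fly and emits dictionary indices instead of raw characters, so repeated substrings collapse into single codes. A minimal sketch of the compression side in Python (the function name and the single-character starting dictionary are my own simplifications):

def lzw_compress(data):
    # Start with a dictionary of all single characters (byte values 0-255).
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    current = ""
    output = []
    for ch in data:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate                  # keep extending a known phrase
        else:
            output.append(dictionary[current])   # emit longest known phrase
            dictionary[candidate] = next_code    # learn the new phrase
            next_code += 1
            current = ch
    if current:
        output.append(dictionary[current])
    return output

print(lzw_compress("TOBEORNOTTOBEORTOBEORNOT"))  # repeated phrases become single codes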


1985: Bjarne Stroustrup publishes The C++ Programming Language.


1989: Tim Berners-Lee proposes the "WorldWideWeb" project and builds the first Web server and Web client using HTTP and HTML, laying the foundation for what becomes the World Wide Web.


1991: Linus Torvalds releases Linux to several Usenet newsgroups.


1991: Eugenio Moggi publishes "Notions of Computation and Monads" (pdf) which describes the general use of monads to structure programs.

1992: The first JPEG standard and the MPEG-1 standard are released.


1993: Marc Andreessen and Eric Bina release the Mosaic Web Browser.


1994: Jeff Bezos founds Amazon.com.


1994: Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides, known as the "Gang of Four", publish Design Patterns: Elements of Reusable Object-Oriented Software. Subsequently, in 1999, they are found guilty by the International Tribunal for the Prosecution of Crimes against Computer Science by a two-thirds majority.


1995: NSFNET is decommissioned, removing the last restrictions on the use of the Internet to carry commercial traffic.


1996: Larry Page and Sergey Brin develop the PageRank algorithm and found Google two years later.
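
PageRank models a random surfer who usually follows links and occasionally jumps to a random page; a page's rank is its long-run visit probability. A small power-iteration sketch in Python (the toy graph, damping factor, and iteration count are illustrative choices, not anything from Google's actual system):

def pagerank(links, damping=0.85, iterations=50):
    # links maps each page to the list of pages it links to.
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue                     # dangling pages ignored in this sketch
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share    # each outlink passes on an equal share
        rank = new_rank
    return rank

# A toy three-page web: A and C link to B, and B links back to A.
print(pagerank({"A": ["B"], "B": ["A"], "C": ["B"]}))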


1997: Nullsoft releases Winamp.


1998: Diamond Multimedia releases the Rio PMP300, the first commercially successful portable consumer MP3 digital audio player.


1999: Napster is released as an unstructured centralized peer-to-peer system.


1999: Darcy DiNucci coins the term Web 2.0 to describe World Wide Web sites that use technology beyond the static pages of earlier Web sites, including social networking sites, blogs, wikis, folksonomies, video sharing sites, hosted services, Web applications, and mashups. The term is later popularized by Tim O'Reilly at the O'Reilly Media Web 2.0 conference in late 2004.


1999: The Wi-Fi Alliance is formed as a trade association to hold the Wi-Fi trademark under which most products are sold.


2001: Jimmy Wales and Larry Sanger launch Wikipedia.


2004: Google acquires Sydney-based company Where 2 Technologies and transforms its technology into the web application Google Maps.


2006: Amazon launches Amazon Web Services.


2007: Apple releases the iPhone, setting the design template for the modern smartphone.


2010: Apple releases the iPad, giving tablet computing, an idea that had been developing for over 20 years, its current character.


2010: Siri, Inc. releases Siri on iOS; the company is subsequently acquired by Apple.


2012: The LHC Computing Grid becomes the world's largest computing grid, comprising over 170 computing facilities in a worldwide network across 36 countries.

2012: The Raspberry Pi Foundation ships the first batch of Raspberry Pis, a credit-card-sized single-board computer.


2013: The Univalent Foundations Program at the Institute for Advanced Study organizes 40 authors to release The HoTT Book on Homotopy Type Theory.


If you made it this far you've survived my brief, relatively speaking, trip through five millennia of the human history of computing. It was hard to pick more recent things since we don't know which events will turn out to be important for the future. I chose to end with Homotopy Type Theory since it is supposed to be the next big thing. Also you might have noticed that my title is similar, perhaps an homage to, or just a rip-off of, "A Brief, Incomplete, and Mostly Wrong History of Programming Languages". I had this idea prior to seeing that post, and for a long time it deterred me from writing this, but I finally got over the perception that he did it first, since my original idea was quite different.


If you are still hungry for more including various references and influences on this post:


History of Computing

Columbia University Computing History

List of IEEE Milestones

History of Lossless Data Compression Algorithms

John von Neumann, EDVAC, and the IAS machine

Computer History Museum Timeline of Computer History

Wikipedia History of Computing

Wikipedia Timeline of Computing

PBS American Experience: Silicon Valley

PBS Nova’s Decoding Nazi Secrets

James Burke’s The Day the Universe Changed

23 July 2014

A Survey of Graph Theory and Applications in Neo4J


A while ago I had the idea to do a graph theory talk at our local Neo4J meetup, the Baltimore Washington Graph Database Meetup, especially since Neo4J and graph databases are based on graph theory. So I finally sat down and spent quite a bit of time putting a talk together. I decided to go with an overview of graph theory, with a primary focus on the theoretical aspects, along with some applications like network theory and some specific examples of applications in Neo4J. My talk is titled "A Survey of Graph Theory and Applications in Neo4J" and I presented it on June 24; see below for video, slides, and references. This post is my thoughts on the talk and the things leading up to its creation.

Some of my blog posts are created from a desire to make math more accessible and more relevant, especially those areas of math that seem to be neglected by the traditional educational system. One huge gaping hole in my math education, both in our American institutional learning facilities and in my college computer science curriculum, was that I was never exposed to graph theory. Once you start studying graph theory you realize that many of these concepts could be taught in middle and elementary schools. As I was preparing my presentation, I saw this: "Math for seven-year-olds: graph coloring, chromatic numbers, and Eulerian paths and circuits", reaffirming what I already knew. Sadly many of these concepts were new to me as an adult, and I'd guess the same was probably true of much of my audience.

Graph theory is a topic I have written about on this blog before in two posts: a general math post, "Triangles, Triangular Numbers, and the Adjacency Matrix", and "The Object Graph", which attempts to put it partly in a programming context. One post or series of posts that I have wanted to do is an introduction/overview of graph theory. I was thinking that my talk could be turned into those posts. However, I am not sure if and when I will get a chance to do that, so I figured until then I will make the talk itself that post, and that post is this post.

I found myself torn about publishing the video of my presentation; on one hand I feel that it could be much better, but on the other, as a friend pointed out, people might find it informative and enjoy watching it. That being said, I reserve the right to replace it with a better version if I ever do one.

Writing a post about the talk provides me with an opportunity to make a few corrections, add some more source material, and mention a couple of things that I would have liked to include but that got cut in the interest of time.

I'll start by adding some additional source material. I followed the introductory remarks with a brief historical overview. For that, especially the 20th century contributors, I made use of these PowerPoint slides from Jon Crowcroft's "Principles of Communications" course. I also referenced Graph Theory, 1736-1936 by Norman L. Biggs, E. Keith Lloyd, and Robin J. Wilson. I mentioned the Watts and Strogatz model (pdf of the original Nature paper). I also included Fan Chung, who is married to Ronald Graham and is known for her work in graph theory, especially spectral graph theory. I mentioned snarks, not to be confused with Lewis Carroll's snark, from which the graph-theoretic name was taken. When I was talking about the topology part I mentioned Euler's Gem. Since the World Cup was going on during the talk I have to include these topology links: "The Topology and Combinatorics of Soccer Balls" (pdf) and some more "soccer ball math".

Due to time constraints I cut a few topics. Two of them were graph isomorphisms and graph automorphisms, along with related ideas like graph families defined by their automorphisms and the automorphism group. Another area that I didn't really get into, except for the linear algebra parts, was algebraic graph theory.

I did want to offer up a couple of corrections: I called a truncated icosahedron a dodecahedron, how embarrassing. I called the Travelling Salesman Problem the Travelling Salesman Program, doh! When I said a bipartite graph has no connections between edges, it should have been vertices. And finally I misstated Kirchhoff's theorem: it should be the n-1 non-zero eigenvalues, obviously you only use the non-zero ones! There are probably more, and if I decide to replace these videos in the future, I will just replace this paragraph as well.
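
For the record, the matrix-tree (Kirchhoff's) theorem in its standard form: if $\lambda_1, \lambda_2, \ldots, \lambda_{n-1}$ are the non-zero eigenvalues of the Laplacian matrix of a connected graph $G$ on $n$ vertices, then the number of spanning trees is

$$t(G) = \frac{1}{n}\,\lambda_1 \lambda_2 \cdots \lambda_{n-1}.$$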

On a lighter note, when I was creating my "Graphs are Everywhere" images and sharing them with my friend Wes, I created this one for fun:


Finally I'd like to thank our meetup sponsor Neo Technology for the refreshments. The part where we thanked them at the meetup didn't get recorded.


Videos


Part 1




Part 2





References and Further Reading



15 March 2014

Are Your Developers Treating Your Project as Their Own Personal Science Project?


Years ago I worked on a project that had a custom role-based security system, and for this system we needed a way to load menus according to users' assigned roles. Two of the dominant developers on the project decided that this was a perfect use for AJAX, mainly because they wanted to learn and use AJAX. So they built a separately deployable menu application that served up the menu by emitting XML from a Spring controller URL; it was very comparable to a simple RESTful service. It was called directly from JavaScript in the application's web page. Since there would be a lot of calls for any given user's menu, they built a cache for each user's menu options, a custom cache where objects never expired. This work was completed as I joined the project, and as I got acquainted with the project my first thought was: why do all this work? Why not just load the menu when the user is authenticated and then cache it in the HTTP session? I raised the issue but was ignored. It was their pet project and it was sacrosanct.


Later, while testing the apps that used the AJAX menu call, the testers ran into a problem: the cache was interfering with testing because it was keeping database changes from being propagated, and testing the applications created a need to continuously restart the AJAX menu app. So a new admin interface was added to the AJAX menu application to allow the testers to clear the cache. This created a whole separate application that the client would have to support just to enable this unnecessary AJAX menu delivery mechanism. Instead of just caching the data in the HTTP session, which is easier and where any cache problem could be fixed by simply logging out and logging back in, these two developers built a pet project consisting of an extra application and codebase that has to be understood and maintained. Ultimately the biggest problem this creates boils down to simple economics: it is more expensive to maintain a solution that requires a whole separate application.


I think there is a tendency for all developers, both good and bad, to sometimes want to make projects more interesting, and there can be a lot of temptations: the temptation to create a custom implementation of something that could be found already written because it's a fun project, the temptation to shoehorn a hot new technology into a project, the temptation to solve a problem that doesn't need to be solved, or just straight-up over-engineering. I am guilty of some of these transgressions myself over the years.


My anecdote about the menu app is just one of at least a dozen egregious examples of these types of software project blunders that I have seen over the years. It shows one manifestation of this problem, and it also shows that these types of software project issues have a real, tangible cost. They create more code and often more complex code. In some cases they create the need for specialized technical knowledge that wasn't even necessary for the project. While these approaches may only result in a small cost increase during the construction of the software, the real consequence is the long-term maintenance cost of a software system that was needlessly overcomplicated. In some cases these costs will be staffing costs, but they can also be time costs, in that changes and maintenance take considerably longer. These problems may also result in costs to the business, as unwieldy systems cause the loss of customers to competitors and can impede the ability to gain new customers.


These types of decisions can be viewed as software project blunders. In some cases the developers know better but choose to be selfish, putting their own desires above the project; in those cases the decisions can potentially be viewed as unethical. The two developers from the anecdote were extreme examples and were pushing into unethical territory, since they treated every new project as a personal opportunity to use some new technique or library or some custom idea that they were interested in. However, I think most of these blunders are simply that: mistakes in judgment or a lack of good oversight. It is always easier in hindsight to see these problems, but when one is in the heat of the moment on the project one can lose perspective. The real challenge is trying to avoid these blunders. Really these are problems that fall into a larger spectrum of software project management problems, and sadly I have mostly seen terrible software project management throughout my career.


I wish I could say I have a solution to this problem. I think developers have some responsibility here, especially senior developers, well at least the good ones. Software development is really about decisions. Developers make many decisions every day, including naming, project structure, tools, libraries, frameworks, languages, etc. Too often these decisions are made in isolation without any review, and sometimes they can have serious long-term consequences. When making so many decisions in such a complex domain, mistakes are inevitable. Good software developers and good teams make an effort to deliberate and review their choices.