Every revolution begins with ideas. The Digital Revolution is no different. The influence of these ideas on the most mundane practices in society is often forgotten. What are mere utilities and commodities for us today were remarkable innovations long ago. The technologies we consume and enjoy would not exist without the seminal proposals of great thinkers. Here is a brief survey of seven earthshaking manifestos from the annals of computing history.

“Notes by the Translator,” Ada Lovelace, 1843

Ada Lovelace did for Charles Babbage what he could not do alone. She created a philosophical framework for a monumental effort in technology, and she proposed concepts for data processing that were at the forefront of digital computing. Not only did she introduce Babbage's Analytical Engine to the scientific community, but she also presented concepts that foreshadowed modern computer programming.

Attached as an addendum to her translation of L.F. Menabrea's “Sketch of the Analytical Engine,” Ada's “Notes” included what some consider to be the first computer program (in Note G, a table called “Diagram for the computation by the Engine of the Numbers of Bernoulli”). She considered how computing might do much more than just tabulate. The engine would be “algebraical and analytical,” working with any subject. Ada laid the foundation for a general-purpose computing machine.
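For the curious, the quantities her Note G table targets can be sketched in a few lines of modern Python, using the standard recurrence for the Bernoulli numbers rather than Lovelace's exact sequence of Engine operations:

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers via the standard recurrence
#   sum_{j=0}^{m} C(m+1, j) * B_j = 0   (for m >= 1),
# a modern restatement, not Lovelace's actual table of Engine operations.
def bernoulli(n):
    B = [Fraction(1)]                # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))       # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```

Exact rational arithmetic (`Fraction`) keeps the results clean; the Engine, of course, had to carry out each multiplication and division as an explicit sequence of operations.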

“On Computable Numbers,” Alan Turing, 1936

Alan Turing's work is an exploration of the capabilities and limitations of computing. He proposed that “it is possible to invent a single machine which can be used to compute any computable sequence.” Following a set of rules (what we would now call an algorithm or a program), an “a-machine” (automatic machine) would be able to solve the most complicated of problems.

The “Turing Machine” was a mathematical model created to address the Entscheidungsproblem, or “decision problem,” of David Hilbert: is there a general procedure for deciding whether a given mathematical assertion is provable? Lying in a meadow on a break from one of his long runs, Turing conjured up an imaginary machine to work through the problem.

The thought experiment involved an unlimited length of paper tape and a movable head. On the tape were symbols within squares. Based on current machine states and a set of instructions, the head would read the symbols within the squares, write to them, and then move to the left or the right. The solution is what is written on the tape once the machine halts.
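That tape-and-head cycle is simple enough to sketch in a few lines of Python. The simulator below uses an invented rule format, not Turing's original notation; its rule table increments a binary number written on the tape:

```python
# A minimal sketch of Turing's tape-and-head model (rule format invented
# for illustration). Each step: read the symbol under the head, look up
# (state, symbol) in the rule table, write a symbol, move left or right.
def run_turing_machine(tape, rules, state="start", head=0, max_steps=1000):
    tape = dict(enumerate(tape))          # sparse tape; blank cells are "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]   # (write, move, next)
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = sorted(tape)
    return "".join(tape[i] for i in range(cells[0], cells[-1] + 1)).strip("_")

# Binary increment: scan right to the end of the number, then carry left.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "done"),
    ("carry", "_"): ("1", "L", "done"),
    ("done", "0"): ("0", "L", "done"),
    ("done", "1"): ("1", "L", "done"),
    ("done", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", rules))  # 1011 + 1 = 1100
```

The answer is simply whatever remains on the tape when the machine halts, just as Turing described.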

Turing's “a-machine” could manipulate a string of ones and zeros to solve any problem that can be reduced to a definite procedure. This machine, he showed, could complete any calculation of computable numbers. The demonstration that coded instructions could control a machine's functions provided the basis for modern computing.

“First Draft of a Report on the EDVAC,” John von Neumann, 1945

John von Neumann is credited with the “Von Neumann Architecture” presented here. The reader will recognize that the plan is used in modern computers, but with renamed components. The divisions of the architecture were these:

  1. Central Arithmetic Part: CA
  2. Central Control Part: CC
  3. Memory: M
  4. Input: I
  5. Output: O
  6. Recording Media: R
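In modern terms, the plan describes a stored-program machine. A minimal sketch, with an instruction set invented here purely for illustration, might look like this:

```python
# A toy stored-program machine in the spirit of the EDVAC plan. The
# instruction set is invented for illustration: program and data share
# one memory (M); the control part (CC) fetches and decodes; the
# arithmetic part (CA) does the work.
def run(memory):
    acc, pc = 0, 0                # accumulator (CA) and program counter (CC)
    while True:
        op, arg = memory[pc]      # fetch from the shared memory (M)
        pc += 1
        if op == "LOAD":          # M -> CA
            acc = memory[arg]
        elif op == "ADD":         # arithmetic in CA
            acc += memory[arg]
        elif op == "STORE":       # CA -> M
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Adds the numbers in cells 5 and 6, leaving the sum in cell 7.
program = {
    0: ("LOAD", 5),
    1: ("ADD", 6),
    2: ("STORE", 7),
    3: ("HALT", None),
    5: 2, 6: 3, 7: 0,
}
print(run(program)[7])  # 5
```

The essential idea, still at the heart of every modern CPU, is that instructions live in the same memory as data and are fetched and executed one after another.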

Von Neumann wanted the computer to imitate the neurons of the brain. He wrote about “excitatory synapse,” “inhibitory synapse” and “synaptic delay,” and noted that “simplified neuron functions can be imitated by telegraphic relays or by vacuum tubes.” His purpose was to describe “a very high speed automatic digital computing system.” The document caused so much excitement that it was distributed worldwide before a final draft was ever completed.

“As We May Think,” Vannevar Bush, 1945

The editor of the Atlantic wrote that “this paper by Dr. Bush calls for a new relationship between thinking man and the sum of our knowledge.” In fact, he proposed that human reason can be extended by scientific means. “It is readily possible to construct a machine which will manipulate premises in accordance with formal logic, simply by the clever use of relay circuits.”

Bush noted that science had increased our control of the material environment and improved communication between individuals. Yet specialization made it more difficult to bridge between disciplines. He was concerned about managing the “mountain of research” produced by scientists of various fields. How could a record be stored and continuously extended so that it becomes useful to the larger community? “There will always be plenty of things to compute in the detailed affairs of millions of people doing complicated things,” he wrote.

Far ahead of his time, Bush postulated a “memex” machine that would incorporate many aspects of what we know today as the personal computer. He described how a device for individual use could store all sorts of data, including books, pictures, newspapers and periodicals. Man's complex civilization would benefit from mechanized records to help him solve problems. In the course of his paper, Bush foreshadowed technologies that are common today, such as hypertext, file sharing, the internet, personal computing and speech recognition.

“Man-Computer Symbiosis,” J.C.R. Licklider, 1960

Licklider combined his expertise in psychology and mathematics to produce this seminal treatise on interactive computing. His hope was that “human brains and computing machines will be coupled together very tightly.” He saw this symbiosis as an interim period, something between mechanically extended man and artificial intelligence, when advances would be made by man and machine working together, despite differences in speed and language.

He discussed various possibilities for input and output, including a desk-surface display and control, a wall display, and automatic speech production and recognition. He also wrote about “indelible memory” and “published memory” (today's RAM and ROM). Licklider envisioned a “thinking center” that would be available for data storage and retrieval. These were concepts of the computer revolution.

“Augmenting Human Intellect: A Conceptual Framework,” Doug Engelbart, 1962

Engelbart presented his vision of the power of computers to extend human intellectual capabilities. Man's problems were becoming more complex, so he called for the augmentation of mental processes. His was a systems approach. He saw that man's intellect includes concept manipulation, symbol manipulation and manual external symbol manipulation. The next step in the evolution would be automated external symbol manipulation: a computer that would execute processes in automatic response to human direction.

“Augmentation is fundamentally a matter of organization.” Through the structuring of concepts, symbols and processes, it would be possible to improve intellectual power. He commended Vannevar Bush for his ideas on the augmentation of the individual intellectual worker, and he made an open plea to researchers to contribute to the discipline of intellect augmentation.

“Spacewar,” Stewart Brand, 1972

At the hippie-hacker intersection of cultures, Stewart Brand writes in Rolling Stone magazine about how computing was evolving. “Ready or not, computers are coming to the people. That's good news, maybe the best since psychedelics.” He tells the story of the development of Spacewar, one of the earliest video games. Les Earnest of Stanford's Artificial Intelligence (AI) Laboratory said that “sometimes it's hard to tell the difference between recreation and work, happily.”

In relaxed environments (such as the Bean-Bag Room at Xerox PARC), revolutionaries of technology and culture intermingled and created. Much of it was after hours: “Back then it was just kids staying up all night.” Spacewar was a “flawless crystal ball of things to come,” according to Brand. It was interactive, bonded human and machine, functioned on stand-alone equipment, served human interest – and it was a “delightful” game.

Brand wrote about “the freedom to explore” in the new tech culture. He envisioned music swapping and online news services that would do away with newspapers and record stores. He talked about computers bringing people together in a common cause. “I think it's important to bring computing to the people,” he said.

Conclusion

A glimpse at these important writings makes it clear that the development of digital computing was not inadvertent. Considerable thought preceded advancements, and the ideas that were disseminated gave impetus to further progress. Before application, there is invention, and before invention, there are ideas.