We use computers every day: in the office, at home, on the go. We rely on them for productivity, for entertainment, for communication. We tap on them at our desks, carry them in our hands and make use of them in our appliances. In recognition of the achievements that have led to today's digital environment, this article discusses some selected milestones in computing history.
The Engines of Charles Babbage
We generally think of the computer as a 20th-century invention, but in the broadest terms, computing has been around for thousands of years. From clay tokens to the abacus, tradesmen have used various methods for counting and calculation. Then, with the engines of Charles Babbage, computing made a giant design leap. Using the “science of operations,” machines would do much more than just tabulate.
Confounded by a host of errors in the mathematical tables of the Nautical Almanac, the student Charles Babbage cried out to his colleague, “I wish to God these calculations had been executed by steam!” Babbage dared to contemplate the idea that practical mathematics could be accomplished by mechanical means. Moving forward on a bold project to implement his vision, Babbage introduced his Difference Engine in 1822 at an Astronomical Society meeting. He soon ran into problems. The design called for some 25,000 handmade mechanical parts. Production delays and a contractual dispute with his chief engineer killed the project.
Babbage's next effort was the Analytical Engine, a general-purpose computing machine that would use punch cards, borrowing technology from the silk-weaving industry. But the government had lost patience with the inventor and was unwilling to fund the project. Ada Lovelace, daughter of Lord Byron, made tremendous contributions to computing in her published notes about the machine. Though never finished, the Analytical Engine design marked a transition in digital computing, demonstrating that machines could be tasked with much more than simple numerical operations.
The Turing Machine
It all started as a thought experiment while Alan Turing was lying on his back in a meadow, scanning the sky and exploring great possibilities. He turned his imagination to the “decision problem” of David Hilbert, which asked whether there was a definite method for deciding whether any given mathematical statement could be proved. He wondered whether a “mechanical process” might settle the question.
Turing envisioned a machine that could perform calculations on an endless ribbon of paper. He determined that, by use of the symbol 1 in conjunction with a blank, it would be possible for the machine to complete any mathematical assignment on “computable numbers.” The Turing Machine (a theoretical device that was never actually built) demonstrated the tremendous power of computational devices to deal with great complexities. “It is possible to invent a single machine which can be used to compute any computable sequence,” Turing wrote.
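To make the idea concrete, here is a minimal sketch in Python; the rule table and the alternating 1-and-blank pattern are a toy example of my own, not Turing's original formulation. A small set of rules tells the simulated machine what to write on its tape, which way to move and which state to enter next.

```python
# A toy Turing machine simulator. The tape is a dictionary mapping positions to
# symbols; any cell that was never written counts as a blank ("").
def run_turing_machine(rules, tape, state="start", steps=100):
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, "")            # read the current cell
        write, move, state = rules[(state, symbol)]
        tape[head] = write                     # write the new symbol
        head += 1 if move == "R" else -1       # move the head

    return tape

# Example rules: starting on a blank tape, print the pattern 1, blank, 1, blank...
rules = {
    ("start", ""): ("1", "R", "skip"),   # write a 1, move right
    ("skip",  ""): ("",  "R", "start"),  # leave a blank, move right
}

tape = run_turing_machine(rules, {}, steps=10)
print([tape.get(i, "") for i in range(10)])    # ['1', '', '1', '', '1', ...]
```

The point is not the particular pattern but the principle: a fixed, purely mechanical rule table is enough to carry out an open-ended computation, and, as Turing argued, one suitably general table can imitate any other.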
Von Neumann and the Stored Program Computer
A major step forward in computing, the architecture proposed by John von Neumann provided that program instructions would be stored in memory. In a von Neumann computer, processing and storage units are separate, and programs and data are stored in and retrieved from the same memory unit. In today's terms, the central processing unit (CPU) obtains its instructions from programs on a storage disk, and it reads and writes data files on that same disk.
John Mauchly, when writing about his projects, said that “there would be only ONE storage device (with addressable locations) for the ENTIRE EDVAC….” The stored-program architecture of von Neumann, according to some accounts, became the incarnation of the Turing Machine, with limitless possibilities. Soon the dream of a general-purpose computational machine would become reality.
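As a rough illustration of the stored-program idea, here is a Python sketch; the tiny LOAD/ADD/STORE/HALT instruction set and the memory layout are invented for the example and are not drawn from the EDVAC design. Program and data sit side by side in one memory, and the processor fetches both from the same place.

```python
# One memory holds both the program (addresses 0-3) and its data (addresses 6-8).
memory = [
    ("LOAD",  6),    # 0: copy the value at address 6 into the accumulator
    ("ADD",   7),    # 1: add the value at address 7
    ("STORE", 8),    # 2: store the result at address 8
    ("HALT",  0),    # 3: stop
    None, None,      # 4-5: unused cells
    40, 2, 0,        # 6-8: data lives in the same memory as the instructions
]

def run(memory):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, addr = memory[pc]         # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

print(run(memory)[8])   # prints 42, the sum computed by the stored program
```

Because the program is just more data in memory, it can be loaded, replaced or even modified like any other data, which is what makes the machine general purpose.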
UNIVAC Makes Payroll
“The Utopia of automatic production is inherently plausible,” wrote Theodore Caplow in “The Sociology of Work.” Mauchly and J. Presper Eckert offered supporting evidence for this conclusion when, on Friday, October 15, 1954, history's first automated payroll checks were printed. The tasks for General Electric's UNIVAC were mundane: inventory, order management, accounting and payroll. This Friday payroll run was a clear demonstration of digital computing's potential for commercial applications.
Mauchly and Eckert had proven themselves as innovators. The ENIAC and EDVAC are legendary examples of pioneering achievement in the field. But those early efforts were focused on government, military and academic projects. Here was a major milestone in the computer's growing contribution to commercial enterprise and to society in general.
IBM's “Professor RAMAC”
As computing progressed, engineers recognized the need for better ways to manage and access data. IBM's answer was the Model 305 RAMAC (Random Access Method of Accounting and Control). Its disk storage unit held a stack of fifty aluminum disks, each 24 inches in diameter and spinning at 1,200 rpm, and stored five million characters. “Random access” meant that any piece of data was accessible upon command. (To get a sense of what technology was like at that time, check out This Is What a 5MB Hard Drive Looked Like In 1956.)
IBM president Thomas J. Watson Jr. was thrilled to introduce the machine to the world at the 1958 World's Fair in Brussels. Visitors could query “Professor RAMAC” through a keyboard and, as if by magic, receive answers in any of ten languages. Watson hailed the RAMAC's debut as “the greatest product day in the history of IBM.”
The Inventors of the Integrated Circuit
It is not unheard of for a great innovation to be made by two separate inventors at roughly the same time. That's what happened with Jack Kilby and Robert Noyce.
Four components were required to make computer circuits functional: transistors, resistors, diodes, and capacitors. Working independently, these technology pioneers discovered that it was possible to unify these functions in a single component: the integrated circuit. To make it work, they found that they could print electrical pathways onto a silicon oxide coating.
Despite a long court battle, the two innovators eventually decided to share the patent. Noyce went on to form Intel. Both men would receive the National Medal of Science — Kilby in 1969 and Noyce in 1979. Kilby won the Nobel Prize for the invention in 2000, and gave proper credit to Noyce in his acceptance speech.
Steve Wozniak's Video Screen
Calling himself “The Woz,” Steve Wozniak in the 1970s was also known as a serial prankster and a college dropout. Now we know him as a genius. (Or was it his partner Steve Jobs who was the genius? Wozniak's father cursed Jobs and said that his son had done all the work, bringing Jobs to tears, according to some accounts.) But “The Woz” didn't come to innovation all on his own. He attended the first meeting of the Homebrew Computer Club, a gathering of the hippie-hacker culture that had developed in the San Francisco Bay Area.
A designer of video terminals, Wozniak realized after the meeting that he could put the power of the microprocessor to work in ways that others had overlooked. Capitalizing on his insight, he quickly developed a standalone computer that responded to keyboard input. At 10:00 p.m. on Sunday, June 29, 1975, Wozniak typed on his keyboard and letters appeared on the screen. The Apple personal computer was born. The dreams of America's electronic hobbyists were becoming reality, and the computing industry would never be the same. (For more on Apple and its development through the years, see Creating The iWorld: A History of Apple.)
Key innovations like these have had great influence on subsequent development in computing. The digital environment that we use today is the result of a cumulative effort of large teams as well as individual genius. These milestones are noteworthy among the many contributions in the field.