The History of Unix: From Bell Labs to the iPhone

KEY TAKEAWAYS

The fact that Unix is still in use after more than 40 years is a sign of its versatility.

You might think your smartphone or tablet is brand new, but the technology underlying it has a long history dating back to the 1960s. If you have an iOS or an Android device, it runs an operating system that traces back to Unix, developed at Bell Labs. Even if you have a PC running Windows, it talks to many servers throughout the day, and a great many of those servers also run Unix. Given such a long history, it’s a little surprising that Unix is still so common. Here we’ll take a look at how it came this far.

Early History

What eventually became Unix began in the mid-1960s with a project called MULTICS. A consortium of organizations, including MIT, GE and Bell Labs, came together to build a system to support a “computing utility,” something we might call cloud computing today. Unfortunately, MULTICS may have been too far ahead of its time, and Bell Labs pulled out of the project in 1969, leaving a couple of its programmers, Dennis Ritchie and Ken Thompson, stuck on older equipment.

Once Thompson and Ritchie had had a taste of interactive computing when the world still mostly depended on batch processing, they couldn’t go back. So they decided to start their own project, which attempted to save some of MULTICS’ best features.

“What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form,” Ritchie wrote in 1979. “We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.”

In addition to those lofty goals, Thompson also had a more personal motive: He wanted to play a game he’d invented called “Space Travel.”

Thompson and Ritchie decided to implement their system on a Digital Equipment Corporation PDP-7. They sketched out a basic system and wrote it in assembly language, naming it “UNICS” as a pun on MULTICS. The name soon became “Unix.”


Wanting a more powerful computer, they talked management into buying a PDP-11 by promising to develop a text processing application for Bell Labs’ patent department. As a result, the first end-user application for Unix was essentially word processing.

That success fueled Unix’s growth within Bell Labs. One distinctive feature was the ability to send the output of one program directly into the input of another, allowing for a “building-block” approach to software development.
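
To make that building-block idea concrete, here is a minimal C sketch (my illustration, not code from the article or the original Unix sources) of what the shell does when you type ls | wc -l: it creates a pipe, starts ls with its output connected to the pipe’s write end, and starts wc -l reading from the other end.

/* A small illustration of the Unix building-block idea: run "ls",
 * send its output through a pipe, and have "wc -l" count the lines,
 * just like typing "ls | wc -l" at the shell. */
#include <stdio.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

int main(void) {
    int fd[2];                      /* fd[0]: read end, fd[1]: write end */
    if (pipe(fd) == -1) {
        perror("pipe");
        return 1;
    }

    if (fork() == 0) {              /* first child: the producer */
        dup2(fd[1], STDOUT_FILENO); /* standard output goes into the pipe */
        close(fd[0]);
        close(fd[1]);
        execlp("ls", "ls", (char *)NULL);
        perror("execlp ls");
        _exit(1);
    }

    if (fork() == 0) {              /* second child: the consumer */
        dup2(fd[0], STDIN_FILENO);  /* standard input comes from the pipe */
        close(fd[0]);
        close(fd[1]);
        execlp("wc", "wc", "-l", (char *)NULL);
        perror("execlp wc");
        _exit(1);
    }

    close(fd[0]);                   /* parent: close both ends and wait */
    close(fd[1]);
    wait(NULL);
    wait(NULL);
    return 0;
}

Neither program knows about the other; the operating system simply connects them, which is exactly the kind of composition that made the building-block approach routine.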

The turning point for Unix came when it was re-implemented in C, a high-level language Ritchie designed as a successor to Thompson’s B. Writing an operating system in a high-level language had a profound effect on Unix’s evolution: it made the system portable, meaning it could be moved to different computers with relatively little effort. (Learn about the history behind programming languages in Computer Programming: From Machine Language to Artificial Intelligence.)

Unix generated a lot of attention when Thompson and Ritchie published a paper on the system in the prestigious computer science journal Communications of the ACM in 1974.

The Berkeley Software Distribution

As popular as Unix was becoming inside and outside of Bell Labs, AT&T, of which Bell Labs was the research arm, couldn’t capitalize on it because of a consent decree. In exchange for keeping its monopoly on U.S. phone service, AT&T was barred from entering non-telephone businesses such as computer software, and it was required to license its technology to anyone who asked.

Bell Labs did practically give away copies of Unix, complete with source code, to universities. One of them was UC Berkeley. The inclusion of the source code allowed students, notably Bill Joy, to make changes and improvements. These improvements became known as the Berkeley Software Distribution (BSD).

A number of innovations came out of the BSD project, including the vi text editor and the first version of Unix to take advantage of the virtual memory hardware of DEC’s VAX minicomputer line.

The most important addition was the implementation of TCP/IP, which made Unix, and BSD Unix in particular, the operating system of choice on the nascent Internet. (Learn more about the development of TCP/IP in The History of the Internet.)

Versions based on BSD also became popular in the emerging workstation market, especially on machines from Sun Microsystems, the company Bill Joy left Berkeley to co-found.

GNU and Linux

Sun wasn’t the only company commercializing Unix. After the break-up of AT&T in the early ’80s, the company was finally able to get into the computer business too, and it introduced System V, a version of Unix geared toward larger multi-user installations.

But at least one person wasn’t pleased with the way the industry moved from an academic environment where everybody shared source code to a commercial world where people “hoarded” code.

Richard Stallman, a programmer for MIT’s Artificial Intelligence Laboratory, announced the GNU (GNU’s Not Unix) Project in 1983.

“I consider that the Golden Rule requires that if I like a program, I must share it with other people who like it,” Stallman wrote in his GNU Manifesto. “Software sellers want to divide the users and conquer them, making each user agree not to share with others. I refuse to break solidarity with other users in this way. I cannot in good conscience sign a nondisclosure agreement or a software license agreement.”

The GNU Project aimed to replace proprietary Unix software with free software, “free as in free speech, not as in free beer,” as Stallman put it. In other words, software with source code and licensing that actually encouraged people to give it away.

As crazy as this scheme must have sounded, Stallman managed to attract a group of programmers to the project, and they developed high-quality editors, compilers and other tools, all released under licenses, most notably the GNU General Public License (GPL), that guaranteed access to the source code. GNU’s influence even persuaded the BSD programmers to scrub the remaining AT&T code from their system, making it fully redistributable as well.

The final missing piece was the kernel, the core of the system. The GNU kernel, Hurd, turned out to be more difficult to implement than anticipated. Fortunately, a Finnish graduate student’s hobby project turned out to be GNU’s saving grace. Linus Torvalds released his Linux kernel in 1991 and, though he never intended it, started a revolution in operating systems. Soon, “distributions” combining the Linux kernel with the GNU tools began popping up, giving anyone with the requisite skill a Unix-like operating system comparable to the ones used in universities and research labs that cost thousands of dollars. Best of all, they could run it on an ordinary PC, free of charge. (Read more about today’s popular distributions in Linux Distros: Which One’s Best?)

This was irresistible to the growing number of Web startups and ISPs in the ’90s. They could obtain server software for free, and the bright young computer science graduates who knew how to run it didn’t cost much to hire either. The Linux/Apache/MySQL/PHP server stack remains one of the platforms of choice for Web service providers today.

Going Mobile

Even though Unix is more than 40 years old, its versatility has carried it well beyond the minicomputers it first ran on. One of the most visible examples is Apple’s iOS, which is partly based on FreeBSD, itself a descendant of the original BSD code. The other major mobile OS, Android, runs on a modified Linux kernel. Although neither contains original Unix code, both preserve many of the underlying ideas, even beneath slick visual interfaces that are a far cry from the command line most people associate with Unix.

That the current major mobile platforms are based on Unix shows its versatility. It’s old, but there seems to be no sign it’s slowing down, even though one of its original creators, Dennis Ritchie, passed away in 2011. So next time you want to think of your smartphone or tablet as brand new, think again – the technology that backs it has come a very long way.


David Delony
Contributor

David Delony is a Bay Area expatriate living in Ashland, Oregon, where he combines his love of words and technology in his career as a freelance writer. He's covered everything from TV commercials to video games. David holds a B.A. in communication from California State University, East Bay.