While technology companies may hype the latest thing, if you pay careful attention to the history of tech, you’ll often find that there’s really nothing new under the sun.
While cloud computing might be all the rage these days, the idea of remotely accessed computing resources that are as reliable as the electric utility goes back to the '60s.
MULTICS was an attempt to build a "computing utility" that anyone could use at any time. It was a joint project of General Electric, MIT and Bell Labs. The MULTICS project had some innovative ideas, such as a lack of distinction between files and process memory, dynamic linking, user-replaceable shells and implementation in a high-level language.
The project might have been too ambitious for the late '60s, and Bell Labs pulled out, but not before it influenced two of its programmers, Ken Thompson and Dennis Ritchie. You might recognize them as the creators of Unix, which implemented many of MULTICS' ideas. Honeywell eventually did commercialize MULTICS. The last known installation, at the Canadian Department of National Defence in Halifax, was shut down in 2000.
Alan Kay is quoted as saying that "the best way to predict the future is to invent it." He seems to have followed his own advice. Working at the Xerox Palo Alto Research Center (PARC), he helped develop many of the modern computing elements we use today: Ethernet, the graphical user interface and object-oriented programming.
His most lasting contribution might be the Dynabook, a small, portable computer concept designed for education. Devised in the early '70s, it looks a lot like the modern tablet computers that have finally gone mainstream almost 40 years later.
While Dynabook-like devices are on the market, the key feature for Kay was an educational curriculum that would let children explore the world for themselves rather than rely on teachers to spoon-feed them. By that measure, the Dynabook isn't quite here yet.
We’re not all business here at Techopedia. We like to have fun occasionally. And one of our favorite ways to kick back involves video games, of course. You might think that the earliest video game was "Pong," back in 1972.
Video games actually go back to the 1950s, with "Tennis for Two," constructed with an analog computer using an oscilloscope for a display. Ralph Baer, whose ideas were used for the Magnavox Odyssey, the first home video game console, is widely considered to be the father of video games. The Odyssey was purely analog. It didn’t even keep score.
The first modern digital video game that we would recognize as such was "Spacewar," developed in 1962 by Steve Russell at MIT. The game involved two ships flying around a star, shooting at each other. The physics were accurate: the ships were pulled toward the star by its gravity while otherwise behaving according to Newtonian laws. It even had an accurate star field.
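The gravity mechanic at the heart of the game can be sketched in a few lines of modern Python. This is only an illustration of the idea, not the original PDP-1 code, and the constants are made up: each frame, the star pulls the ship toward it with an inverse-square acceleration, and the ship otherwise keeps its momentum.

```python
import math

G_M = 5000.0   # gravitational constant * star mass (arbitrary value)
DT = 0.1       # length of one simulation step

def step(x, y, vx, vy):
    """Advance a ship one frame under the star's gravity (star at origin)."""
    r = math.hypot(x, y)
    a = G_M / (r * r)                  # inverse-square acceleration
    ax, ay = -a * x / r, -a * y / r    # direction: toward the star
    vx, vy = vx + ax * DT, vy + ay * DT
    # Position is updated with the new velocity (semi-implicit Euler),
    # which keeps the orbit stable over many frames.
    return x + vx * DT, y + vy * DT, vx, vy

# A ship with enough sideways velocity falls into orbit around the star
# instead of crashing into it.
x, y, vx, vy = 100.0, 0.0, 0.0, 7.0
for _ in range(1000):
    x, y, vx, vy = step(x, y, vx, vy)
```

The same inverse-square rule is what lets skilled players slingshot around the star, a tactic "Spacewar" players discovered almost immediately.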
The game showed off what interactive computing could do and became a hit, being ported to nearly every minicomputer. The game got a boost from a 1972 Rolling Stone article by Stewart Brand (yes, the same guy who created the Whole Earth Catalog) about some players at the Stanford Artificial Intelligence Lab.
One of the people inspired by "Spacewar" was Nolan Bushnell. He created an arcade game called "Computer Space" in the early '70s, but most people found it too complicated, so he ended up creating a little game called "Pong," which also launched Atari. A few years later, in 1979, the company released "Asteroids," which played much like "Spacewar" and became one of the industry's most beloved games.
While you might post your important (and not-so-important) status updates to Facebook, or even play games there, you might be surprised that the idea goes back to the 1970s. Even the terminals of that era, with their orange plasma screens, were ahead of their time.
PLATO (Programmed Logic for Automatic Teaching Operations) was developed at the University of Illinois as the first computer-aided instructional system for college campuses. It prefigured all kinds of ideas: email, instant messaging, message boards and even multiplayer games. Mainframe maker Control Data Corporation commercialized it, but it never became a successful product, probably because it was way ahead of its time.
You might have an Android or iPhone in your pocket. Perhaps you were an early adopter of the BlackBerry phones. You might be surprised that smartphones have a much earlier history.
The IBM Simon is credited as the first smartphone, according to an article in Time magazine. Introduced in 1994 as a joint project between IBM and BellSouth, it seems laughably primitive today, but it introduced a lot of smartphone features we take for granted.
It had a black-and-white touch screen and access to email, notes and calendar apps, just like modern smartphones. You could even send faxes. (Hey, this was the '90s after all!) It cost $1,100, or the bargain price of $900 (in 1994 money) if you signed a BellSouth contract. Even worse, the battery lasted less than an hour.
The price eventually dropped to $600. Needless to say, it wasn’t a big seller. It was only on sale for six months, but it influenced the design of the smartphones that came after it.
Even though the tech we use today might seem brand new, it’s good to remember that it had to start somewhere. Just taking a look back can help inspire the tech of tomorrow.