There are some technologies that are introduced, adopted and immediately become game changers – like the iPad. Many, many others fail spectacularly, never to be heard from again. But there’s another category of failed technologies: those that are gone but not forgotten. Despite their inability to survive, these technologies end up having a major impact on future technology. Here are a few technologies that failed to thrive, but succeeded in inspiring future innovation.
If you thought the wait for "Duke Nukem Forever" was bad, let’s just hope you haven’t been waiting for Project Xanadu. The brainchild of Ted Nelson, it was one of the first hypertext systems, and it’s an idea that’s been kicking around since at least 1960 – although it still hasn’t been officially released. Even so, early versions managed to attract some interest. Project Xanadu aimed to keep every revision of a document, so that any version could be viewed indefinitely.
Nelson later gained fame with his classic book "Computer Lib/Dream Machines," published in 1974, a year before the first personal computers appeared. It was written in a nonlinear style, a kind of hypertext in a book.
Project Xanadu, on the other hand, never fully launched. However, one person influenced by the idea was Tim Berners-Lee, who, in the early 90s, implemented a much simpler system called the World Wide Web. He was far less ambitious; he just wanted to create an online directory for CERN. And so he did. (Learn more about it in The Pioneers of the World Wide Web.)
Nelson, however, doesn’t much care for the Web.
"Today’s popular software simulates paper," he wrote. "The World Wide Web (another imitation of paper) trivializes our original hypertext model with one-way ever-breaking links and no management of version or contents."
Wikis, on the other hand, bear some resemblance to the original Xanadu ideas. Most popular wikis, such as Wikipedia, even use version control to keep track of changes.
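As a toy illustration of that idea – keeping every revision of a document rather than overwriting it – here’s a minimal sketch in Python. The class and method names are invented for this example; this is not how MediaWiki or Xanadu actually store revisions.

```python
class VersionedDocument:
    """Toy model of wiki-style version tracking: every saved
    revision is kept, as Project Xanadu proposed."""

    def __init__(self, title):
        self.title = title
        self.revisions = []  # full text of every saved version

    def save(self, text):
        """Append a new revision and return its revision id."""
        self.revisions.append(text)
        return len(self.revisions) - 1

    def latest(self):
        """Return the current (most recent) text."""
        return self.revisions[-1]

    def revision(self, rev_id):
        """Old revisions stay readable instead of being overwritten."""
        return self.revisions[rev_id]


page = VersionedDocument("Project Xanadu")
page.save("Xanadu is a hypertext system.")
page.save("Xanadu is a hypertext system begun by Ted Nelson in 1960.")
print(page.revision(0))  # the original wording is still retrievable
print(page.latest())
```

Real wiki engines store revision history far more efficiently (and add diffs, authorship and timestamps), but the core promise is the same: nothing you save is ever lost.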
The Atari Mindlink was another piece of hardware that might have been way ahead of its time. Envisioned in 1983, it was essentially a headband (hey, this was the 80s!) that had embedded sensors that registered the movement of your forehead muscles. Wearing this contraption made it possible to control Atari’s computers and video game consoles – with your mind. It sounded futuristic at the time but ended up mostly giving headaches to the people who tested the prototype. Atari also had the bad timing of developing it in the midst of the Great Video Game Crash. The company was hemorrhaging money and ended up being sold to former Commodore head Jack Tramiel, who canceled many of Atari’s projects.
The idea of playing video games without your hands eventually resurfaced with the Xbox Kinect, a camera that attaches to the Xbox 360 console to track players’ movements. It’s also going to be built into the Xbox One console being released in November 2013.
Jack Tramiel’s old company, Commodore, had a cool piece of technology up its sleeve. It bought a small company called Amiga, headed by some former Atari engineers (a deal that also sparked a lawsuit between Commodore and Atari), and released the computer of the same name in 1985. The Amiga had video and sound capabilities far beyond what most other companies offered, and it was quickly adopted by both gamers and creative types. But as cool as the Amiga was, it was still no match for the IBM/Microsoft juggernaut, and Commodore’s poor marketing certainly didn’t help. Still, the platform was popular in Europe, where it was cheaper than PCs and Macs, and NewTek’s Video Toaster made it ubiquitous in TV production.
The Amiga featured custom video and sound chips, a very new idea at the time – although commonplace in today’s computers. The Amiga platform held on until 1994, when Commodore went bankrupt. Even today, it still has a strong cult following.
Developed jointly by IBM and Microsoft, OS/2 was intended as a successor to MS-DOS, with features such as multitasking. It was released in 1987, but initially lacked its promised GUI, Presentation Manager. Differences between the companies’ cultures (particularly IBM’s practice of measuring programmer productivity by lines of code) created friction between OS/2’s developers. So, following the success of Windows, Microsoft decided to break away from Big Blue. Windows NT became the company’s high-end offering for workstations and servers, and the NT line was merged into the consumer version of Windows in 2001 with XP; it continues to serve as the basis for modern Windows versions today. Although most PC users opted for Windows, OS/2 did find a niche in embedded systems, especially ATMs.
This might sound hypocritical coming from an enthusiastic Linux user, but more than 20 years after the first Linux distributions appeared, it’s safe to say that Linux on the desktop will never overtake Windows in popularity. Sure, it’s popular on servers, but most people won’t mess with the perfectly good OS that came with their computers.
On the other hand, Linux, in the form of Android, is very popular on mobile devices. So, while Linux may have failed to dominate the desktop market, it’s taking mobile by storm. Considering Android is the most popular mobile OS worldwide, it’s not a bad consolation prize.
Introduced by Iomega in the 1980s, Bernoulli drives combined the large storage space of hard drives with the portability of floppy disks, exploiting a principle of physics known as Bernoulli’s principle: the flexible disk spun at high speed and floated on a cushion of air near the read/write head, so if the airflow was disrupted, the disk simply fell away from the head. Unlike a hard drive, a Bernoulli disk could not suffer a head crash, making the format very reliable. The drives and disks had a loyal following that appreciated their portability and reliability. Unfortunately, they were too expensive to be a viable option for most consumers.
Nowadays, cheap flash drives allow people to take their files anywhere, and since they’re solid state, they’re pretty tough as well. Plus, even newer services like Google Drive and Dropbox allow people to dispense with the drives entirely and keep their files online.
Another piece of mind-blowing technology introduced in the 70s, the LaserDisc offered video that was much better than the VHS tapes in use at the time – and with no tape to wear out. LaserDiscs also allowed for features, such as alternate audio and commentary tracks, pioneered by The Criterion Collection’s release of "King Kong" in 1984. Criterion introduced a number of other components that would later become standard on DVDs: the letterbox format, extra features – such as "making of" documentaries – and the concept of the "special edition." Other studios quickly copied these innovations, but LaserDiscs were expensive, and the burgeoning movie rental stores opted for VHS instead. As a result, the LaserDisc was relegated to deep-pocketed film buffs.
The late 90s brought the DVD revolution, offering many of the LaserDisc’s innovations in a smaller and much cheaper package. The success of DVD led to the introduction of Blu-ray. And Criterion? They’re still around, re-releasing critically acclaimed movies like "Seven Samurai" and "The Seventh Seal" in modern formats. They’ve even branched out into streaming media, offering titles on Hulu Plus.
From Innovation to Inspiration
Some technologies fail even when their creators do everything right. Sometimes, designs are just ahead of their time. Other times, they’re marketed poorly. And a few are simply too ambitious to be fully realized. Even so, the best ones serve to inspire future technologies – the kind that change the game. And those don’t come around very often.