Fry: “Bender, what is it?”

Bender: “Ahhh, what an awful dream. Ones and zeros everywhere... [shudder] and I thought I saw a two!”

Fry: “It was just a dream, Bender. There's no such thing as two.”

Anyone familiar with digital computing knows about zeros and ones – including characters in the “Futurama” cartoon. Zeros and ones are the building blocks of binary language. But not all computers are digital, and nothing says that digital computers have to be binary. What if we used a base-3 system instead of base-2? Could a computer conceive of a third digit?

As computer science essayist Brian Hayes noted, “People count by tens and machines count by twos.” A few brave souls have dared to consider a ternary alternative. Louis Howell proposed the programming language TriINTERCAL using the base-3 numbering system in 1991. And Russian innovators built a few dozen base-3 machines over 50 years ago. But for some reason, the numbering system didn’t catch on in the broader computer world.

A Look at the Math

Given the limited space here, we’ll just touch on a few mathematical ideas to give us some background. For a more in-depth understanding of the subject, have a look at Hayes’ excellent article “Third Base” in the Nov/Dec 2001 issue of American Scientist.

Now let’s look at the terms. You’ve probably picked up by now (if you didn’t already know) that the word “ternary” has to do with the number three. Generally, something that is ternary is composed of three parts or divisions. Ternary form in music is a song structure made up of three sections. In mathematics, ternary means using three as a base. Some people prefer the word trinary, perhaps because it rhymes with binary.

Jeff Connelly covers a few more terms in his 2008 paper “Ternary Computing Testbed 3-Trit Computer Architecture.” A “trit” is the ternary equivalent of a bit: if a bit is a binary digit that can take one of two values, then a trit is a ternary digit that can take any of three. A trit is one base-3 digit, and a “tryte” would be 6 trits. Connelly (and perhaps no one else) defines a “tribble” as half a tryte (one base-27 digit, or 3 trits), and he calls one base-9 digit (2 trits) a “nit.” (For more on data measurement, see Understanding Bits, Bytes and Their Multiples.)
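To make those units concrete, here is a minimal Python sketch (my own illustration, not something from Connelly’s paper) that counts how many distinct values each unit can represent:

```python
# Each base-3 digit (trit) triples the number of representable values: 3 ** n.
def ternary_values(n_trits: int) -> int:
    return 3 ** n_trits

print(ternary_values(1))  # trit (1 trit):                  3 values
print(ternary_values(2))  # "nit" (one base-9 digit):       9 values
print(ternary_values(3))  # "tribble" (one base-27 digit): 27 values
print(ternary_values(6))  # tryte (6 trits):              729 values
```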

It can all become a bit overwhelming for mathematical laymen (like myself), so we’ll just look at one more concept to help us get a grasp of the numbers. Ternary computing deals with three discrete states, but the ternary digits themselves can be defined in different ways, according to Connelly (a short code sketch after the list shows two of these encodings in practice):

  • Unbalanced Trinary – {0,1,2}
  • Fractional Unbalanced Trinary – {0,1/2,1}
  • Balanced Trinary – {-1,0,1}
  • Unknown-State Logic – {F,?,T}
  • Trinary Coded Binary – {T,F}
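
As a concrete illustration, here is a small Python sketch (my own, not code from Connelly’s paper) that converts an ordinary integer into the first and third encodings above – unbalanced {0,1,2} and balanced {-1,0,1} ternary digits:

```python
def to_unbalanced_ternary(n: int) -> list[int]:
    """Digits from {0, 1, 2}, most significant first; assumes n >= 0."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        n, r = divmod(n, 3)
        digits.append(r)
    return digits[::-1]

def to_balanced_ternary(n: int) -> list[int]:
    """Digits from {-1, 0, 1}, most significant first; handles negative n too."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        n, r = divmod(n, 3)
        if r == 2:          # rewrite digit 2 as -1 and carry 1 into the next trit
            r, n = -1, n + 1
        digits.append(r)
    return digits[::-1]

print(to_unbalanced_ternary(8))  # [2, 2]      because 2*3 + 2 = 8
print(to_balanced_ternary(8))    # [1, 0, -1]  because 9 + 0 - 1 = 8
print(to_balanced_ternary(-8))   # [-1, 0, 1]  negation just flips every trit
```

The last line hints at why balanced ternary is often praised: negating a number costs nothing more than flipping each digit.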

Ternary Computers in History

There’s not much to cover here because, as Connelly put it, “Trinary technology is relatively unexplored territory in the computer architecture field.” While there may be a hidden treasure of university research on the subject, not many base-3 computers have made it into production. At the 2016 Hackaday Superconference, Jessica Tank gave a talk on the ternary computer that she’s been working on for the past few years. Whether her efforts will rise from obscurity remains to be seen.

But we’ll find a bit more if we look back to Russia in the mid-20th century. The computer was called SETUN, and the engineer was Nikolay Petrovich Brusentsov (1925–2014). Working with the notable Soviet mathematician Sergei Lvovich Sobolev, Brusentsov created a research team at Moscow State University and designed a ternary computer architecture that would result in the construction of 50 machines. As researcher Earl T. Campbell states on his website, SETUN “was always a university project, not fully endorsed by the Soviet government, and viewed suspiciously by factory management.”

The Case for Ternary

SETUN used balanced ternary logic, {-1,0,1} as noted above. That’s the common approach to ternary, and it’s also found in the work of Jeff Connelly and Jessica Tank. “Perhaps the prettiest number system of all is the balanced ternary notation,” writes Donald Knuth in an excerpt from his book “The Art of Computer Programming.”

Brian Hayes is also a big fan of ternary. “Here I want to offer three cheers for base 3, the ternary system. … They are the Goldilocks choice among numbering systems: When base 2 is too small and base 10 is too big, base 3 is just right.”

One of Hayes’ arguments for the virtues of base-3 is that it is the closest integer base to base e, “the base of the natural logarithms, with a numerical value of about 2.718.” With mathematical prowess, the essayist explains how base e (if it were practical) would be the most economical numbering system of all. The number e is ubiquitous in nature, and I clearly remember these words from Mr. Robertson, my high school chemistry teacher: “God counts by e.”
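That economy argument can be sketched with a quick back-of-the-envelope calculation. The snippet below (my own illustration of the standard “radix economy” measure, not code from Hayes’ article) scores each base by the number of digit values per position times the number of positions needed to cover a given range; the score bottoms out near base e, and base 3 comes closer to that optimum than base 2 or base 10:

```python
import math

def radix_cost(base: float, n: int = 1_000_000) -> float:
    # (possible values per digit) * (digits needed to represent numbers up to n)
    return base * math.log(n) / math.log(base)

for base in (2, math.e, 3, 10):
    print(f"base {base:>6.3f}: cost ~ {radix_cost(base):.1f}")

# base  2.000: cost ~ 39.9
# base  2.718: cost ~ 37.6   <- the theoretical optimum
# base  3.000: cost ~ 37.7   <- the best integer base by this measure
# base 10.000: cost ~ 60.0
```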

The greater efficiency of ternary compared to binary can be illustrated with the SETUN computer itself. Hayes writes: “Setun operated on numbers composed of 18 ternary digits, or trits, giving the machine a numerical range of 387,420,489. A binary computer would need 29 bits to reach this capacity ….”
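The arithmetic behind those figures is easy to verify (a quick check of the quoted numbers, not code from Hayes’ article):

```python
import math

values = 3 ** 18                     # distinct values of an 18-trit word
print(values)                        # 387420489, matching Hayes' figure
print(math.ceil(math.log2(values)))  # 29 -- the bits a binary machine needs for the same range
```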

So Why Not Ternary?

Now we return to the original question of the article. If ternary computing is so much more efficient, why aren’t we all using ternary computers? One answer is that things simply didn’t happen that way. We have come so far with binary digital computing that it would be very hard to turn back. Just as the robot Bender has no idea how to count beyond zero and one, today’s computers operate on a logic system that’s different from what any potential ternary computer would use. Of course, Bender could somehow be made to understand ternary – but it would probably amount to a simulation rather than a redesign.

And SETUN itself did not realize the greater efficiency of ternary, according to Hayes. He says that because each trit was stored in a pair of magnetic cores “the ternary advantage was squandered.” It seems that the implementation is just as important as the theory.

An extended quote from Hayes seems appropriate here:

“Why did base 3 fail to catch on? One easy guess is that reliable three-state devices just didn't exist or were too hard to develop. And once binary technology became established, the tremendous investment in methods for fabricating binary chips would have overwhelmed any small theoretical advantage of other bases.”

The Numbering System of the Future

We’ve talked about bits and trits, but have you heard of qubits? The qubit, or quantum bit, is the basic unit of quantum information, and it is where the math gets a little fuzzy. Unlike a bit or a trit, a qubit can exist in a superposition of states – effectively in multiple states at once. So while it can represent more than the two states of binary, it isn’t the same thing as a ternary digit. (To learn more about quantum computing, see Why Quantum Computing May Be the Next Turn on the Big Data Highway.)

And you thought binary and ternary were hard. Quantum physics is not intuitively obvious. The Austrian physicist Erwin Schrödinger offered a famous thought experiment, known as Schrödinger’s cat, in which you are asked to imagine a cat that is both alive and dead at the same time.

This is where some people get off the bus. It’s ridiculous to propose that a cat could be both alive and dead, but that’s the essence of quantum superposition. The crux of quantum mechanics is that objects have characteristics of both waves and particles. Computer scientists are working to take advantage of these properties.

The superposition of qubits opens a new world of possibilities. For certain classes of problems, quantum computers are expected to be exponentially faster than binary or ternary machines: the parallelism of multiple qubit states could leave today’s PCs far behind on those tasks.

Conclusion

Until the day the quantum computing revolution changes everything, the status quo of binary computing will remain. When Jessica Tank was asked what use cases might arise for ternary computing, the audience groaned at her reference to “the internet of things.” And that may be the crux of the matter. Unless the computing community finds a very good reason to upset the apple cart and ask its computers to count in threes instead of twos, robots like Bender will continue to think and dream in binary. Meanwhile, the age of quantum computing lies just over the horizon.