Ternary Computing 375
eviltwinimposter writes: "This month's American Scientist has an article about base-3 or ternary number systems, and their possible advantages for computing and other applications. Base-3 hardware could be smaller because of decreased number of components and use ternary logic to return less than, greater than, or equal, rather than just the binary true or false, although as the article says, '...you're not going to find a ternary minitower in stock at CompUSA.' Ternary also comes the closest of any integer base to e, the ideal base in terms of efficiency, and has some interesting properties such as unbounded square-free sequences. Also in other formats."
Ternary has been known to be efficient... (Score:5, Informative)
Number Systems. There is an extended discussion of the balanced ternary system and some other exotic number systems (base 2i, etc.). There are some merits to the ternary system, but it would be harder to implement with transistors.
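Balanced ternary uses the digits {-1, 0, +1} instead of {0, 1, 2}, which is what makes it attractive (negation is trivial, no separate sign bit). A rough sketch of the conversion, as a toy illustration (the helper names are mine, not from the book):

```python
def to_balanced_ternary(n: int) -> list[int]:
    """Return balanced-ternary digits of n, least significant first,
    using the digit set {-1, 0, +1}."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3          # remainder in {0, 1, 2}
        if r == 2:         # rewrite digit 2 as -1 plus a carry
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits

def from_balanced_ternary(digits: list[int]) -> int:
    """Inverse: evaluate digits (least significant first) in base 3."""
    return sum(d * 3**i for i, d in enumerate(digits))
```

Note that negating a number just flips every digit's sign, which is where the "merit" comes from: no two's-complement machinery needed.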
SETUN - Russian ternary computer (Score:5, Informative)
Re:Nondigital computing: Root Not (Score:4, Informative)
The problem is understanding the new metaphors required to implement new modes of math. Simply adding a third state doesn't get you a revolutionary new mode of computation, it just gets you more bits per wire. For example, look at flash technology: they now store multiple bits per cell by designing sense amps to convert the analog level to a binary pattern.
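Conceptually, a multi-level-cell sense amp is just a bank of comparators; here's a toy sketch (the threshold voltages are invented for illustration, not from any real part):

```python
def sense_mlc(voltage: float) -> int:
    """Map an analog cell voltage to a 2-bit value by counting
    how many comparator thresholds it exceeds (0..3)."""
    thresholds = (0.8, 1.6, 2.4)   # volts; hypothetical values
    level = sum(voltage >= t for t in thresholds)
    return level  # two bits per cell: 0b00 through 0b11
```

The point stands: the cell is analog inside, but the result handed to the rest of the chip is still plain binary.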
Read the book "An Introduction to Quantum Computing". I forget the author, but it's the one that comes with the CD of Mathematica examples.
In this book they discuss a simple adder that Feynman derived. The realization of the Hamiltonian operator (similar to the transfer function H(s)) requires a gate called:
Square root of NOT!
It's pretty crazy, but when you walk through the example step-by-step, it becomes more clear why it is needed to build the simple adder.
Now how you actually build a root-not gate is another problem, but I'm just making this point to illustrate how "meta" the new concepts have to be to truly revolutionize computation.
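For the curious: "root NOT" really is a 2x2 unitary whose square is the classical NOT (Pauli-X) matrix. A quick check with NumPy (this is the standard sqrt-NOT matrix, not something specific to the book's adder):

```python
import numpy as np

# sqrt(NOT): applying it twice yields the classical NOT (Pauli-X)
SQRT_NOT = 0.5 * np.array([[1 + 1j, 1 - 1j],
                           [1 - 1j, 1 + 1j]])

NOT = np.array([[0, 1],
                [1, 0]], dtype=complex)

# one application puts a qubit into a superposition of 0 and 1;
# two applications complete the flip
assert np.allclose(SQRT_NOT @ SQRT_NOT, NOT)
```

No classical gate can do this: half of a bit flip only makes sense once states can be complex superpositions, which is exactly the "meta" jump the parent is talking about.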
There's simply nothing better than binary right now.
Re:Trits? (Score:2, Informative)
Awww...they shied away from the obvious choice, tits.
Just to be more serious and perfectionistic about it: shouldn't the word digit in this case be a trigit, since the very word digit is prefaced with di, which means two? I guess I could be wrong about that, but it seems to make sense.
Fascinating, but not practical, here's why: (Score:5, Informative)
First of all, hardware is getting smaller and smaller all the time, so the whole premise behind ternary computing (that base 3 would use less hardware) doesn't apply, especially since brand new gates would have to be designed to distinguish between 3 signal levels rather than 2, and that would be a HUGE step backwards.
Secondly, doing things on a chip or two is great, but the main problem in computing is communications. A major part of creating efficient communications protocols is determining the probability of a bit error. Probability is a very complicated science, even using the binomial distribution, which is a very simple function (that just happens to escape me at the moment). Now add another level, and you have to use a trinomial distribution, which I'm sure exists but isn't very common (and, not surprisingly, I can't recall that one either). Long story short, this theoretical math has been made practical in computer communications over a period dating back 50 years; starting all over with 3 levels rather than 2 would be extremely complicated and VERY, VERY expensive.
Finally, figuring out logical schemes for advanced, specialized chips is a daunting task. Engineers have come up with shortcuts over the years (K-maps, state diagrams, special algorithms, etc) but adding in a 3rd state to each input would make things almost impossibly complicated. All computer engineers working at the hardware level would have to be re-educated, starting with the simplest of logical gates.
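One way to see the blow-up the parent describes: the number of distinct logic functions of n inputs is b^(b^n) for base b, so every table-driven design technique (K-maps included) has vastly more cases to cover. A quick count, purely illustrative:

```python
def num_logic_functions(base: int, inputs: int) -> int:
    """Count distinct truth tables over `inputs` variables in `base`:
    each of base**inputs input combinations maps to one of `base` outputs."""
    return base ** (base ** inputs)

binary_2in = num_logic_functions(2, 2)    # 2**4 = 16 two-input gates
ternary_2in = num_logic_functions(3, 2)   # 3**9 = 19683 two-input gates
```

Going from 16 possible two-input gates to 19,683 gives a sense of why the design tools and the engineers' intuition would both need rebuilding.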
Overall, in my humble opinion, we'll never see large scale use of ternary computing. There's just too much overhead involved in switching over the way of doing things at such a fundamental level. The way hardware advances each year, things are getting smaller and smaller without switching the number base, so until we reach the limit using binary, we'll probably stick with it.
Re:Trits? (Score:2, Informative)
IIRC, the origin of digit is not from di- meaning two, but from digit meaning finger or toe. This makes some sense if you think about where numbering systems came from. FWIW, one advantage of binary is that it's very easy to count in binary on your fingers; your thumb is the ones bit, index finger twos bit, middle finger fours bit, etc. Not quite as easy to do in ternary.
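The finger trick, as a sketch (finger ordering is just the convention the parent describes):

```python
# Each finger carries a power-of-two weight: thumb=1, index=2, ...
FINGER_WEIGHTS = [1, 2, 4, 8, 16]  # one hand, five fingers

def hand_value(raised: list[bool]) -> int:
    """Value shown by one hand, given which fingers are raised."""
    return sum(w for w, up in zip(FINGER_WEIGHTS, raised) if up)
```

All five fingers raised gives 31, so one hand counts 0 through 31 and two hands reach 1023. In ternary you'd need each finger to hold one of three positions, which is the "not quite as easy" part.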
not to be an engineer... (Score:4, Informative)
The reason that you can't get, and won't for a long time, anything greater than base 2 is that setting and sensing more than two logical levels in a given voltage range is very hard. Those ones and zeros you like to look at and think about discretely are not really ones and zeros, but voltages close to those that represent one and zero, close enough to not confuse the physics of the device in question.
For example, if you arbitrarily define 0 volts to be a 0 and 1 volt to be 1 in an equally useless and arbitrary circuit, and you monitor the voltage, what do you assume is happening if one of your discrete samples is right in the middle, at 0.5 volts?
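A toy model of the sensing problem (the levels and noise margins here are invented for illustration): with two levels, an ambiguous mid-voltage can at least be rejected by a wide guard band; adding a third level forces every band to shrink.

```python
def sense(voltage: float, levels: list[float], margin: float):
    """Return the index of the logic level within +/- margin of the
    measured voltage, or None if the reading is ambiguous."""
    for i, v in enumerate(levels):
        if abs(voltage - v) <= margin:
            return i
    return None

binary = sense(0.5, [0.0, 1.0], 0.3)          # None: too far from both rails
ternary = sense(0.5, [0.0, 0.5, 1.0], 0.15)   # 1: but the margin had to halve
```

Halving the noise margin in the same voltage swing is exactly the kind of thing that kills real circuits, where supply noise and process variation eat most of that budget.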
Now, I remember something about doubling flash memory densities by sensing 4 voltage ranges in each cell, but I imagine the timing precision required to do that correctly is orders of magnitude easier (and still a royal pain) than putting ternary logic into a modern microprocessor (with tens of millions of transistors, implementing everything created in the entire history of computing that might be even marginally useful so that you can get 3 more frames per second in quake3).
Re:Ternary has been known to be efficient... (Score:5, Informative)
Very apt. A binary transistor has two states, idealized "on" and "off". From a more analog view that's low current and high current - appropriately connected with a resistor that results in low and high voltages.
The nice feature is that a high voltage at the input opens the transistor and a low voltage closes it. So we get a relatively complete system: I can get from hi to lo, and from lo to hi.
Ternary would add a "middle" voltage. But middle on the input creates middle on the output, with no direct way to get either high or low, making basic circuits more complex.
But the real killer with "middle" is manufacturing. Let's say we use 2.8 volts for the high level and 0.2 volts for the low level. Due to manufacturing tolerances, some chips' transistors would be "fully" open at 2.3 volts, others at 2.7 volts. Easy to compensate for in binary designs, you just use the 2.8 to switch the transistor, but for the middle level? What's required to switch a transistor to middle on one chip is sufficient to open the transistor completely on another chip...
So your manufacturing tolerances become way smaller, and that of course reduces yield which increases cost.
Add to that that chips today work with a variety of "hi" voltages like 5, 3.3, 2.8... Most lower-voltage chips are compatible with higher-voltage ones: they produce voltages which are still over the switching point, and accept higher voltages than they operate on. With ternary that compatibility becomes impossible, and it only gets worse as chip manufacturers progressively lower voltages for higher speed.
Plus disadvantages in power consumption and and and...
Admittedly the article doesn't seem to suggest that ternary is viable, just that it's pretty. Which may be true for a mathematician. :)
Been done (Score:2, Informative)
The Russian translation of Knuth's Volume 2 was quite funny. Knuth writes that somewhere, some strange person might actually build a ternary computer. The Russian translation had a translator's footnote: "Actually, it has been built in Russia in the 1960s."
See this page for more information about setun:
http://www.computer-museum.ru/english/setun.htm
Also, transistors cannot be made for base 3 (Score:1, Informative)
When switching, however, the transistors both go into the linear conduction region, which is why they consume power - there is a resistive path between the supply and ground. This is the region used for amplification of sound and for other electronic circuits. But it consumes a lot of power.
How you could even construct a trinary circuit with the same power characteristics as complementary MOS is highly problematic at best. The number one and two problems on chips are timing closure and power, respectively. IC packages can't even handle the power. The more time the circuit spends in the linear region, the worse the power consumption. To me, any potential savings in size for such a circuit are virtually impossible to fathom. Not to mention some insurmountable difficulties, as the author of the parent article mentioned, in design methodology and tools to support a trinary system.
HOWEVER, it should be noted that signaling via pseudo-trinary methods is possible to alleviate timing problems at the expense of absolute performance. Motorola has an embeddable core which uses pairs of wires that have four states - 1, 0, ack, and a null state which is not valid anyway. But even this is not power efficient or fast - it's just easier to implement in some ways.
Re:Fascinating, but not practical, here's why: (Score:3, Informative)
Well, I don't think that the probability is really much worse. Instead of binomial, we have in general multinomial, and here trinomial: pdf = (n! / (x_i! * x_j! * x_k!)) * p_i^{x_i} * p_j^{x_j} * p_k^{x_k}
See Berger's Statistical Decision Theory and Bayesian Analysis. Or here [home.cern.ch] or here [uah.edu].
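A quick sanity check of that trinomial pmf, using only the standard library (variable names mirror the formula's symbols):

```python
from math import factorial

def trinomial_pmf(n: int, x: tuple, p: tuple) -> float:
    """P(counts = (x_i, x_j, x_k)) after n trials with category
    probabilities (p_i, p_j, p_k)."""
    assert sum(x) == n and abs(sum(p) - 1.0) < 1e-12
    coef = factorial(n) // (factorial(x[0]) * factorial(x[1]) * factorial(x[2]))
    return coef * p[0]**x[0] * p[1]**x[1] * p[2]**x[2]

# the pmf sums to 1 over all outcomes of n trials, as it should
n, p = 4, (0.2, 0.3, 0.5)
total = sum(trinomial_pmf(n, (i, j, n - i - j), p)
            for i in range(n + 1) for j in range(n + 1 - i))
```

So the error-probability math for three levels isn't exotic, just one more category in the multinomial; the parent's objection is more about the engineering legacy than the statistics.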
There are some hardware problems; I posted a possible solution
A more serious problem is mentioned by another poster: floating point [sun.com] is where we really, really care about speed and efficiency, and it seems that binary has that sewn up.
Quite right. This is the only argument against it which doesn't have an answer, I suspect.
Re:Nondigital computing (Score:3, Informative)
Particles come in (isospin) pairs. Under SU(2) (weak force pairs):
electron / neutrino
up / down
strange / charmed
bottom / top
proton / neutron (which is up/down again)
And blue / red / green, because color has SU(3) symmetry.