Science

Ternary Computing 375

eviltwinimposter writes: "This month's American Scientist has an article about base-3, or ternary, number systems and their possible advantages for computing and other applications. Base-3 hardware could be smaller because of a decreased number of components, and could use ternary logic to return less than, greater than, or equal, rather than just the binary true or false, although as the article says, '...you're not going to find a ternary minitower in stock at CompUSA.' Ternary also comes the closest of any integer base to e, the ideal base in terms of efficiency, and has some interesting properties such as unbounded square-free sequences. Also in other formats."
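The "closest to e" claim is about radix economy: representing numbers up to N in base b takes about log_b(N) digits, each with b possible states, so the cost scales roughly as b * log_b(N), which is proportional to b / ln(b). A quick check (a sketch of the arithmetic, not from the article):

```python
import math

# Radix economy: cost of base b is proportional to b / ln(b).
# The continuous minimum is at b = e; among integer bases, 3 is
# closest.  Note base 4 exactly ties base 2, since 4/ln 4 = 2/ln 2.
for b in (2, 3, 4, 10):
    print(b, b / math.log(b))
```

Base 3 comes out a few percent cheaper than base 2 by this measure, which is the whole mathematical case for ternary in a nutshell.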
This discussion has been archived. No new comments can be posted.

Ternary Computing

Comments Filter:
  • by ponos ( 122721 ) on Tuesday October 30, 2001 @02:45PM (#2498465)
    Try reading Knuth's The Art of Computer Programming, Vol. 2, Section 4.1, Positional
    Number Systems.

    There is an extended discussion on the balanced
    ternary system and some other exotic number
    systems (base 2i etc). There are some merits
    to the ternary system but it would be
    harder to implement with transistors.
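For the curious, balanced ternary (the system Knuth discusses) uses the digits -1, 0, and +1 rather than 0, 1, 2. A minimal conversion sketch (my own, not Knuth's code):

```python
def to_balanced_ternary(n):
    """Digits of n in balanced ternary, most significant first,
    using -1, 0, +1.  E.g. 5 -> [1, -1, -1], since 9 - 3 - 1 = 5."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:       # a digit of 2 becomes -1 with a carry
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits[::-1]
```

One of the merits discussed there: flipping the sign of every digit negates the number, so no separate sign bit is needed.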
  • by m_ilya ( 311437 ) <ilya@martynov.org> on Tuesday October 30, 2001 @02:56PM (#2498547) Homepage
    I read in one book that a ternary computer was built a long time ago. I tried to find out more with Google and found this page [computer-museum.ru].
  • by Sebastopol ( 189276 ) on Tuesday October 30, 2001 @02:58PM (#2498567) Homepage
    Close, but you are still doing digital computing! Just because it's not binary doesn't mean it isn't digital.

    The problem is understanding the new metaphors required to implement new modes of math. Simply adding a third state doesn't get you a revolutionary new mode of computation; it just gets you more bits per wire. For example, look at flash technology: they now store multiple bits per cell by designing sense amps to convert the analog level to a binary pattern.

    Read the book "An Introduction to Quantum Computing". I forget the author, but it's the one that comes with the CD of Mathematica examples.

    In this book they discuss a simple adder that Feynman derived. The realization of the Hamiltonian operator (similar to the transfer function H(s)) requires a gate called:

    Square root of NOT!

    It's pretty crazy, but when you walk through the example step-by-step, it becomes more clear why it is needed to build the simple adder.

    Now how you actually build a root-not gate is another problem, but I'm just making this point to illustrate how "meta" the new concepts have to be to truly revolutionize computation.
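    You can check the root-NOT algebra numerically, though. Here's a quick sketch (mine, not from that book; this is the standard square-root-of-NOT matrix): applied once it's a "half NOT", applied twice it's an ordinary NOT.

```python
# The square root of NOT as a 2x2 complex matrix S: S applied twice
# gives the classical bit-flip matrix [[0, 1], [1, 0]].
S = [[(1 + 1j) / 2, (1 - 1j) / 2],
     [(1 - 1j) / 2, (1 + 1j) / 2]]

def matmul(a, b):
    """Multiply two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

print(matmul(S, S))   # -> [[0j, (1+0j)], [(1+0j), 0j]], i.e. NOT
```

    Note that no real-valued matrix has this property; the complex entries are exactly the "meta" part.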

    There's simply nothing better than binary right now.
  • Re:Trits? (Score:2, Informative)

    by kerrbear ( 163235 ) on Tuesday October 30, 2001 @03:13PM (#2498642)
    Setun operated on numbers composed of 18 ternary digits, or trits

    Awww...they shied away from the obvious choice, tits.

    Just to be more serious and perfectionistic about it: shouldn't the word digit in this case be a trigit, since the very word digit is prefaced with di-, which means two? I guess I could be wrong about that, but it seems to make sense.

  • by Uttles ( 324447 ) <[moc.liamg] [ta] [selttu]> on Tuesday October 30, 2001 @03:16PM (#2498659) Homepage Journal
    I looked over the article and it made a good argument for a ternary computing architecture; however, there are some big problems with this that were not addressed in the article. Although I'm not a math expert, I did earn a math minor in college during my computer engineering curriculum, and I have to say ternary computing seems to have too many complex problems that need solving to be worth it.

    First of all, hardware is getting smaller and smaller all the time, so the whole premise behind ternary computing (base 3 would use less hardware) doesn't apply, especially since brand new gates would have to be made in order to distinguish between 3 signal levels rather than 2, and that would be taking a HUGE step backwards.

    Secondly, doing things on a chip or two is great, but the main problem in computing is communications. A major part of creating efficient communications protocols is determining the probability of a bit error. Probability is a very complicated science, even using the binomial distribution, which is a very simple function (that just happens to escape me at the moment). Now, add another level, and you have to use a trinomial distribution, which I'm sure exists but isn't very common (and, not surprisingly, I can't recall that one either). Long story short: this theoretical math has been made practical in computer communications over a period dating back 50 years; starting all over with 3 levels rather than 2 would be extremely complicated and VERY, VERY expensive.
    Finally, figuring out logical schemes for advanced, specialized chips is a daunting task. Engineers have come up with shortcuts over the years (K-maps, state diagrams, special algorithms, etc) but adding in a 3rd state to each input would make things almost impossibly complicated. All computer engineers working at the hardware level would have to be re-educated, starting with the simplest of logical gates.

    Overall, in my humble opinion, we'll never see large scale use of ternary computing. There's just too much overhead involved in switching over the way of doing things at such a fundamental level. The way hardware advances each year, things are getting smaller and smaller without switching the number base, so until we reach the limit using binary, we'll probably stick with it.
  • Re:Trits? (Score:2, Informative)

    by rgmoore ( 133276 ) <glandauer@charter.net> on Tuesday October 30, 2001 @03:23PM (#2498709) Homepage

    IIRC, the origin of digit is not from di- meaning two, but from digit meaning finger or toe. This makes some sense if you think about where numbering systems came from. FWIW, one advantage of binary is that it's very easy to count in binary on your fingers; your thumb is the ones bit, index finger twos bit, middle finger fours bit, etc. Not quite as easy to do in ternary.

  • by nadador ( 3747 ) on Tuesday October 30, 2001 @03:33PM (#2498766)
    and rain on the computer scientist's parade, but...

    The reason that you can't get, and won't for a long time, anything greater than base 2 is that setting and sensing more than two logical levels in a given voltage range is very hard. Those ones and zeros you like to look at and think about discretely are not really ones and zeros, but voltages close to those that represent one and zero, close enough to not confuse the physics of the device in question.

    For example, if you arbitrarily define 0 volts to be a 0 and 1 volt to be 1 in an equally useless and arbitrary circuit, and you monitor the voltage, what do you assume is happening if one of your discrete samples is .5? Is it your ternary maybe, or is it the circuit switching from 0 to 1? And what about the case when your manufacturing process introduces errors greater than you expected? What if 1 comes out .75? Is that in the maybe range or the 1 range?
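    To make that concrete, here's a toy decoder sketch (the voltage levels and thresholds are arbitrary choices of mine, like the circuit above): with the same 0-to-1 V swing, binary gets one threshold while ternary needs two, so the noise margin around each nominal level is roughly halved.

```python
# Toy level decoders for a 0..1 V signal swing.
def decode_binary(v):
    # one threshold at 0.5 V: ~0.5 V of margin on each side
    return 1 if v > 0.5 else 0

def decode_ternary(v):
    # nominal levels 0.0, 0.5, 1.0 V; two thresholds, ~0.25 V margins
    if v < 0.25:
        return 0
    if v < 0.75:
        return 1      # the "middle" level
    return 2

# A 0.3 V error on a nominal 0 still decodes correctly in binary,
# but pushes a ternary 0 into the middle band:
print(decode_binary(0.3), decode_ternary(0.3))   # -> 0 1
```

    The same manufacturing slop that binary shrugs off turns into a decoded error here, which is the yield problem other posters describe.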

    Now, I remember something about doubling flash memory densities by sensing 4 voltage ranges in each cell, but I imagine the timing precision required to do that correctly is orders of magnitude easier to achieve (and still a royal pain) than putting ternary logic into a modern microprocessor (with tens of millions of transistors, implementing everything created in the entire history of computing that might be even marginally useful, so that you can get 3 more frames per second in Quake 3).

  • by Asic Eng ( 193332 ) on Tuesday October 30, 2001 @03:42PM (#2498839)
    but it would be harder to implement with transistors.

    Very apt. A binary transistor has two states, idealized "on" and "off". From a more analog view that's low current and high current - appropriately connected with a resistor that results in low and high voltages.

    The nice feature is that a high voltage at the input opens the transistor, and a low voltage closes it. So we get a relatively complete system: I can get from hi to lo, and from lo to hi.

    Ternary would add a "middle" voltage. But middle on the input creates middle on the output, with no direct way to get either high or low - making basic circuits more complex.

    But the real killer with "middle" is manufacturing. Let's say we use 2.8 Volts for the high level and 0.2 Volts for the low level. Due to manufacturing tolerances, some chips' transistors would be "fully" open at 2.3 Volts, others at 2.7 Volts. That's easy to compensate for in binary designs - you just use the full 2.8 to switch the transistor - but for the middle level? What's required to switch a transistor to middle on one chip is sufficient to open the transistor completely on another chip...

    So your manufacturing tolerances become way smaller, and that of course reduces yield which increases cost.

    Add to that that chips today work with a variety of "hi" voltages like 5, 3.3, 2.8... Most lower-voltage chips are compatible with higher-voltage ones: they produce voltages which are still over the switching point, and accept higher voltages than they operate on.

    With ternary that compatibility becomes impossible, even as chip manufacturers progressively lower voltages for higher speed.

    Plus disadvantages in power consumption and and and...

    Admittedly the article doesn't seem to suggest that ternary is viable, just that it's pretty. Which may be true for a mathematician. :)

  • Been done (Score:2, Informative)

    by apilosov ( 1810 ) on Tuesday October 30, 2001 @03:54PM (#2498916) Homepage
    As a matter of fact, there was a ternary computer built in Russia, called Setun' (apostrophe signifies a soft n).

    The Russian translation of Knuth's Volume 2 was quite funny. Knuth writes that somewhere, some strange person might actually build a ternary computer. The Russian translation had a translator's footnote: "Actually, it has been built in Russia in the 1960s."

    See this page for more information about setun:
    http://www.computer-museum.ru/english/setun.htm
  • by Anonymous Coward on Tuesday October 30, 2001 @03:58PM (#2498939)
    Base 2 is very convenient for digital circuits because it relies on the inherent non-linear regions of the transistors for representing values while saving power. In a typical CMOS inverter, you have two transistors - a PMOS over an NMOS - with a tied gate. One transistor is always in the on state (saturation), the other is in the off state (cutoff). Therefore, whether the gate presents a logic 1 or logic 0, you have no power consumption because it's either a gate looking at a ground (no voltage), or a voltage rail looking at an insulating gate (voltage but very very high resistance). In this way, almost no power is consumed in the static state with the exception of some leakage currents which are for now manageable (but getting much worse in smaller geometry technologies).

    When switching, however, the transistors both go into the linear conduction region, which is why they consume power - there is a resistive path between the supply and ground. This is the region used for amplification of sound and for other electronic circuits. But it consumes a lot of power.

    How you could even construct a trinary circuit which has the same power characteristics as complementary MOS is highly problematic at best. The number one and two problems on chips are timing closure and power, respectively. IC packages can't even handle the power. The more time the circuit spends in the linear region, the worse the power consumption. To me, any potential savings in size for such a circuit are virtually impossible to fathom. Not to mention some insurmountable difficulties, as the author of the parent article mentioned, on design methodology and tools to support a trinary system.

    HOWEVER, it should be noted that signaling via pseudo-trinary methods is possible to alleviate timing problems at the expense of absolute performance. Motorola has an embeddable core which uses pairs of wires that have four states - 1, 0, ack, and a null state which is not valid anyway. But even this is not power efficient or fast - it's just easier to implement in some ways.
  • by nels_tomlinson ( 106413 ) on Tuesday October 30, 2001 @04:54PM (#2499235) Homepage
    Now, add another level, and you have to use a trinomial distribution, which I'm sure exists but isn't very common (and not surprisingly, I can't recall that one either).

    Well, I don't think that the probability is really much worse. Instead of binomial, we have in general multinomial, and here trinomial: pdf = (n! / (x_i! * x_j! * x_k!)) * p_i^{x_i} * p_j^{x_j} * p_k^{x_k}.
    See Berger's Statistical Decision Theory and Bayesian Analysis. Or here [home.cern.ch] or here [uah.edu].
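    In code, that pmf is straightforward (a sketch using the formula above, with example numbers of my own):

```python
from math import factorial

def trinomial_pmf(x, p):
    """P(X1=x1, X2=x2, X3=x3) for n = x1+x2+x3 trials with cell
    probabilities p = (p1, p2, p3): the multinomial with k = 3."""
    n = sum(x)
    coef = factorial(n)
    for xi in x:
        coef //= factorial(xi)      # exact integer multinomial coefficient
    prob = 1.0
    for xi, pi in zip(x, p):
        prob *= pi ** xi
    return coef * prob

# e.g. 4 trits, each of three symbols equally likely:
print(trinomial_pmf((2, 1, 1), (1/3, 1/3, 1/3)))   # about 0.148 (= 12/81)
```

    So the math is no harder in principle; the 50 years of accumulated binary channel engineering is the real cost.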

    There are some hardware problems; I posted a possible solution [slashdot.org]. (It's a joke, mostly!)

    A more serious problem is mentioned by another poster: floating point [sun.com] is where we really, really care about speed and efficiency, and it seems that binary has that sewn up.

    ... we'll never see large scale use of ternary computing. There's just too much overhead involved in switching over the way of doing things at such a fundamental level.

    Quite right. This is the only argument against it which doesn't have an answer, I suspect.
  • by spiro_killglance ( 121572 ) on Tuesday October 30, 2001 @06:53PM (#2500004) Homepage
    Actually, apart from the colors, all the other particles come in (isospin) pairs.

    Under SU(2) (weak force pairs):

    electron / neutrino
    up / down
    strange / charmed
    bottom / top
    proton / neutron (which is up/down again)

    Blue, red, and green come as a triplet because color has SU(3) symmetry.
