Science

Ternary Computing 375

eviltwinimposter writes: "This month's American Scientist has an article about base-3, or ternary, number systems and their possible advantages for computing and other applications. Base-3 hardware could be smaller because it would need fewer components, and ternary logic can return less-than, greater-than, or equal rather than just the binary true or false, although as the article says, '...you're not going to find a ternary minitower in stock at CompUSA.' Ternary also comes closer to e, the ideal base in terms of efficiency, than any other integer base, and has some interesting properties, such as unbounded square-free sequences. Also in other formats."
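A concrete taste of the square-free property: over two symbols, any string of length four or more must repeat some block twice in a row (a "square"), but over three symbols you can avoid that forever. A minimal Python sketch, using the classical construction from the first differences of the Thue-Morse sequence (the helper names are my own):

```python
# Square-free ternary sequences: over 3 symbols there are arbitrarily long
# strings containing no block repeated twice in a row ("square"), which is
# impossible over 2 symbols. Classical construction: the first differences
# of the Thue-Morse sequence take values in {-1, 0, 1} and are square-free.

def thue_morse(n):
    """n-th Thue-Morse bit: parity of 1-bits in the binary expansion of n."""
    return bin(n).count("1") % 2

def ternary_word(length):
    """First `length` symbols of the (square-free) difference sequence."""
    return [thue_morse(i + 1) - thue_morse(i) for i in range(length)]

def has_square(word):
    """Brute-force check: does `word` contain any adjacent repeat xx?"""
    n = len(word)
    for i in range(n):
        for k in range(1, (n - i) // 2 + 1):
            if word[i:i + k] == word[i + k:i + 2 * k]:
                return True
    return False

if __name__ == "__main__":
    w = ternary_word(200)
    print(w[:12])          # first few symbols of the ternary word
    print(has_square(w))   # no square anywhere in the first 200 symbols
```

Any binary word quickly hits a square (already by length 4), which is what makes the three-symbol case interesting.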
This discussion has been archived. No new comments can be posted.

  • by purduephotog ( 218304 ) <hirsch&inorbit,com> on Tuesday October 30, 2001 @02:41PM (#2498430) Homepage Journal
    ... the choices will be 0, 1, and Maybe :)

    Actually not a bad step. I wonder, when they look at quantum computers using light, whether this might be an easier step to integrate. There was a previous article here about light-based quantum computing; give it a few years :)
  • well known (Score:2, Insightful)

    by 4im ( 181450 ) on Tuesday October 30, 2001 @02:44PM (#2498457)
    In theoretical CS classes we learned all about this; it's not exactly news.
    The thing is, binary logic is simpler to manufacture than ternary.
    So, no big deal really... the choices were made some time ago.
    Next step: quantum computing.
  • Not base3 again (Score:4, Insightful)

    by GunFodder ( 208805 ) on Tuesday October 30, 2001 @02:45PM (#2498463)
    Seems like someone has to bring up base-3 computing every once in a while, just like asynchronous circuit design. I'm sure there are plenty of reasons why they are technically superior. But it has taken us 50+ years to get to this point with synchronous circuit design and binary logic. It would take many years to reach the same point with a totally new technology, and in the meantime the current computer industry would continue to grow exponentially. I'll believe in these technologies when I see a useful example of them.
  • by uslinux.net ( 152591 ) on Tuesday October 30, 2001 @03:34PM (#2498782) Homepage

    I vaguely remember discussing this in a Computer Science class on circuit design four or five years back. While this might be possible for some sort of non-copper processor, I imagine the difficulty would be in rapidly distinguishing the correct voltage for each digit with today's technology.

    In simplistic terms: presently, for each bit, at a clock cycle the voltage is either 0 (0 volts) or 1 (3.3 volts). Theoretically, you could have any number of values per signal, provided you had sufficient voltage precision. Thus 0=0v, 1=.1v, 2=.2v, ..., 33=3.3v, giving a 34-valued signal.

    However, your processor is probably designed with a tolerance in mind, so 3.1 volts is probably a 1, and .2 volts is probably a 0. I really don't know the specs, but you might even presume 0-1.65v=0 and 1.66-3.3v=1. The extra effort, and the slower clock needed to ensure that .5v is ACTUALLY .5v and not .6v, would probably impact performance too greatly.

    I'm sure there's a PhD EE somewhere in this crowd who can explain this even better, but my point is that I don't think anything but binary computers are practical with current electrical technology. Presently, there's a reason we use two states: it's easy and *fast* to check "on or off" without having to determine how "on" the "on" is. Now, if one were able to use fiber and send colors along with the pulses, then you might have something...
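    To make the noise-margin argument concrete, here's a toy simulation. Everything numeric in it, the rails, thresholds, and noise level, is invented for illustration, not taken from any real process:

```python
import random

V_RAIL = 3.3  # assumed supply rail, volts (illustrative)

def decode_binary(v):
    """One threshold at mid-rail: noise margin of about V_RAIL/2."""
    return 1 if v >= V_RAIL / 2 else 0

def decode_ternary(v):
    """Two thresholds splitting the rail in thirds: margin ~ V_RAIL/6."""
    if v < V_RAIL / 3:
        return 0
    if v < 2 * V_RAIL / 3:
        return 1
    return 2

def error_rate(decode, levels, noise_sd, trials=20_000):
    """Fraction of symbols misread when Gaussian noise rides on the line."""
    errors = 0
    for _ in range(trials):
        sym = random.randrange(len(levels))
        v = levels[sym] + random.gauss(0, noise_sd)
        if decode(v) != sym:
            errors += 1
    return errors / trials

if __name__ == "__main__":
    random.seed(0)
    bin_levels = [0.0, 3.3]        # ideal binary voltages
    ter_levels = [0.0, 1.65, 3.3]  # ideal ternary voltages
    noise = 0.4                    # volts of noise (made-up figure)
    print("binary error rate: ", error_rate(decode_binary, bin_levels, noise))
    print("ternary error rate:", error_rate(decode_ternary, ter_levels, noise))
```

    With the middle level squeezed between two thresholds, the ternary decoder's margin is roughly a third of the binary one at the same rail voltage, which is the objection above in miniature.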
  • by Samuel Nitzberg ( 317670 ) on Tuesday October 30, 2001 @03:42PM (#2498836)
    I like what was written, and it is interesting, but I don't think that this will change much in terms of how computation is performed or perceived.

    One of the earlier posters mentioned something about it all being two-dimensional... actually, a good way to look at computation is the one Turing devised: a one-dimensional model of computation based upon a single tape.

    In studying Turing Machines, the mathematical model based upon (potentially infinitely long) tapes is used extensively. The primitive operations are, for example, moving the tape right or left and modifying what is under the head. A set of functions defines how symbols are changed, when computation halts, and the resulting halt state.

    A basic examination of binary versus ternary systems, based on Turing Machines and some (basic) complexity theory...

    In binary systems, computation trees grow at the rate of 2^n, where n is the number of computational steps...

    In a ternary system, we are looking at 3^n.

    So the difference could be considered in terms of 3^n versus 2^n; but since either machine can simulate the other with only polynomial overhead, the difference in effective processing power is, I believe, polynomial, not exponential.

    After all, any binary system could be used to -simulate- a ternary system through the use of an (at worst polynomially larger) set of functions and/or chunkings of data (to represent the 3 states in binary, repeatedly). The necessary encodings could be performed by 'chunking' the ternary data into blocks.

    Polynomial gains are nice, but at best, we don't have an earth-shattering enhancement.
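    The chunking idea is easy to make concrete. A small sketch in Python (my own toy encoding: two bits per trit, a constant factor of 2, comfortably within polynomial overhead):

```python
# Simulating ternary data on a binary machine by "chunking": each ternary
# digit (trit) is stored in a fixed-width block of 2 bits, so the encoded
# string is exactly twice as long: a constant-factor (hence polynomial)
# overhead. The bit pattern "11" is simply never used.

ENCODE = {0: "00", 1: "01", 2: "10"}
DECODE = {v: k for k, v in ENCODE.items()}

def trits_to_bits(trits):
    """Encode a list of trits as a bit string, 2 bits per trit."""
    return "".join(ENCODE[t] for t in trits)

def bits_to_trits(bits):
    """Decode the fixed-width blocks back into trits."""
    return [DECODE[bits[i:i + 2]] for i in range(0, len(bits), 2)]

if __name__ == "__main__":
    data = [2, 0, 1, 1, 2]
    bits = trits_to_bits(data)
    print(bits)                         # "1000010110"
    print(bits_to_trits(bits) == data)  # True: lossless round trip
```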

    P.S. Some of this may be a bit rusty, so if anyone has a more concrete analysis or corrections, feel free...

    Sam Nitzberg
    sam@iamsam.com
    http://www.iamsam.com
  • by purduephotog ( 218304 ) <hirsch&inorbit,com> on Tuesday October 30, 2001 @03:53PM (#2498911) Homepage Journal
    Light has infinite wavelengths (not in reality, since only certain wavelengths are emitted, but you can combine those with different techniques to get arbitrarily many). I'm sorry you don't understand much about quantum computers constructed with light; I suggest reading up on it.

    Since you have an infinite number of selections to choose from, and since as was demonstrated base e is the most efficient base in which to represent numbers (i.e., representation in base e is better than in other bases), it stands to reason that quantum computers based on light should be designed to utilize base e; but since that isn't very practical, ternary might be the first logical step.

    And how come I got rated offtopic? Quantum computing is the logical next application of ternary computing, since binary is pretty much entrenched in everything from your local computer reseller to every dime tossed for 'heads or tails'.
  • by afs ( 107871 ) on Tuesday October 30, 2001 @04:01PM (#2498969)

    On the contrary: the "theoretical math" was never developed for a specific representation of information, much less binary. In fact, information theory accommodates any representation, all the way back to Shannon [bell-labs.com].

    The real difficulty is physical implementation. Coming up with coding schemes is trivial.

  • Re:Not base3 again (Score:3, Insightful)

    by Tom7 ( 102298 ) on Tuesday October 30, 2001 @04:02PM (#2498970) Homepage Journal

    The reason the computer industry grows exponentially is exactly these kinds of paradigm-changing technologies. Most of these have happened in manufacturing processes, but I think as we exhaust that field we will be pushing the changes higher up the architecture. (x86, your days are numbered!)

    That said, base 3 is probably pretty stupid. Asynchronous circuits, however, might really make a difference some day...
  • Why this is useful (Score:3, Insightful)

    by ChenLing ( 20932 ) <slashdot@@@ilovedancing...org> on Tuesday October 30, 2001 @04:43PM (#2499189) Homepage
    I've read a lot of posts on how this will be difficult to implement using voltages and circuits... and you know what? It *IS* difficult to sense 3 different voltages.
    The solution? Don't use electric circuits...don't use transistors.

    Electric circuits will only get us so far, and then we'll have to move on to more 'exotic' hardware -- optical computing, molecular computing, quantum computing.......

    Suppose a qubit's state is described by the spin polarization of an electron pair -- they can either be both up, both down, or one of each -- and you can't tell which one is which, so it's actually 3 states (balanced, at that)...

    In optical computing, suppose you can push the frequency of the lasers a little in either direction of 'neutral'...this is also base 3.

    So what I'm trying to say is, don't just say "base-3 computing is not practical with current technology" -- because it isn't, but it WILL be practical (perhaps even more so than binary computing) with future technology.
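    As an aside, the "balanced" three-state idea has a classic numeric counterpart: balanced ternary, with digits -1, 0, +1, where negation is free and no sign bit is needed. A small sketch (the helper names are my own):

```python
# Balanced ternary uses digits -1, 0, +1 instead of 0, 1, 2. Every integer,
# positive or negative, has a unique representation with no sign bit, and
# negating a number just negates each digit.

def to_balanced_ternary(n):
    """Integer -> list of digits in {-1, 0, 1}, least significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3            # Python's % always returns 0, 1, or 2
        if r == 2:
            digits.append(-1)       # 2 = 3 - 1: carry one into the next place
            n = (n + 1) // 3
        else:
            digits.append(r)
            n = n // 3
    return digits

def from_balanced_ternary(digits):
    """Inverse: evaluate sum of d * 3**i over the digit positions."""
    return sum(d * 3 ** i for i, d in enumerate(digits))

if __name__ == "__main__":
    print(to_balanced_ternary(5))      # [-1, -1, 1]: -1 - 3 + 9 = 5
    neg = [-d for d in to_balanced_ternary(5)]
    print(from_balanced_ternary(neg))  # -5: negation is just digit flips
```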

    And to finish with something lighter...
    troolean x, y, z;
    x = true;
    y = false;
    z = existentialism;

    :)
  • by nathanh ( 1214 ) on Tuesday October 30, 2001 @05:34PM (#2499488) Homepage
    > Ternary would put us into a "middle" voltage. But middle on the input creates middle on the output, with no direct way to get either high or low, making basic circuits more complex. But the real killer with "middle" is manufacturing. Let's say we use 2.8 volts for the high level, 0.2 volts for the low level. Due to manufacturing tolerances, some chips' transistors would be "fully" open at 2.3 volts, others at 2.7 volts. That's easy to compensate for in binary designs, you just use the 2.8 to switch the transistor, but for the middle level? What's required to switch a transistor to middle on one chip is sufficient to open the transistor completely on another chip...

    Why would you choose such a brain-dead scheme? 2.8V as your "middle" choice? A sensible scheme would have been +ve rail, -ve rail, and ground. This builds upon 100 years of analog electronics and op-amps. Locking a voltage to a rail is extremely easy AND fast.

    > Plus disadvantages in power consumption and and and...

    The benefit of a ternary scheme is that you have LESS power consumption to achieve the same work. Your individual flip-flap-flops are more complex than a binary flip-flop, but you need fewer flip-flap-flops. Overall you'll have fewer transistors and consequently less heat than the equivalent binary circuit.

    > So your manufacturing tolerances become way smaller, and that of course reduces yield which increases cost.

    The fact that fewer transistors are required to achieve the same work (despite there being more transistors per gate) will INCREASE the yields. This DECREASES costs.

    How in hell did your post get modded up?

  • by Anonymous Coward on Tuesday October 30, 2001 @06:30PM (#2499830)
    Perhaps it isn't best for electrical systems, but it should work great for optical. Think RGB -> Red, Green, Blue. It only makes sense for optical.
  • by Petrus ( 17053 ) on Tuesday October 30, 2001 @06:41PM (#2499920)
    The hypothetically "cheap" ternary system assumes that the need for hardware scales linearly with the base; that is, if a binary gate requires 2 transistors, a ternary gate needs 3. In such a case 2^3=8 is less than 3^2=9.
    So a system with 2*3=6 transistors can count to 9 in ternary, while in binary only to 8. When searching for the maximum of f(x) = x^(const/x), one ends up with e for all const>1. That's why the article mentions that 3 is the integer closest to e, the ideal number base. I remember that problem from a mathematics competition way back in 8th grade.
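    That cost model (hardware proportional to base times number of digits) is exactly the "radix economy" r/ln(r) behind the article's base-e claim. A quick check, under the same toy assumption of linear hardware cost:

```python
import math

# Radix economy: representing numbers up to N in base r takes about
# log_r(N) digits, and (in this toy model) hardware proportional to r per
# digit. Total cost ~ r * log_r(N) = (r / ln r) * ln N, so the best base
# minimizes r / ln r. The continuous optimum is r = e; among the integers,
# r = 3 wins, with r = 2 and r = 4 exactly tied just behind it.

def economy(r):
    """Relative hardware cost per unit of information, r / ln(r)."""
    return r / math.log(r)

if __name__ == "__main__":
    for r in range(2, 7):
        print(r, round(economy(r), 4))
    best = min(range(2, 11), key=economy)
    print("best integer base:", best)            # 3
    print("continuous optimum:", round(math.e, 4))
```

    Note that economy(2) and economy(4) coincide, since 4/ln(4) = 2/ln(2); base 3 beats them both.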

    In engineering practice, that is quite far from the truth. In ECL logic, a ternary half-adder requires the same number of transistors as in binary logic, and it takes the same number of wires to carry a ternary digit as a binary one. However, we all know why ECL is nearly extinct: its high power consumption prevents high integration.

    The benefits of binary logic can be seen in CMOS, where we have two states, each of which consumes almost no static power and still has low output impedance.

    Petrus
