Ternary Computing 375
eviltwinimposter writes: "This month's American Scientist has an article about base-3 or ternary number systems, and their possible advantages for computing and other applications. Base-3 hardware could be smaller because of decreased number of components and use ternary logic to return less than, greater than, or equal, rather than just the binary true or false, although as the article says, '...you're not going to find a ternary minitower in stock at CompUSA.' Ternary also comes the closest of any integer base to e, the ideal base in terms of efficiency, and has some interesting properties such as unbounded square-free sequences. Also in other formats."
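The "unbounded square-free sequences" the summary mentions can be made concrete. A sketch (my own illustration, not from the article): counting the 1s between consecutive 0s of the Thue-Morse sequence yields an infinite word over {0, 1, 2} containing no "square" (no block repeated back to back), a classical result of Thue -- and something no infinite binary sequence can achieve.

```python
def thue_morse(n):
    """First n terms of the Thue-Morse sequence (parity of the bit count)."""
    return [bin(i).count("1") % 2 for i in range(n)]

def ternary_squarefree(n):
    """First n terms of the square-free ternary word: the number of 1s
    between consecutive 0s in Thue-Morse."""
    tm = thue_morse(16 * n + 16)  # generous prefix of Thue-Morse
    zeros = [i for i, b in enumerate(tm) if b == 0]
    return [zeros[k + 1] - zeros[k] - 1 for k in range(n)]

def has_square(word):
    """Brute-force check for any repeated adjacent block XX."""
    m = len(word)
    for i in range(m):
        for length in range(1, (m - i) // 2 + 1):
            if word[i:i + length] == word[i + length:i + 2 * length]:
                return True
    return False

print(ternary_squarefree(12))
print(has_square(ternary_squarefree(200)))  # False: no square ever appears
```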
The future holds that... (Score:3, Insightful)
Actually not a bad step. I wonder when they'll look at quantum computers using light.
well known (Score:2, Insightful)
The thing is, it's simpler to manufacture binary logic than ternary.
So, no big deal really... the choices were made some time ago.
Next step: quantum computing.
Not base3 again (Score:4, Insightful)
Difficulties in Implementation (Score:2, Insightful)
I vaguely remember discussing this in a Computer Science class on circuit design four or five years back. While this might be possible for some sort of non-copper processor, I imagine the difficulty would be in rapidly distinguishing the correct voltage for each digit with today's technology.
In simplistic terms, presently, at a clock cycle the signal on a wire is either 0 (0 volts) or 1 (3.3 volts). Theoretically, you could have any number of states per wire, provided you had infinite voltage precision: 0 = 0 V, 1 = .1 V, 2 = .2 V, and so on.
However, your processor is designed with a tolerance in mind, so 3.1 volts is probably read as a 1, and 0.2 volts as a 0.
I'm sure there's a PhD EE somewhere in this crowd who can explain this even better, but my point is that I don't think anything but binary computers are useful with current electrical technology. Presently, there's a reason we use two states - because it's easy and *fast* to check "on or off" without having to determine how "on" is "on". Now, if one were able to use fiber and send colors along with the pulses, then you might have something...
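The tolerance-band point can be sketched in a few lines. This is purely illustrative (the voltage levels and tolerances are hypothetical, not any real process): with three levels sharing the same 0-3.3 V swing, the noise margin around each nominal level is roughly half what binary gets.

```python
def decode_binary(v, vdd=3.3, tol=0.8):
    """Two nominal levels (0 V and vdd); anything near a rail decodes cleanly."""
    if v <= tol:
        return 0
    if v >= vdd - tol:
        return 1
    raise ValueError(f"{v} V is in the forbidden zone")

def decode_ternary(v, vdd=3.3, tol=0.4):
    """Three nominal levels (0, vdd/2, vdd) must share the same swing,
    so the tolerance band per level is roughly halved versus binary."""
    levels = [0.0, vdd / 2, vdd]
    for trit, nominal in enumerate(levels):
        if abs(v - nominal) <= tol:
            return trit
    raise ValueError(f"{v} V is in a forbidden zone")

print(decode_binary(3.1))   # a slightly sagging 3.3 V still reads as 1
print(decode_ternary(1.7))  # near vdd/2, so the middle state
```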
Turing Theory... Complexity Analysis .. blah blah. (Score:2, Insightful)
One of the earlier posters mentioned something about it all being two-dimensional... actually, a good way to look at computation is using what Turing devised - a one-dimensional model of computation based upon a single tape.
In studying Turing Machines, the mathematical model based upon (potentially infinitely long) tapes is used extensively. Move the tape right, left, and modify what is under the head, for example, are the primitive operations. A set of functions defines how symbols are changed, and when computation halts, as well as the resulting halt state.
A basic examination of binary versus ternary systems, based on Turing Machines, and some (basic) complexity Theory...
In binary systems, computation trees build at the rate of 2^n, where n is the number of computational steps...
In a trinary system, we are looking at 3^n.
So the raw counts differ exponentially (3^n vs. 2^n), but that is not the right comparison: each ternary symbol carries only log2(3), roughly 1.58, bits of information - a constant factor.
Any binary system can -simulate- a ternary one through an (at worst polynomially larger) set of functions, i.e., by chunking the ternary data into fixed-size binary blocks that repeatedly represent the three states.
Polynomial (here really just constant-factor) gains are nice, but that's not an earth-shattering enhancement.
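The chunking argument above can be sketched directly (my own illustration): each ternary symbol fits in two binary symbols, so a binary machine can hold a ternary tape with only a constant-factor - certainly polynomial - blow-up in space.

```python
# Each trit maps to a fixed pair of bits; (1, 1) is simply unused.
TRIT_TO_BITS = {0: (0, 0), 1: (0, 1), 2: (1, 0)}
BITS_TO_TRIT = {v: k for k, v in TRIT_TO_BITS.items()}

def encode(trits):
    """Flatten a ternary tape into a binary tape, two cells per trit."""
    return [b for t in trits for b in TRIT_TO_BITS[t]]

def decode(bits):
    """Recover the ternary tape by reading the binary tape in pairs."""
    return [BITS_TO_TRIT[(bits[i], bits[i + 1])]
            for i in range(0, len(bits), 2)]

tape = [2, 0, 1, 1, 2]
assert decode(encode(tape)) == tape
print(encode(tape))  # exactly twice as long, never asymptotically worse
```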
P.S. Some of this may be a bit rusty, so if anyone has a more concrete analysis or corrections, feel free...
Sam Nitzberg
sam@iamsam.com
http://www.iamsam.com
Re:The future holds that... (Score:3, Insightful)
Since you have an infinite number of selections to choose from, and since, as was demonstrated, base e is the most efficient base in which to represent numbers, it stands to reason that quantum computers based on light should be designed to utilize base e; but since that isn't very practical, ternary might be the first logical step.
And how come I got rated offtopic? Quantum computing is the logical next application of ternary computing, since binary is pretty much entrenched in everything from your local computer reseller to every time you toss a dime for 'heads or tails'.
Re:Fascinating, but not practical, here's why: (Score:2, Insightful)
On the contrary--the "theoretical math" was never developed for a specific representation of information, much less binary. In fact, information theory accommodates any representation, all the way back to Shannon [bell-labs.com].
The real difficulty is physical implementation. Coming up with coding schemes is trivial.
Re:Not base3 again (Score:3, Insightful)
The reason that the computer industry grows exponentially is exactly these kinds of paradigm-changing technologies. Most of these have happened in manufacturing processes, but I think as we exhaust that field we will be pushing the changes higher up the architecture. (x86, your days are numbered!)
That said, base 3 is probably pretty stupid. Asynchronous circuits, however, might really make a difference some day...
Why this is useful (Score:3, Insightful)
The solution? Don't use electric circuits...don't use transistors.
Electric circuits will only get us so far, and then we'll have to move on to more 'exotic' hardware -- optical computing, molecular computing, quantum computing.......
Suppose a qubit's state is described by the spin polarization of an electron pair -- they can either be both up, both down, or one of each -- you can't tell which one, so it's actually 3 states (balanced at that)......
In optical computing, suppose you can push the frequency of the lasers a little in either direction of 'neutral'...this is also base 3.
So what I'm trying to say is, don't just say "base-3 computing is not practical with current technology" -- because it isn't, but it WILL be practical (perhaps even more so than binary computing) with future technology.
And to finish with something lighter...
troolean x, y, z;
x = true;
y = false;
z = existentialism;
:)
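The joke above isn't far from a real system, by the way. A sketch (my own, not from the article) of Kleene's strong three-valued logic, where the third value only propagates through AND/OR when it could actually change the answer -- "existentialism" is just the unknown state, renamed in the spirit of the joke:

```python
# Encoding choice is mine: ordering the values lets min/max do the work.
TRUE, UNKNOWN, FALSE = 1, 0, -1

def t_and(a, b):
    return min(a, b)   # FALSE dominates; UNKNOWN beats TRUE

def t_or(a, b):
    return max(a, b)   # TRUE dominates; UNKNOWN beats FALSE

def t_not(a):
    return -a          # negation flips TRUE/FALSE, fixes UNKNOWN

print(t_and(TRUE, UNKNOWN))   # UNKNOWN: the result hinges on the unknown
print(t_or(UNKNOWN, TRUE))    # TRUE: one true input settles an OR
```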
Re:Ternary has been known to be efficient... (Score:3, Insightful)
Why would you choose such a brain dead scheme? 2.8V as your "middle" choice? A sensible scheme would have been +ve rail, -ve rail, and ground. This builds upon 100 years of analog electronics and op-amps. Locking a voltage to a rail is extremely easy AND fast.
The benefit of a ternary scheme is that you have LESS power consumption to achieve the same work. Your individual flip-flap-flops are more complex than a binary flip-flop, but you need fewer flip-flap-flops. Overall you'll have fewer transistors and subsequently less heat than the equivalent binary circuit.
The fact that fewer transistors are required to achieve the same work (despite the fact that there are more transistors per gate) will INCREASE the yields. This DECREASES costs.
How in hell did your post get modded up?
Ternary is cheaper for mathematics, not engineers (Score:2, Insightful)
So a system with a budget of 2*3=6 components can count to 9 in ternary (two three-state digits: 3^2 = 9), while in binary it only counts to 8 (three two-state digits: 2^3 = 8). When searching for the maximum of f(x) = x^(const/x), one ends up with x = e for all const > 1. That's why the article mentions that 3 is the integer base closest to e, the ideal number base. I remember having that exact case in a mathematics competition way back in 8th grade.
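The radix-economy claim is easy to check numerically. Under the cost model in the post (a radix-r digit costs r components), a budget of c components buys c/r digits, hence r**(c/r) representable values; the real maximum sits at r = e, and among integers base 3 edges out base 2:

```python
import math

def representable(r, c=60):
    """Values representable by c/r digits of radix r, cost model r per digit."""
    return r ** (c / r)

for r in (2, 3, 4, math.e):
    print(f"radix {r}: {representable(r):.3e}")

# The c = 6 case from the post: 3**2 = 9 beats 2**3 = 8.
assert representable(3, 6) == 9
assert representable(2, 6) == 8
```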
In engineering practice, that is quite far from the truth. In ECL logic, a ternary half-adder requires the same number of transistors as a binary one, and it takes the same number of wires to carry a ternary digit as a binary one. However, we all know why ECL is nearly extinct - its high power consumption prevents high integration.
The benefits of binary logic can be seen in CMOS, where we have two states, each of which also consumes essentially no static power while still having a low output impedance.
Petrus