
Gears, Computers And Number Theory 69

UncleJosh writes: "The latest issue of American Scientist has an interesting article, 'On the Teeth of Wheels,' about the relationship between gears, computers and number theory." It's as much a well-meshed detective story (hunting for books and obscure references) as a historical and mathematical introduction to the science of gears. Prime numbers, watchmakers and the Fibonacci sequence all play a part.
  • by Anonymous Coward
    I thought that the punchline was "integer number of teeth"...
  • I've seen a gutted (but working) analog autopilot that was used in fighter planes some 50 years ago. It's all mechanical gyros and electrical circuits, but no electronics. Since there is no processing involved, the response of the autopilot is instantaneous, compared to a computer, which would need a few milliseconds to process the data. Nowadays, however, they just use three nanomachined accelerometers that give feedback to a CPU, and the whole setup is much lighter. I also heard from someone working at Sandia that they recently made an actual nanomachined gyro that works the same way as the big analog autopilot I saw. Still, I think it's pretty amazing; this analog autopilot looked much more complicated than a simple solid-state processor...
  • I consider myself an average /. geek and not a mathematician, and I thought it was pretty cool. Most interesting was seeing how recently a mundane trade was priming the pump for new kinds of abstract thinking.

    I was disappointed, though, that the author was unable to track down Brocot's more "theoretical work", since this gear-making example seems to have encompassed a full circle of problem solving -- trial and error, thinking by algorithm and thinking about algorithms. I would like the author's perspective on how the latter two relate in this case.

    I have not read the Cryptonomicon. Has anyone read The Advent of the Algorithm by David Berlinski? The prose wanders between "brilliant" and "snotty euro", but mostly the former.

    In the opening, he brashly claims (from memory) "The first great contribution of the West to science is the Calculus; the second is the Algorithm. There is no third."

    Is this refutable?

  • Alright, so sue me, I got my degree in Political Science and Music...Math wasn't my strong suit. Are you finished with my ego, or do you want to stomp some more?
  • by Smudgy ( 144144 ) on Saturday June 17, 2000 @06:25PM (#995460)
    There are certain problems which are considerably easier to solve with an analog (though not necessarily mechanical) computer than with a digital one. The one that leaps to mind is a particular minimization problem -- given three cities, let's say, at the vertices of an equilateral triangle, what is the shortest amount of road needed to connect them? The answer is this: make a new vertex at the centroid of the triangle and connect roads from each of the three cities to that new vertex.

    This uses considerably less road than, say, connecting city A to city B and then city B to city C. The problem gets considerably harder if you are not using a regular figure or if you are using more than three points. It's really a calculus of variations problem (I think) -- you're minimizing over an infinite number of paths so this makes it a pretty tough problem for a computer.
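
    A quick numeric sanity check of the equilateral case, in Python (unit side length; my own sketch, not from the article):

    import math

    # Three cities at the corners of an equilateral triangle with side 1.
    A = (0.0, 0.0)
    B = (1.0, 0.0)
    C = (0.5, math.sqrt(3) / 2)

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Naive network: a road from A to B, then from B to C.
    naive = dist(A, B) + dist(B, C)                      # 2.0

    # Centroid network: all three roads meet at the new vertex.
    centroid = ((A[0] + B[0] + C[0]) / 3, (A[1] + B[1] + C[1]) / 3)
    steiner = sum(dist(p, centroid) for p in (A, B, C))  # sqrt(3) ~ 1.73

    print(naive, steiner)   # the centroid network uses about 13% less road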

    This problem is a piece of cake, however, if you use a very special analog computer -- namely soap bubble solution. If I make two Plexiglas plates and put pins between them representing the cities I am trying to connect, then dip the construction into bubble solution, a soap film forms connecting the pins. As long as the soap film doesn't close on itself (that is, as long as it doesn't make a 'real' bubble) you will get a solution to the problem. The number of solutions to a given problem grows with the number of vertices, but suffice it to say it's a lot quicker to check all the solutions using soap bubbles than it is to use a computer. Also, each solution is quite close to the absolute minimal solution, so within certain parameters it may not be necessary to check every solution. My old differential geometry teacher tells me that they actually use the soap bubble method to do minimization problems for such things as highway planning.

    There's a lot more interesting stuff about this problem which I shan't go into, partly because I don't remember and partly because it's not very relevant to the discussion at hand. In any case, this is a good example of an analog 'computer' being demonstrably faster than a digital one.
  • There's been some work on asynchronous computers, and even more on asynchronous implementations of algorithms. If you do a Google search you'll find a bunch of hits.

    There is some overhead in asynchronous circuits. The circuit needs to be able to send a signal which can act as the clock upon completion. Depending on what the circuit is supposed to do the overhead can be considerable when compared to a synchronous circuit running at its maximum speed.

    In addition, gates often can't finish computing until all the inputs have arrived. Consider a simple AND gate with inputs A0, A1, A2 and A3 and an output Z.

    If A0 comes in first and comes in high, there still is not enough information to start computing Z. In fact, Z cannot be computed until either a zero arrives or the very last input bit arrives.
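
    A toy sketch of that early-completion rule (Python standing in for the circuit; None means "hasn't arrived yet"):

    def and_output(inputs):
        # A zero on any input that has arrived decides Z = 0 immediately.
        if any(v == 0 for v in inputs if v is not None):
            return 0
        # Otherwise Z is known only once every input has arrived high.
        if all(v is not None for v in inputs):
            return 1
        return None   # still waiting -- Z can't be computed yet

    print(and_output([1, None, None, None]))   # None: A0 high tells us nothing
    print(and_output([None, 0, None, None]))   # 0: a single zero completes early
    print(and_output([1, 1, 1, 1]))            # 1: the last input has arrived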

    I'm not an expert in asynchronous circuits, so take what I've said with a grain of salt. I've worked with a few people who explored asynchronous circuits, and this was the state of things at that particular point in time.

    I suppose that if there were compelling benefits to asynchronous logic for microprocessors, we would see them. Power is related to the clock rate: synchronous circuits are always clocked, while asynchronous circuits only clock when they're doing useful work, so power should be lower. And if you can get the overhead for the completion flags low enough, then theoretically you should be able to run faster, etc.

    It seems that asynchronous microprocessors are always "just around the corner".

  • Yes, the Curtas are quite cool. For the acquisitive geeks out there, eBay has a fairly steady trickle of both Type I and Type II units available...
  • This idea makes a hell of a lot of sense to me. Always has. Clock the *results*! Purely combinatorial logic definitely rocks.
    My personal opinion is that we (the open source community) should construct the first practical neurally optimising compiler (which would make this architecture practical), and patent the fucker, so those parasitic capitalist bastards OUT THERE can't get their clammy hands on it.
    This is DEFINITELY doable.
    Ideas, anyone?
  • The SAAB F<number> floating-point co-processor from 19mumble (vague guess: 1968) was all-async.

    It was even used as prior art a few years ago: a US company wanted to patent async FPUs and, guess what, the patent wasn't granted!

  • In the opening, he brashly claims (from memory) "The first great contribution of the West to science is the Calculus; the second is the Algorithm. There is no third."

    Is this refutable?

    Refutable? Not without a strict definition of great. But you hit the nail on the head when you said, "snotty".

    I would say the scientific method, and empiricism itself, qualify as great contributions. Hell, deductive reasoning. And personally, I'd be kind of proud if I'd discovered evolution, or DNA, or atoms.

    Anyway, who exactly does he mean by "the West", and with whom is he contrasting it? I seem to recall the Chinese were developing calculus about the same time as Leibniz and Newton... so maybe the West has had only one great contribution. :-)

    One must wonder what contribution Mr. Berlinski has made to science, to so judge the contributions of others. I guess /. doesn't have a monopoly on technological ingrates!

  • It seems to me that this goes beyond bloated software. Bloated software we can live with, eventually, because it somewhat keeps pace with the advance in computer power. This case is somewhat different.
    Correct me if I'm wrong, but looking at it from a complexity point of view, the algorithm suggested by the author (".... If you need to approximate some ratio, just have the computer try all pairs of gears with no more than 100 teeth. There are only 10,000 combinations; you can churn them out in an instant. For a two-stage compound train, running through the 100 million possibilities is a labor of minutes...") has a complexity of O(a^n), n being the number of stages in a train and a being roughly constant, in this case 10,000. By this reckoning, computing the ratio for a three-stage train would take 10,000 times "[the] labor of minutes", a four-stage train would take at least 10^8 minutes, and so on. Not such an easy task anymore. You can't really just throw calculation power at this, sorry.
    Now, I don't know whether using the 'old' method improves the performance by much, but it seems to. Anyone want to calculate an average complexity for it?
    So maybe this nice, deterministic algorithm (can someone prove it will definitely terminate in a, hopefully, linear number of steps?) is not so horrifyingly futile after all? And maybe, just maybe, it is worth the bother of being clever.
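
    For scale, here's the single-stage brute force the author describes, as a Python sketch (the target ratio is an arbitrary example, not one from the article):

    from math import pi

    target = pi / 10

    # All gear pairs with at most 100 teeth each: 10,000 combinations.
    best = min(
        ((a, b) for a in range(1, 101) for b in range(1, 101)),
        key=lambda pair: abs(pair[0] / pair[1] - target),
    )
    print(best, best[0] / best[1], target)

    # Each extra stage multiplies the search space by another factor of
    # 10,000 -- the exponential blowup described above.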

    just my 2 agorot.
  • Yeah, that's definitely funnier.
  • Look here [algonet.se] for a very barebones web server. It understands as little as it can get away with (HEAD and GET, sort of both HTTP 0.9 and 1.0), but it can do redirection, gunzip files on the fly, and read from named pipes. It doesn't know squat about sockets, so if you want to run it, run it from inetd.

    Why do I mention it? All in all, it's a whopping 4035 bytes.

  • The Swedish Navy developed a programming language in the 1950s called Kvikkalkul, in honour of Plankalkül. The language was entered on 5-bit Baudot paper tape and lacked any alphabetic characters; it used solely punctuation and numerals.

    The code looks like this:

    1030:
    .8 (- ,0
    1040:
    -) :3
    1050:
    .8 (- .8 -/- ,0005
    .8 ( ,49975 -) :2
    .9 (- .9 -/- ,0005
    .9 ( ,49975 -) :1
    :1 -) 666
    -) :1

    It actually had some halfway decent flow control, and even pointers.

    What is ultimately very scary is that the Swedish Navy was still using this language as late as 1991. And code written in the '50s still compiles on the "modern" compilers. Now they have OO and some GUI functions in it, but still no alphabetic characters.
  • http://www.cl.cam.ac.uk/Research/Rainbow/projects/selftimed.html

    Fascinating stuff; it does have the potential to be faster, but processors must be designed in different ways to exploit the varying speed of different operations. Then there's waiting for the RAM ...
  • ...in such a relatively short time, that it's very difficult to comprehend how *different* things were, not too very long ago.

    I went through most of high school with a slide rule; it was a *very* big deal when I bought my first calculator, for hundreds of dollars, from Sears...

    And yet, before that, a whole lot of computational stuff we now take for granted was all done mechanically.

    Check out TIDE-PREDICTING MACHINE No. 2 [noaa.gov]

    "This machine was designed by Rollin A. Harris and E.G. Fischer and constructed in the instrument shop of the U.S. Coast and Geodetic Survey.

    It was completed in 1910 and replaced the Ferrel Tide-Predicting Machine in 1912."

    "The machine summed 37 constituents and was capable of tracing a curve graphically depicting the results."

    Whoa! So you don't have to write down the output! Now that's a feature! But don't laugh! It was necessary to write down the output on previous models.

    "It is about 11 feet long, 2 feet wide, and 6 feet high, and weighs approximately 2,500 pounds."

    And this was state-of-the-art, at the time!

    t_t_b
    --

  • Does anyone remember "Blip: The Digital Game"? It was an electro-mechanical game of Pong for one or two players put out by Tomy. The "ball" was a red LED, and the "paddles" were buttons. Blip used gears to generate random numbers which governed the path of the ball.

    "Digital Derby" was another of Tomy's electro-mechanical games. Even though both games had "digital" in their names, the LED was the only digital thing about them; I suppose you could say they were 1-bit games.
  • A.K. Dewdney had a column in SciAm where he invented the SAC, an analog computer that sorts any finite array instantaneously...
    1. trim the spaghetti pieces in proportion to the values you want sorted
    2. grasp all the spaghetti loosely in your hand and drop it lightly against a tabletop so the pieces are flush with the flat surface
    3. your array is now sorted!


    The preprocessing can be a bitch, but the actual sort itself is near instantaneous :-)
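
    A digital simulation of the SAC, which of course forfeits the whole point -- on the tabletop, step 2 happens in constant time:

    def spaghetti_sort(values):
        rods = list(values)         # step 1: trim one rod per value (the O(n) prep)
        result = []
        while rods:
            tallest = max(rods)     # step 2: on the tabletop, your hand
            rods.remove(tallest)    #         finds this "at a glance"
            result.append(tallest)
        return result               # step 3: sorted, tallest first

    print(spaghetti_sort([3, 1, 4, 1, 5, 9, 2, 6]))   # [9, 6, 5, 4, 3, 2, 1, 1]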
  • From what I've read so far, it seems that for some types of computing, doing it mechanically is faster. So if we shrunk a mechanical computer down in size, would it be possible to have a working computer that we could use the same way we use the ones we have now?
  • This reminds me of an "extra points" question on a final exam in a course on operating systems taught by Vint Cerf at Stanford many years ago. Similar setup, except the bar was some number of light-years long and was rotated at a rate that would make the tip move faster than the speed of light. In this case what happens is that the bar (no matter how rigid) wraps up in a spiral around the axis of rotation, with the result that the tip never exceeds (or, for a real physical bar, even approaches) the speed of light. I put this answer (that the bar winds up around the axis of rotation) on the exam and concluded with "Rigid rods are illegal in physics, Freud notwithstanding" and got 15 out of 10 points extra credit. I think Vint (or perhaps the grader) got a laugh out of my answer :-)
  • As I was reading this and seeing how these people came to the same intellectual conclusions through totally different creativity, knowledge bases, and independent thinking, it became obvious that something like the concept of IP is completely unworkable, because people will undoubtedly approach similar technologies from different paths.

    There is really no way IP could be an incentive to creators, but it can do plenty to inhibit them from applying the labor of their work without fear of penalty.

  • This is exactly the type of thing I love to see. This along with the Brookhaven RHIC article in one day makes me very happy.

    It takes me back to the earlier days of /., before the days of the Four-Letter Crusades (MPAA, RIAA, DMCA). Back when you could still find articles on science and technology instead of <contempt>legal depositions</contempt>.

    (I gotta admit, tho, Valenti's depo was kinda funny, up until I got sick.)

  • Coding began completely as an artform; writing routines "by hand" was commonplace, to achieve maximal performance.

    Nowadays, code is left in its most slapdash form. (Anyone who needs proof, just watch how long MS Office apps take to start up, even on my damn P3-700.) The compiler is given its opportunity to streamline things, but who unrolls loops anymore?

    Here's the interesting parallel: look at any machine made before about 1900. Machines and tools were works of art -- ornately decorated and painted; each hand-crafted part was a thing of beauty. While primitive, to be sure, they will often last forever. Modern machinery is functional and not in the least artistic. Furthermore, it has a real tendency to break down.

    Look at code. Look at the original UNIX, the original Multics, and all the assorted tools on them. ispell, emacs, the Bourne shell, and every other "ancient" program are still around; improved upon, but still beautiful. These old programs will run and keep running, because programmers like Mel put in the love.

    It's not about speed, it's about beauty, and I think this is something that will be less and less evident in software to come, because programmers are getting lazy and drunk on the fast processing power available to us.

    -- Aaron Kimball

  • >What's considered a practical limit on the length of gear trains used for watches, mechanical computers, etc?

    >If trains of four, five, six, etc gears are mechanically practical, the computational problem of choosing their ratios still seems interesting.

    I might have misunderstood you, but you seem to have overlooked that gear trains are normally driven from the bottom (i.e. to produce a slower rotation at the far end) and not from the top. Thus trains of even 9, 10, 12, or 20 gears are not only mechanically practical, but not very difficult either.

    So yes, there is still some scope for interesting computation. On reflection, though, the computation of gear trains is interesting largely because it has solid practical application -- and today, any application requiring an accuracy that would take a 20-step train to achieve is going to be done digitally anyway.
  • by mbrubeck ( 73587 ) on Saturday June 17, 2000 @04:15PM (#995480) Homepage
    This reminds me of the classic math joke about a mathematician at a conference. After attending a number of presentations on number theory, abstract algebra, and so on, he was starting to feel that his knowledge was too esoteric, too divorced from the real world. So when he saw a talk entitled The Mathematical Theory of Gears, he thought, "This looks like something that'll help me get in touch with concrete applications."

    So he went into the auditorium and sat down. The lecturer began: "The theory of gears with a finite number of teeth is well-understood. However..."

  • There is a lot about factoring in the article. Is Brocot's AlGoreRhythm potentially useful to speed factoring?


    The regular .sig season will resume in the fall. Here are some re-runs:
  • I was surfing around a little while back when I decided to see if I could find a slide rule for sale somewhere. (Yes, I AM a nerd. :) During my search I ran across references to a mechanical calculator called the Curta.
    According to this [bath.ac.uk] web page, the Curta was designed and built by a gentleman named Curt Herzstark of Austria. Although several prototypes were made, production began in April 1947. The last Curta was made in November 1970, but they were still sold until early 1973. Over the course of about 20 years, approximately 80,000 Curta I and 60,000 Curta II units were constructed.
    Additional links, articles, and pictures of this awesome little device can be found here [teleport.com] and at Curta.org [curta.org].
    I gotta say, the Curta is one sexy little calculator :) Thanks to Bruce Flamm at the first link for some of the info.
  • On the whole "here are some other examples" thread, IMHO some of the coolest gear-based computational devices are the various orreries that have been built.
    (And there's more of them to look at than number-calculation devices :-)
    And for an orrery, virtually every gear ratio is an approximation of a ratio that can't be realized exactly, so I found the article of particular interest, because I'm currently working on one at home. (Though it's a desktop sort of thing, I aspire to eventually do something along the lines of Aughra's awesome device [130.126.238.131] <grin>)

    The only real link I've got on hand is this one: Brian Greig's Orrery Page [geocities.com]
    (He makes orreries for museums, collectors etc, and some of them are pretty cool :-)

    BTW, for those who haven't seen much of these things, an orrery (named after the Earl of Orrery, who commissioned one of the first built) is a device that shows the motion of the planets to scale (but not the sizes of the planets to scale...). And like the calculation engines, orreries today are mostly done in software.
    If you know a bit about the complexities of planetary motion (eg non-circular orbits, inclined orbits in which the plane of inclination drifts or rotates), seeing the various means of incorporating these aberrations into a clockwork model is quite fascinating.

    One particularly nagging thing about the article was the assumption that the problem is finding the best gear ratio. Ha! The best ratio might be 103:17, but have you ever tried to find a gear cutter? The last one I saw was in a museum (I must have been a pathetic sight -- pressed up against the glass like a kid outside the candy store...), which means I have to buy manufactured gears. Which means finding the best gear ratio out of the gears available to me. Sure, it cuts down on the computation, but you need to make a longer gear train to get even remotely close :-(

    Ah well.
    It seems a shame that the skills and tools of so many of these crafts are dying or dead (if only because they could make amazing things that modern manufacturing methods are simply incapable of producing).

  • Well, electrons are almost always faster than mechanical devices.

    Take an infinitely rigid bar (or a near-infinitely rigid one) which is about four light-years long and weighs almost nothing (a few pounds, maybe a few tons), and put it between Sol and Alpha Centauri. Wiggle it back and forth, and you have instantaneous Morse code across four light-years.

    It has to be nearly infinite in its rigidity or it will bend and the girl on the other side won't see movement, and it has to be light-weight or you will have trouble keeping it straight (a few pounds on a two-light-year moment arm will be a pain to keep straight).

    This is the best example I know of where mechanics wins over electronics/optics/sub-space/new-tech.

    Louis Wu

    Thinking is one of the hardest types of work.

  • Since there is no processing involved

    The question is about computers - if there is no processing, there is no computation, hence it isn't a computer.

    But a good example.
    --

  • Cal Poly, San Luis Obispo. The student section [calpoly.edu] of ASME [asme.org] (American Society of Mechanical Engineers) has had gears on four of the last five T-shirts we have made.

    Disclaimer: I was the Chair of the club last year.

    Louis Wu

    Thinking is one of the hardest types of work.

  • >Is that Aughra's orrery for real? It looks like the product of a mad scientist!

    Yes and no. It's from the film "The Dark Crystal" and depicts a fictional solar system; however, they built the thing full size and motorised it, etc. So it's real in the more important sense of "I could have something like that in my bedroom".
    (A good defense against burglars :-)
    A picture of Aughra:
    http://130.126.238.131/Sean/movies/dark_crystal/pictures/Aughra.GIF
    Lots of Dark Crystal pics:
    http://130.126.238.131/Sean/movies/dark_crystal/dc_pictures.html
    Dark Crystal has just been re-released on DVD, so if you want to see the orrery in motion, you know what to do. (I also recommend "Labyrinth" -- made by many of the same people, and a better film as well.)

    >I suppose suspension of the bodies would be a problem.

    Superconductors repel magnets, so if you could make a superconducting disk (cooled from beneath) several metres wide...
    (And you'd get Tacky Mysterious Fog drifting off the device in the bargain :)

    On the other hand, having the huge brass rods and crescents and stuff has an appeal of its own.
  • Unfortunately, I was already "engaged" with CS at that time.

    But yes, I realized that I wanted to study CS because it looked like art, and yet it was something practical. But after the Software Engineering course (what! is it ENGINEERING after all?) I was sadly aware that CS was headed for the "no artists here, just things that work well enough" paradigm instead of the "pauca sed matura" (few but mature) paradigm that was the motto of C.F. Gauss, my favorite mathematician.

    oh, well!

  • Yup, I remember tearing apart my Blip as a kid and just watching the gears and pinions move around. (It was pretty easy because the 'ball' only moved in a straight line and only changed angles in the middle, and there were only three end points on each end. It also was kinda buggy, iirc.)

    It is a strange relic though, especially because electronic games were already popular (ohhh, Mattel Football..)
    --
  • When pushing the bar, you basically compress the near end of it. This compression moves along the bar like a longitudinal wave until the remote end moves. For small bars or very rigid bars this looks almost instantaneous, but in your example it would actually take more than 2 years until the remote end moves.

    If you were able to make an infinitely rigid bar you might get instantaneous communication (infinite speed of sound), but the trouble is, there is no such thing as an infinitely rigid bar...
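
    Rough numbers, assuming (my choice, for concreteness) the bar is steel:

    # Longitudinal sound speed in steel is roughly 5,960 m/s; even that
    # generous stand-in for "very rigid" makes the signal crawl.
    c_sound = 5.96e3                   # m/s
    light_year = 9.461e15              # metres
    seconds_per_year = 3.156e7

    years = 4 * light_year / c_sound / seconds_per_year
    print(round(years))                # on the order of 200,000 years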

  • Yup. I think this stuff is pretty interesting; it would enable a nice interactive 3D demo of a working watch. Brute force ain't a solution in the new world of immersive 3D systems -- the interfaces that are evolving to consume all these processing cycles. In the old days, it used to be fun to replace the Unix idle code with an approximator for pi, i.e. when your machine wasn't doing anything else, it found digits of pi. Nowadays, when it's idle, it uses pi a gazillion times to draw 3D things all over your screen. In theory, the "standard user experience" is fundamentally driven by systems where the full capacity of the system is exploited to make the user's life easier, more productive, whatever.
    So brute force is a cheap temporary fix for those brief periods when user demands don't exceed machine specification. Most of the time, software is scrambling to keep up. In my world, knowing how the mechanism of gears works and successfully communicating it to a machine means the machine can now do it efficiently for many cases. Brute force means I can't maintain frame rate.
    When window-based systems became popular, awesome DOS machines turned into absolute dogs under Windows, because the demands on them went up an order of magnitude. The same thing will happen again as 3D moves from games into the general environment; once more it's gonna be shaving cycles.
    Given this, the only justification I've ever seen for brute force is as a temporary solution while you make the real one, or because it's an NP-complete problem and you're just gambling anyway. Over the effective lifetime of software, I'd argue that a knowledge-based solution always outlives a brute-force solution.
    I guess I'll end my rant now. It's just that I think anyone who buys that is by definition a dinosaur, i.e. they have only the present moment, no future at all.

  • Gee, ahh, duhhhh. I had this bad thought too late, went and checked, and yes -- if anybody's seen my sense of humor, please return it to me. I have stenciled the meaning of sarcasm on my right palm and do understand that the initial poster is not recommending we all become porcine programmers wallowing in wasted cycles.
  • I personally enjoyed the comparison in Stephenson's Cryptonomicon much more. This one is definitely "geared" more toward math majors and actual practicing mathematicians than the average Slashdot geek. Very cool, but the guy needs to work on his prose, IMHO.
  • by Grant Elliott ( 132633 ) on Saturday June 17, 2000 @03:00PM (#995494)
    I did some research in the field of mechanical computers a while back. (Right after I built my own : ) It's an interesting field. Anyway, I thought someone might want to see these related sites on the history of mechanical computers.

    The History of Mechanical Computers [best.com]

    Early Calculators [hpmuseum.org]

    Zuse [geol.uib.no]

  • (Who has a weird sort of cult following... I remember, when I was doing a lot of amateur archaeology in Austin, finding the contents of this site (or something similar) and reading the whole thing in fascination.)

    timothy
  • In certain applications, like raising an arbitrarily large number to a very high non-integer power, if one were to create a very specialized machine, would it be possible to get that high accuracy in less time? Of course, you'd probably need a machine that could spin at a very high speed in order to get the necessary clocks, but for numbers that can be expressed rationally yet are very large, it could -- at least to my semi-limited knowledge -- be a savings of time.
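
    FWIW, the trick such a machine would most likely mechanize is the slide-rule identity: do the work in log space. A sketch of the arithmetic, nothing more:

    import math

    # x**y = exp(y * ln(x)): a non-integer power becomes one multiplication
    # plus two "scale lookups" -- exactly what a slide rule does mechanically.
    x, y = 123456.789, 2.5
    print(x ** y)
    print(math.exp(y * math.log(x)))   # same value, up to rounding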
  • Keep away from the wheels! They have TEETH. We ask also that you keep any small children away from the wheels, especially while they are in motion. Even though the wheels MAY look harmless enough, they have consumed a fair amount of mathematicians ('s time?).
    Also, please don't feed the wheels: they become enormously fat and refuse to reproduce. We have been issued various legal threats from auto manufacturers about this. You WILL BE FINED.
    Thank you. (forgive my spellign!)
  • Isn't Jack Valenti [mpaa.org] one of these units? I mean, he's a big wheel in the MPAA, and he seems to be biting a lot of people on the ass as of late.
  • by Guppy ( 12314 ) on Saturday June 17, 2000 @04:31PM (#995499)
    "Working through examples of Brocot's process by hand, and leafing through the pages of the printed Brocot table, leaves me feeling wistful and uneasy. The ingenuity and diligence on exhibit here are certainly admirable, and yet from a modern point of view they are also tinged with a horrifying futility. I am reminded of those prodigies who spent years of their lives calculating digits of the decimal expansion of pi--a task that is now a mere warmup exercise for computer software..."


    After reading through the American Scientist article, I suddenly found I wanted to re-read The Story of Mel [tuxedo.org], the tale of a programmer's programmer from an era gone by. Our old-timers often lament the extinction of code laboriously hand-tuned to run tight and fast on elegant machines from days gone by -- and those days have been gone only a few decades. The gear makers worked their craft a century or more ago.

    Today, sometimes I wonder, what was the point? Why not just shovel in and ship out the first thing that works? A year and a half from now, the hardware will be twice as fast, and probably cost half as much. The software we wrote will be obsolete, as will be the hardware it ran on.

    But maybe it does matter. It would be a terrible thing if our descendants did not surpass us. But even as they gaze back upon us from those lofty, distant heights, maybe we can give them a reason to listen to how it was done in the Good Old Days.



    "Lest a whole new generation of programmers
    grow up in ignorance of this glorious past,
    I feel duty-bound to describe,
    as best I can through the generation gap,
    how a Real Programmer wrote code..."

  • I'll second that -- Konrad Zuse's work on electromechanical computers is fascinating. He developed the first working programmable computer in Germany during WWII, with bombs falling and severe shortages of materials. (At one point he "liberated" copper from a public power line for his machine, later realizing that the police/soldiers would have killed him on the spot if they'd caught him.)

    I recommend reading his autobiography The Computer - My Life [amazon.com] . It's entertaining and informative (unlike many autobiographies).

  • After reading about mechanical computers, I recalled having learned about asynchronous computers. Have asynchronous computers been built that run a lot faster than conventional clock-controlled computers? Since events in an asynchronous computer are triggered by a previous event, and not by a central clock, it would seem to me that they could really scream, if the timing issues could be worked out.

    ---------------
  • Reminds me of math papers written this year that still have "example Fortran code". Errrrr...

  • After reading this article, I went to look things up at Eric Weisstein's World of Mathematics [wolfram.com] (courtesy of the makers of the great but ridiculously overpriced Mathematica).

    Anyway, here are some quotes from articles in mathworld related to the original article:

    Stern-Brocot tree [wolfram.com]. "A special type of binary tree obtained by starting with the fractions 0/1 and 1/0 and iteratively inserting (m+m')/(n+n') between each two adjacent fractions m/n and m'/n'. The result can be arranged in tree form as illustrated above. The Farey sequence Fn defines a subtree of the Stern-Brocot tree obtained by pruning off unwanted branches (Vardi 1991, Graham et al. 1994)."
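
    (A minimal Python sketch of that mediant construction, for the curious:)

    # Grow the tree level by level: insert the mediant (m + m')/(n + n')
    # between every two adjacent fractions.
    def next_level(seq):
        out = [seq[0]]
        for (m, n), (m2, n2) in zip(seq, seq[1:]):
            out.append((m + m2, n + n2))   # the mediant
            out.append((m2, n2))
        return out

    level = [(0, 1), (1, 0)]               # start with 0/1 and 1/0
    for _ in range(3):
        level = next_level(level)
    print(["%d/%d" % f for f in level])
    # ['0/1', '1/3', '1/2', '2/3', '1/1', '3/2', '2/1', '3/1', '1/0']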

    Gear curve [wolfram.com]. "A curve resembling a gear with teeth, given by the parametric equations x = r cos t, y = r sin t, where r = a + (1/b) tanh(b sin(nt))."

    Phi, the golden ratio [wolfram.com]. "A number often encountered when taking the ratios of distances in simple geometric figures such as the pentagram, decagon and dodecagon. It is denoted [phi], or sometimes [tau] (which is an abbreviation of the Greek "tome", meaning "to cut"). [phi] is also known as the divine proportion, golden mean, and golden section, and is a Pisot-Vijayaraghavan constant. It has surprising connections with continued fractions and the Euclidean algorithm for computing the greatest common divisor of two integers."
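
    (And the continued-fraction connection in a few lines -- the convergents of phi are ratios of consecutive Fibonacci numbers:)

    # phi = [1; 1, 1, 1, ...], so each convergent is F(n+1)/F(n).
    a, b = 1, 1
    for _ in range(20):
        a, b = b, a + b
    print(b / a, (1 + 5 ** 0.5) / 2)   # both ~1.6180339887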

    (Note: the above quotes from mathworld fall under the definition of "fair use". Please don't sue me!)

  • These Brocot trees are also known to mathematicians as Farey sequences.
    The Farey sequence is known to naturally enumerate the buds of the Mandelbrot set. Julie Tolmie at math anu edu au has just finished a PhD thesis [anu.edu.au] in this area.
    Check it out; lots of great pictures.
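
    If you want to play with them, here's the standard next-term recurrence for walking a Farey sequence (a Python sketch):

    def farey(n):
        # Yield the Farey sequence F_n from 0/1 to 1/1, left to right.
        a, b, c, d = 0, 1, 1, n
        yield (a, b)
        while c <= n:
            k = (n + b) // d
            a, b, c, d = c, d, k * c - a, k * d - b
            yield (a, b)

    print(["%d/%d" % f for f in farey(5)])
    # ['0/1', '1/5', '1/4', '1/3', '2/5', '1/2', '3/5', '2/3', '3/4', '4/5', '1/1']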
  • No. Factoring is hard (we suspect it is, anyway).
    What this algorithm is good for is understanding the _geometry_ of ratios and, more importantly, the geometry of multiplication of ratios.
  • Julie Tolmie has just submitted a Ph.D. thesis all about visualization of these numbers, also known as the Farey sequences (it's what Knuth calls them!). Read about it (or just look at the pictures) here. [anu.edu.au]

  • Mechanical computers can be faster, but unfortunately the setup time is far worse. The consequence of this is that you can demonstrate a fast analog solution to a simple case of a hard problem, but you lose on the setup time as the solution space gets larger.

    Instantly finding the largest number in a set, where the numbers are represented by lengths of uncooked spaghetti, is as easy as standing them on end and holding the longest while letting all the others fall away -- but after a few thousand different numbers, accurately trimming all that spaghetti becomes difficult.

    I'd like to know how mathematically well-supported this idea is. I regard it as a basic theorem, because otherwise it seems to allow easy solutions to NP-complete problems, but I would love to see it investigated.
  • For small bars or very rigit bars, this looks almost instantenous, but for your example, it would actually take more than 2 years until the remote end moves.

    It seems that the sound waves in the finitely-rigid bar are traveling at twice the speed of light. Alpha Centauri is about 4 light-years away, and if it took only 2 years for the wave to get there ...


    Louis Wu

    Thinking is one of the hardest types of work.

  • With so much calculating power at your fingertips, it's hardly worth the bother of being clever.

    I think I've read a few /. articles that say the same thing about programming languages and the hardware they control. With 80 bazillion cycles/second, nobody really cares anymore about how efficient their code is. With a "standard" computer having 64 megs of RAM (vs. 64K), nobody seems to really care whether their software is bloated or not.
    Just imagine the type of software we would see if software engineers, like ourselves, actually tried to be clever.
  • As long as each successive gear chain "link" reduces the final ratio further, it should be infinite. Rather, it *could* be infinite, as in, an infinite number of gears.
    This would make an interesting physics problem. I'm sure it's already been done somewhere.
    You'd be adding up a series of terms based on the drive ratio of each pair of gears (where each driven gear has a pinion attached that drives the next driven gear), figuring out the speeds, and thus the amount of power required to turn each gear, which depends on how fast that gear is turning.
    So, as the number of gears approaches infinity, what does the function for the required power input look like? Assuming it's a simple scenario where all of the driven/pinion gears have the same ratio.

    Now if you're increasing the speed with each gear, the power required will skyrocket, friction will take over, and you'll break your Legos. Fortunately, theoretical physicists and mathematicians only have to deal with massless, unbreakable gears with precisely known friction functions...
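
    A toy model of that limit (my own assumption, not from the thread: each mesh passes along a fixed fraction of its input power, the rest lost to friction):

    # If every mesh transmits a fraction e of its power, an n-stage train
    # delivers e**n of what you put in -- so the input power needed to keep
    # the far end turning grows geometrically with the length of the train.
    def input_power_needed(n_stages, p_out=1.0, e=0.95):
        return p_out / e ** n_stages

    for n in (1, 10, 50, 100):
        print(n, input_power_needed(n))
    # roughly 1.05x, 1.67x, 13x, 169x -- friction wins long before infinity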
  • Is that Aughra's orrery for real? It looks like the product of a mad scientist! I wish I could acquire the metalworking skills necessary to create something like that.

    One other question: would it be possible, or make sense at all, to try to make an orrery that used some other force (say, magnetism) to simulate gravity? I suppose suspension of the bodies would be a problem.
  • Many years ago, I remember having a set of Lego Technics which included a bunch of gears. I had a good time building gear trains, but only up to a certain size -- ultimately, it became too hard to move the gear train.

    What's considered a practical limit on the length of gear trains used for watches, mechanical computers, etc? At some point, the amount of force needed to overcome the static friction of the shafts would break the teeth off the gears first. Even before that, the force required to keep everything turning might be too high for a spring/watch battery/other power source to produce for any useful length of time.

    If trains of four, five, six, etc gears are mechanically practical, the computational problem of choosing their ratios still seems interesting.
  • Uuuuuuuh... in case you haven't noticed, this is an article from American Scientist. It's not intended for the "average Slashdot geek", whatever that is.

    And frankly, it's nowhere near as math-deep as it could have been; it left me wanting more mathematical detail, and I'm a mere "computer scientist".

    "The guy"'s prose is fine; he weaved the story rather well, reminding me more than once of James Burke's Connections column on SciAm. He also did a good job of explaining the mathematics; I think anyone with more than an elementary understanding of it (i.e. who knows what a fraction is) would have had a pretty easy time reading the article.

    I would have hoped that "the average Slashdot geek" (which is what you seem to consider yourself to be) would have a firmer grasp of basic math, especially considering that it's where all of "computer science" comes from. Maybe I'm just an optimist. Ah well.

  • Ouch. Can I have my pride back?
  • In the 1940s, Zuse also designed "Plankalkül", which is widely considered to have been the first algorithmic language. It has some of the features characteristic of today's high-level languages.

    The paper The "Plankalkül" of Konrad Zuse: A Forerunner of Today's Programming Languages [Bauer and Wössner, Mathematisches Institut der Technischen Universität München] is available [tuxedo.org] in HTML form at Eric Raymond's Retrocomputing Museum [tuxedo.org]. It describes Plankalkül in excruciating detail... a very fun read (if you're into ancient and bizarre programming languages, that is).
  • It was pretty cool seeing Viswanath's name in the article since I took a class he taught on formal languages last fall.

    Getting back on topic, a lot of the sciences have had deep interplay with society. For example, some important thermodynamic ideas and relationships were discovered by people trying to perfect beer brewing (I believe it was Joule, but I'm not sure). Another example is theology in northern England, which was one of the major influences on the development of the idea that energy is conserved in physics. A similar connection exists between quantitative science and accounting.

    Despite our superficial impressions, scientists have often used influences and concepts from society at large to formulate their theories. In this respect, the sciences are socially constructed. However, they are not entirely social constructions, regardless of what some say.

  • Analog computers were widely used for fire control problems, such as aiming anti-aircraft guns or firing torpedoes. The earlier systems were electro-mechanical, with gears, motors and cams. The fire control systems on battleships were supposed to have been very complex mechanical computers. I've never found a detailed description of them, probably because their design was considered to be an important military secret. The mechanical bits were largely replaced with operational amplifiers, and then with digital computers. I believe there are still some analog optical computers used for target recognition in missile seeker heads.
  • would it be possible to create a machine that could give the high accuracy in less time?

    It is almost certainly possible to build a mechanical device which can do certain specialized computations faster than the general-purpose computers we have now. The real question is "is it practical?" The answer to this is most likely "no." And even if it is, an electronic counterpart to the machine could be built to be even faster than the mechanical device. You may be able to escalate this for a few more iterations, but the electronics will win out -- you'll always be able to move electrons faster than gears, levers, &c.
    --

    This one is definitely "geared" more toward math majors and actual practicing mathematicians than the average Slashdot geek

    Eh, I don't know where you went to school, but the math in this article doesn't go beyond high-school level. At least not where I went to high school.

    I'm not sure what you mean by the average slashdot geek. Just because it doesn't mention Linux or free software doesn't mean it ain't interesting.

    -- Abigail

  • A wonderful article. Thanks to the poster for ferreting out such a gem.
    I was especially struck by the last paragraph, in which the author points out that computers can now solve the same problems by brute force.
    It gave me hope. I had always worried that, at the rate our knowledge base is growing, a day would come when no one could be competent in his field -- i.e. it would take up all of one's life to get competent in a field before one could actually do some new work.
    But as this example points out, in years to come we just won't have to master the old basics; we can hand them over to a brute-force machine and concentrate on coming up with something new.
    Incidentally, the author mentions fractals, but I was not able to find any reference connecting Vibonacci numbers and fractals. Can somebody help me out?
  • Thanks so much for mentioning The Story of Mel.
    That *has* to be my absolute favorite part of the Jargon File. I was just mentioning it to a bud of mine whom I'm attempting to seduce away from the Dark Side. I'll shoot him the URL. Hope he appreciates it.
  • "...Many years ago, I remember having a set of Lego Technics..."

    After reading your post, I could only wonder: how would Legoland history be different if Mindstorms had never been invented? Imagine an alternate future in which the Lego Babbagestorm basic kit comes with 10,000 pieces (9,900 of which go into the Difference Engine).
  • Nah; given the current Lego trend toward sets with a few specialized pieces that can't really be used anywhere else--as opposed to the original vision of building things out of hundreds of generic pieces--the Babbagestorm set would come with about 5 pieces that couldn't be used anywhere else.
