Satellite's Circuits Emulate Nervous System

desslok writes "A new type of attitude-control system will be put to the test this August when it is launched into orbit as part of the Swedish Hugin satellite. The new board, developed by researchers at Los Alamos National Laboratory, in N.M., comes directly out of research that uses analog electronics to simulate the nervous systems of real animals."
  • I thought the all-important question was:

    Is it based off of alien technology out of Roswell? -NG


    +--
    Given infinite time, 100 monkeys could type out the complete works of Shakespeare.
  • It sounds very much like the kind of force-feedback stuff that was done before digital got popular. Is this really new? OTOH, has anyone heard anything more about the guy using Field Programmable Gate Arrays (FPGAs) programmed with genetic algorithms to generate extremely efficient feedback/neural-net-type circuitry on a single chip? I think the article I read was in Discover Magazine some months ago (even stripped of the typical Discover over-hype, it sounded quite interesting, and quite a bit more newsworthy than this).
  • ... someone with a clue has built this sort of thing.

    This will be really handy for exploring the likes of Mars, dismantling old nuclear reactors, and who knows what.

    I wonder if anyone's simulated this type of "behaviour" on a (digital) computer? Imagine if such a program, responding to external stimuli, could reprogram itself in response and "grow more circuits", i.e. learn.

    Perhaps we have the first tentative steps to passing the Turing test?

    Time for humans to sit back and let the machines do all the work.

    Well, maybe not.
  • It's not related to fuzzy logic. You're correct in that fuzzy logic uses analog values rather than discrete ones. Normal Boolean logic assumes a statement is either true or false. Like a simple thermostat: if the temperature falls below some threshold, turn on the heat:

    if (temperature < 60) {   /* is it cold? */
        turn_on_the_heat();
    }

    Fuzzy logic assigns a value for the truth of the statement. Rather than just being cold there is a degree of cold, and there can be a degree of hot. So, based on the relative strengths of the two assertions, you can control the amount of cooling or heating provided in a continuous manner.
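
    A minimal sketch of that in C (the membership ramps and thresholds are invented for illustration, not taken from any real controller):

    #include <stdio.h>

    /* Ramp membership: fully "cold" at or below 50F, not cold at all
       at or above 70F, linear in between. */
    static double cold_degree(double t) {
        if (t <= 50.0) return 1.0;
        if (t >= 70.0) return 0.0;
        return (70.0 - t) / 20.0;
    }

    static double hot_degree(double t) {
        if (t <= 70.0) return 0.0;
        if (t >= 90.0) return 1.0;
        return (t - 70.0) / 20.0;
    }

    int main(void) {
        for (double t = 40.0; t <= 100.0; t += 10.0) {
            /* Heating power follows "cold", cooling follows "hot",
               giving continuous control instead of an on/off switch. */
            printf("%5.1fF: heat %3.0f%%, cool %3.0f%%\n",
                   t, 100.0 * cold_degree(t), 100.0 * hot_degree(t));
        }
        return 0;
    }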

    A lot of respectable engineers feel that fuzzy logic is bunk; look up some of Bob Pease's articles in Electronic Design for some not-so-favourable reviews. A lot of others don't.
  • The use of FPGAs and other reconfigurable hardware along with genetic operators on the configuration is broadly lumped under the category of "Evolvable Hardware" (a rough sketch of the evolutionary loop is below), and most of the people doing cutting-edge work in this arena will be at the GECCO conference down in Orlando next month. You can expect a few news blurbs based upon papers presented at this conference soon after...

    BTW, the guy who did a lot of the initial work on this, whom you are probably thinking of, is Adrian Thompson. Alternatively you could have been thinking of Hugo de Garis and the smoke-and-mirrors game he has been running with the press for several years related to his CAM-Brain project (this is a variation of EHW whereby a cellular automaton is used to generate random neural pathways/connections and then the system evolves the weightings).
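
    Roughly, the evolutionary loop over configuration bits looks like the C sketch below. The fitness function is a stand-in that just counts set bits; in real evolvable hardware it would program the chip with the bitstring and score the resulting circuit's behaviour (crossover is omitted for brevity):

    #include <stdio.h>
    #include <stdlib.h>

    #define POP  32    /* population size */
    #define BITS 64    /* configuration bitstring length */
    #define GENS 200   /* generations */

    /* Stand-in fitness: count set bits. Real EHW would load the
       bitstring into an FPGA and measure the configured circuit. */
    static int fitness(const unsigned char g[BITS]) {
        int f = 0;
        for (int i = 0; i < BITS; i++) f += g[i];
        return f;
    }

    int main(void) {
        unsigned char pop[POP][BITS], next[POP][BITS];
        for (int i = 0; i < POP; i++)
            for (int j = 0; j < BITS; j++)
                pop[i][j] = rand() & 1;

        for (int gen = 0; gen < GENS; gen++) {
            for (int i = 0; i < POP; i++) {
                /* Tournament selection: keep the fitter of two. */
                int a = rand() % POP, b = rand() % POP;
                int w = fitness(pop[a]) >= fitness(pop[b]) ? a : b;
                for (int j = 0; j < BITS; j++) {
                    next[i][j] = pop[w][j];
                    if (rand() % 100 < 2)   /* ~2% point mutation */
                        next[i][j] ^= 1;
                }
            }
            for (int i = 0; i < POP; i++)
                for (int j = 0; j < BITS; j++)
                    pop[i][j] = next[i][j];
        }

        int best = 0;
        for (int i = 1; i < POP; i++)
            if (fitness(pop[i]) > fitness(pop[best])) best = i;
        printf("best fitness: %d/%d\n", fitness(pop[best]), BITS);
        return 0;
    }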
  • by Irish96 ( 8810 ) on Monday June 14, 1999 @12:47PM (#1851326)
    The article mentions Mark Tilden and his colleagues as the pioneers of this technology. Some of you might remember Mark Tilden as the creator of BEAM robotics:

    http://nis-www.lanl.gov/robot/
  • > I wonder if anyone's simulated this type of "behaviour" on a (digital) computer? Imagine if such a program, responding to external stimuli, could reprogram itself in response and "grow more circuits", i.e. learn.

    Actually, the only really revolutionary thing here is that they did it without simulating it on a digital computer. Systems that emulate neural behaviours have been studied at AI labs and universities for several decades now. One special point of interest has been the growing of neural networks that you mention, but so far the results have not been very encouraging. Only in the last couple of years has the growing of neural networks with genetic algorithms produced results that actually solved some problems, but simulating evolution may take some time... (like 5 billion years or so)

    > Perhaps we have the first tentative steps to passing the Turing test?

    Not likely that this will come from this side of AI research; from a bug to a human is a long way. Some results in this direction are soon to be expected from the language side of AI. (I hope)

    > Time for humans to sit back and let the machines do all the work.
    > Well, maybe not.
    You were probably sitting back behind your computer when you wrote this :)
  • by substrate ( 2628 ) on Monday June 14, 1999 @01:35PM (#1851328)
    This Digital Nervous Network is based on work by Mark Tilden, as the article mentions. What wasn't really mentioned (aside from it being built from el-cheapo parts) is that there is a large hobbyist community that builds these things.

    Essentially they're based on wiring an even number of inverting stages together. Normally this would settle on some ugly analog value that the gates really aren't designed for. By letting motors perturb the gates' inputs via RC coupling, the outputs of the gates fall into patterns of digital signals. In the proper conditions these signals can be amplified to drive motors in a walking-type motion. Further perturbations change the gait of the walk, etc. (Toy simulation below.)
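
    Here is a toy discrete-time model of such a ring, a four-stage "microcore". The decay rate and threshold are invented, and the real circuit is continuous analog, but the circulating pulse that drives a gait is the point:

    #include <stdio.h>

    #define N 4   /* even number of inverting stages in the ring */

    int main(void) {
        /* Each stage holds a decaying charge; while charged, its
           inverter output is asserted. Kick stage 0 to start. */
        double charge[N] = { 1.0 };
        const double decay  = 0.90;  /* per-step RC discharge */
        const double thresh = 0.10;  /* inverter switching point */

        for (int t = 0; t < 100; t++) {
            for (int i = 0; i < N; i++) {
                int active = charge[i] > thresh;
                putchar(active ? '#' : '.');
                charge[i] *= decay;
                /* The falling edge couples a fresh pulse into the
                   next stage, so one pulse circulates the ring. */
                if (active && charge[i] <= thresh)
                    charge[(i + 1) % N] = 1.0;
            }
            putchar('\n');
        }
        return 0;
    }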

    The community is called BEAM robotics [lanl.gov].

  • The guy I was thinking of was Adrian Thompson, and out of curiosity, I went to Discover's site and looked in the archive. The article in question is archived here [208.226.13.177].

    Jim, since you sound pretty savvy on the subject, maybe you could make sure something gets submitted to /. after the GECCO conference? I for one would be very interested, and it certainly is news for nerds.

  • How about StarBridge Systems' HAL computer [starbridgesystems.com]? It was on Slashdot a while ago (where are the archives?). Remember, it outperforms the largest supercomputer, yet sits on a desktop. Also, you could fire a bullet through it and it would keep running. It uses FPGA circuits.

    The guy behind this is named Kent Gilson. We haven't heard much from him lately.

    (I think you could even run Linux on it.)
  • by Anonymous Coward
    I find it interesting that this is the first BEAM concept to make it to orbit, especially considering all the stuff that went into pixelsats.
    (For the layman, a pixelsat was a concept for a simple spy satellite with very low weight, cost, and resolution. The idea was to launch a whole bunch and network them -- Beowulf imaging, if you will. The thing that had to do with BEAM was that they utilized a device known as a solarengine.
    Imagine a way to amplify and store electricity from solar panels to run a system intermittently in lower-than-optimal light. It's a pretty simple circuit that can be built from cheap components that are fairly space-hardy without modification. The pixelsats were to use the current from the SE to drive a magnetic coil and maintain position by thrusting against the earth's magnetosphere, a design based upon the MagBot, a pretty little piece of robot artwork that used a slick, thin surface and another magnet to thrust against.)
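
    For the curious, a toy model of the solarengine's charge-and-dump cycle (all numbers invented for illustration):

    #include <stdio.h>

    /* A capacitor integrates a trickle of solar current, then dumps
       into the motor between two voltage thresholds. */
    int main(void) {
        double v = 0.0;                   /* capacitor voltage */
        const double charge_rate = 0.05;  /* volts/step in dim light */
        const double drain_rate  = 0.20;  /* volts/step while running */
        const double v_trigger = 2.7;     /* motor turn-on threshold */
        const double v_cutoff  = 1.0;     /* motor turn-off threshold */
        int motor_on = 0;

        for (int s = 0; s < 300; s++) {
            if (motor_on) {
                v -= drain_rate;
                if (v <= v_cutoff) motor_on = 0;
            } else {
                v += charge_rate;
                if (v >= v_trigger) motor_on = 1;
            }
            if (s % 10 == 0)
                printf("t=%3d  v=%.2fV  motor=%s\n",
                       s, v, motor_on ? "on" : "off");
        }
        return 0;
    }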
  • Seriously, folks, I think we're seeing the beginnings of a renaissance in thought about the incredible capabilities of analog systems in particular applications. (This is not a slam at digital, but the trend over the past decade or so has been to do everything in digital, whether it makes sense or not.)

    Analog processing is simply better for some things than digital, and it's been my firm belief that when technology allowed sufficient miniaturization and flexibility in analog circuitry, we would see a resurgence of interest in analog techniques, especially in areas such as sensing and response.

    My original background is in robotics and automation - one reason I left the field was the nagging feeling that we were being forced to work with a kludge - it was apparent even in the mid-'80s that we needed something like neural systems to significantly advance the state of the art. A hybrid analog/digital system based on Rodney Brooks-style control layers would offer a reasonable promise of a robot that can successfully mimic a rudimentary "intelligence".

    I'm going to go out on a limb and say that we may truly be witnessing the beginnings of a renaissance in analog computing and signal processing. Announcements like the one above, and the fact that people are finally succeeding in building microscopic low-power vacuum tubes (ideal for this sort of stuff), lead me to believe we may see analog applied more and more often in areas where it has real benefits over digital.


  • Technolust. So long as Kate is more important, it's fine with me.
  • The Slashdot article on Starbridge Systems is here [slashdot.org]. You can find all the Slashdot archives at older stuff [slashdot.org].

    As with most of the Slashdot commenters, I don't find Starbridge very credible. I'll believe it when it's sitting on my desk crunching one of my programs and not until then.
  • I'm sorry, it doesn't get anywhere near 165F at 80% humidity in Yuma, AZ. Maybe 130F at 50% humidity. Though I do like the idea of "disposable robots" to do dangerous stuff :)
  • I remember that several years ago the buzzword "fuzzy logic" appeared. Instead of digital signals, communication between modules of systems would use floating-point values.

    Now I can see that this differs from what is described here, as fuzzy logic does not require a neural net. But if I understand the article (and discussion) correctly, the networks described here use analog, not digital, signals. Is that a necessary requirement, or could such a nervous system be built on top of digital signals as well? I am not talking about emulating an analog system using a digital system, which is definitely doable (and probably much more efficient during the development phase).
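
    For what it's worth, emulating one analog "neuron" digitally is straightforward. Here's a sketch of a leaky integrator, dV/dt = (-V + I)/tau, stepped with forward Euler (parameters illustrative, nothing claimed about the actual Hugin hardware):

    #include <stdio.h>

    int main(void) {
        double v = 0.0;               /* "membrane" state */
        const double tau = 0.05;      /* time constant, seconds */
        const double dt  = 0.001;     /* integration step, seconds */

        for (int k = 0; k < 200; k++) {
            double input = (k < 100) ? 1.0 : 0.0;  /* step stimulus */
            /* Forward Euler step of dV/dt = (-V + I) / tau. */
            v += dt * (-v + input) / tau;
            if (k % 20 == 0)
                printf("t=%.3fs  v=%.3f\n", k * dt, v);
        }
        return 0;
    }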
