Science

Haptic Feedback Nanomanipulator 34

Tanner Lovelace alerted me to an interesting nanomanipulator in use at UNC. They've got some interesting work going on right now, but what I found most interesting was their use of a real-time force-feedback manipulator - the only one I've heard about. Check out some of the experiments that have been done with said equipment.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Actually, we don't use armlib anymore (for those of you wondering what armlib was, it was a library for talking to haptic devices; it started out as the library for talking to the Argonne Remote Manipulator, or ARM, hence the name). We now use something called VRPN (Virtual Reality Peripheral Network), written by Russ Taylor, the guy who wrote the nanoManipulator. Its code is freely available here [unc.edu]. It's a library that lets you easily connect and talk to a bunch of 3D tracker/haptic devices.

    Tanner Lovelace
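    The device-abstraction idea behind a library like VRPN can be sketched as a callback-based client; the class and method names below are purely illustrative, not the real VRPN API:

```python
# Minimal sketch of the callback-based device-client pattern used by
# libraries like VRPN.  All names here are illustrative, NOT the real API.

class TrackerRemote:
    """Client-side proxy for a remote 3D tracker/haptic device."""

    def __init__(self, name):
        self.name = name          # e.g. "Phantom0@nano.cs.unc.edu" (made up)
        self.handlers = []        # callbacks fired on new pose reports

    def register_change_handler(self, callback):
        self.handlers.append(callback)

    def deliver(self, position, orientation):
        # In a real library this would be driven by network packets
        # pumped from a mainloop-style call, not invoked directly.
        for cb in self.handlers:
            cb(position, orientation)

reports = []
tracker = TrackerRemote("Phantom0@nano.cs.unc.edu")
tracker.register_change_handler(lambda pos, ori: reports.append((pos, ori)))
tracker.deliver((0.1, 0.2, 0.3), (0.0, 0.0, 0.0, 1.0))
```

    The point of the pattern is that the same client code talks to any device the server exposes, which is what lets one library cover "a bunch of 3D tracker/haptic devices."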
  • To make use of nanoparts there needs to be a power source on the same scale. I know that some research has been done on exploiting the ADP-ATP cycle that cells use for power, but I can't recall a source to cite. The basic structure was a ring of similar molecules with another jutting out the top. As the conversion from ADP to ATP took place, the center one would be driven around the ring. Sorry for so few details; it's been a year plus since I read the story.

    Banfield


  • Quantum mechanical effects are not generally observed in lego, so no tunneling. Nice results from the thing, tho'.
  • This will be an important device once it reaches the market more widely. Little devices moving around everywhere, doing everything in very small ways.
  • OK, so they can move molecules around. Now if only someone had the parts for a remotely controlled nanorobot, they could assemble the thing. Then they'd be able to use the nanorobot rather than their probe tip.

    I'm sure their server will just love the /. effect from people downloading the 10MB video...

  • or not as the case may be...
  • Cool, I always like seeing nifty things done with haptic schtuff. I do hope you meant that this was the first real-time force-feedback device used in nano stuff, 'cuz there are other haptic devices out there. I hope they can use some of the other ones in place of the Phantom too. It's good and all, but it gets kinda sloppy in its response. The magnetic ones have much better response times.
    -cpd
  • Slashdot readers will be interested to note that Warren Robinett, credited as having conceived the nanoManipulator, was one of the chief developers of Adventure. Remember the little "invisible dot" you could retrieve to be able to pass through a wall to read the credits for the game?


    How cool is that?

  • They're on the same pipe as metalab - I don't think they'll *notice*. *grin*
  • Actually, one of our guys is doing his dissertation on how to run the nanoManipulator over the commodity Internet. I think, from what I've heard, it has something to do with sending over reference frames of images and doing image warping between them. See Tom Hudson's web page [unc.edu] for more info.

    Tanner Lovelace
    lovelace@NoSpAm.cs.unc.edu
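    As a toy illustration of that reference-frame idea (hypothetical, not the actual dissertation method), a client can interpolate between two frames received from the server to synthesize intermediate views while waiting for the next update:

```python
# Sketch of hiding network latency by interpolating between two reference
# frames received from a remote microscope.  A real system would warp
# images using geometry; here we just linearly blend toy height maps.

def blend_frames(frame_a, frame_b, t):
    """Linearly interpolate two equally sized height maps, 0 <= t <= 1."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

old = [[0.0, 0.0], [0.0, 0.0]]   # last frame from the server
new = [[1.0, 2.0], [3.0, 4.0]]   # newest frame from the server
midway = blend_frames(old, new, 0.5)   # synthesized in-between view
```

    The appeal for the commodity Internet is that the client can keep rendering smoothly between sparse, late-arriving server frames instead of stalling on every round trip.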
  • by Anonymous Coward
    This is a really diverse bunch of folks. Not only do they collaborate across disciplines in their work, but they are also doing user/collaboration studies on how the system works in those collaborative environments. They are actively trying to understand the psychological and psychophysical aspects of their system to make it an even more effective tool.
  • This is a fascinating project. Wonderful stuff these guys are doing! The haptic device being used by UNC is called the Phantom, which alas costs big bucks. Using cheaper game-oriented force-feedback controllers would be cool, if it would work. A couple of years ago, I did a little research [std.com] into what might be involved.

    A really cool thing would be a piece of client software that knows how to talk to the UNC microscope over the web, and connect it to my force-feedback joystick, showing the image on my browser.

  • See: http://www.cs.utah.edu/vision/virtual_prototyping.html
    (it's currently used for large stuff, but what you actually control could be any size, really)

    On the atom level, they prototyped an artificial eye (lifelike response, persistence of vision) a long time ago, but I don't remember the link ;-)
  • IIRC, there was an article on /. a few months ago about someone who did this very same thing with Mindstorms. I don't recall the URL for the site, but searching through /. might dig it out.
  • Get it at http://www.gravity.org/ [gravity.org]

    Knud
  • Possibly true, depending on your definition of "simulation," but irrelevant. Extension of the same logic would apply to the interaction between the body and the brain, so you could say "You aren't really feeling the cup in your hand, just receiving data from nerves." The probe is in some ways an extension of the human using it, and the data it provides is no more or less inherently misleading than any natural human sense.

    It is certainly not a simulation in a form that causes people to confuse theory and reality. The experimenters certainly understand that the force is amplified and probably inaccurate in other ways, but they can still use the feedback to improve their interactions with the specimen.

    BTW, you are really stretching the definition of "simulation." Is a news video on TV a simulation? Your view through a camcorder? If you look through sunglasses, is that view a simulation? Generally, a simulation is a model created from a theory, not a direct representation of real data, and especially not an interaction with real matter (even if indirectly perceived).
  • Very good work they're doing...

    One of the drawbacks, though, of any tech which adds a new dimension of perceptability to an existing system through artificial means is that humans run the risk of confusing the simulation with the actual object. This is a well-known phenomenon which is studied philosophically and which fluid physicists, telesurgeons, mathematicians, and others who deal often with modeled realities that are becoming increasingly realistic need to take into account in their studies.

    By translating the actual physical phenomenon to another scale and/or another dimension, it is required that one always keep in mind that what they are experiencing is an interpretation. For example, a virus doesn't really "push back" at you with X lbs of force; that is just a simulation of the effect of your teleoperated manipulator coming into contact with the surface membrane. Similar problems exist in visualization, auralization, and other simulation and modeling disciplines.

    I hope that scientists will find ways to understand these interpretive obstacles and teach them to their students, so that good science will not be hindered by errors in translation...
  • Actually, one of the things we've considered is the following: someone, using lego mindstorms, built an atomic force microscope simulator (that is, a device that images the same way as an AFM, by taking height samples along a regular grid). We thought it would be cool to build one of these, hook the nanoManipulator up to it, and use it to explain how an AFM works.

    Tanner Lovelace
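    The sampling an AFM (and such a Mindstorms mock-up) performs can be sketched as a raster scan over a toy surface; the bump function below is invented purely for illustration:

```python
# Sketch of how an AFM images: take height samples along a regular grid.
# The "surface" here is a toy function standing in for a real specimen.

def surface_height(x, y):
    # A single bump centered at (2, 2) -- a stand-in for, say, a particle.
    return max(0.0, 4.0 - (x - 2) ** 2 - (y - 2) ** 2)

def raster_scan(nx, ny, step):
    """Return an ny-by-nx grid of height samples, like an AFM raster scan."""
    return [[surface_height(i * step, j * step) for i in range(nx)]
            for j in range(ny)]

image = raster_scan(5, 5, 1.0)   # 5x5 height map of the toy surface
```

    Whether the probe is a silicon tip or a Lego sensor arm, the imaging principle is the same: the "picture" is just this grid of height measurements.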
  • You are indeed correct. We still have the ARM in house, and it (at least as of last semester) still works. I used it, in fact, in my final project for my "Virtual Worlds" class (taught by the aforementioned Warren Robinett, who is indeed the author of Adventure for the Atari) to simulate a renaissance rapier. The goal was to make the ARM fool you into thinking you were actually holding a real rapier. It worked, sort of...

    Tanner Lovelace
  • Although the use of the smaller Phantom is new, I think the connection of a force-feedback device to an STM dates back a while at UNC. I remember seeing a project at the Foresight Nanotech conference in about '93 or '94 that used the Argonne Remote Manipulator (ARM), a one-of-a-kind device that would have been *much* more expensive to recreate than the $15K or so a Phantom runs.

  • Whether or not video and audio are a simulation does indeed get to the fine points of the definition of simulation; however, I am unaware of any English word which fits the phenomena between direct perception (by mechanisms that are part of a human; your nervous-system analogy does not work, since the probe is not part of the human) and simulation... maybe "translation," or the phrase "indirect perception."

    Incidentally, image and audio manipulation technologies have shown empirically that indirectly perceived information can be unreliable. This gets down to the philosophical questions about how we can trust our perceptions, and why humans rely on analogy and logical reasoning to attempt to understand what they perceive with respect to a known baseline we like to call reality.

    The interpretation of previously imperceptible actual phenomena through mechanical/electronic sensors into something humans can perceive always runs the risk of adding or deleting information which may skew the perception of the event. This reality of machine-mediated perception must be accounted for by humans when reasoning about such phenomena.
  • I don't know why this sort of thing hasn't been commonly done for years. It's only about the first thing everybody thinks of when they hear about pushing atoms around with a microscope tip.

    Call me when somebody makes lego molecules and a tool for the microscope tip that lets you pick them up and snap them together.
  • by turbohavoc ( 79880 ) on Monday September 27, 1999 @08:33AM (#1655937)
    It's really nice to see that nanotechnology research is getting more casual, but there should be more feedback (very good song by Covenant, btw) from Slashdotters on this kind of article. You should realize that this will probably have much more impact on the world over the next couple of decades than Linux beating NT in some webserver test, and this is definitely related to computing: it would make Beowulf clusters with millions of nodes rock. But it's so much more than that.

    This computer age is nothing compared to the possibility of manipulating atoms. OK, I admit I'm a nanotech advocate, but if this technology leads to nanobots the way we think of them today, we could be in our favorite sci-fi movie really soon, with exceptions such as that we won't get repulsorlifts, won't hear sound in space, and won't have any midi-chlorians (or whatever they're called) in our blood, just to take Star Wars as an example.

    I encourage all you sci-fi lovers out there to get into the subject.

    Well, I've written far too much now =)

    STM leads to 'control of atoms', 'control of atoms' leads to nanobots, nanobots leads to 'very funky things'.
  • An amplified (or dampened) representation of a physical measurement, presented to a human sensory organ, is a form of simulation...

    The virus pushes back on the probe with X=0.0000...K lbs of force, but the human is not actually able to feel X=0.0000...K lbs of force.
    The human is not the probe; the human is not really "feeling" the virus. The human is being presented a simulated experience based on measurements taken by an instrument.
  • Speaking of sci-fi, isn't this remarkably similar (in form, even) to the nano-assembler tools used in Neal Stephenson's Diamond Age? A friend of mine and I were talking about the startling nearness (time-wise) of the tech in there (you can figure it out if you check things like historical references and people's ages), but with things like this coming about now, it seems like Stephenson's time-scale (at least on the nanotech part) might be right on the button...
  • I guess we're just arguing semantics.
  • Why don't you build your own scanning tunneling microscope instead? I read some time ago about an 18-year-old student, Todd Gustavsson if I remember correctly, who built his own STM in his father's basement..

    Everybody could connect it to our Linux boxen and write some GPL software for it, and everybody could spend all night building nanobots ;)
  • by Graud ( 89155 )
    I just wanted to point out that one of the presenters was named "Richard Superfine," which... is a dreadfully awful pun.
    I guess his work was determined from the moment of birth...
  • This isn't a simulation. The force feedback is proportional to the force measurement at the tip of the microscope. The virus really does push back at the probe with X lbs of force (where X=0.00000...? ).

    The visual might be misleading, but they really are feeling viruses squish under the probe. This isn't a matter of confusing theory with reality, this is simply an amplified representation of a physical measurement: the force exerted by/on the probe.
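    That "amplified representation" can be sketched as a simple gain-and-clamp mapping; the gain and device-limit values below are invented for illustration, not UNC's actual parameters:

```python
# Sketch of amplifying a nanoscale force measurement up to something a
# human hand can feel, clamped to the haptic device's output limit.
# The gain and limit are made-up illustrative values.

def haptic_force(tip_force_nN, newtons_per_nN=1.0, device_max_N=8.0):
    """Map a tip force in nanonewtons to a device force in newtons.

    A gain of 1.0 N per nN amplifies the measured force a billionfold.
    """
    force_N = tip_force_nN * newtons_per_nN
    return max(-device_max_N, min(device_max_N, force_N))

rendered = haptic_force(2.0)   # a 2 nN push rendered as 2 N at the handle
```

    The clamp matters in practice: a spike in the measurement should saturate at the device's limit rather than slam the user's hand.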
  • by Anonymous Coward
    Are there magnetic (or other-technology) force-feedback devices with 3-DOF output and 6-DOF input? Please send references! Getting the haptics to work right over the Internet is far more of a bear than the graphics, and far more latency-sensitive. The best commercial devices I knew of were the PHANTOM ($10k) and its new Canadian competitors whose name I forget (>$20k?) - apparently they don't use cable drive on their comparable model, so it *is* a bit stiffer.

    (Tom, hudson@cs.unc.edu - we've run this over dedicated links and the Internet2 but I'd really like to get this working over the commodity Internet - not only for the coolness value, but because I could stick that "PhD" after my name, too)

  • A few years ago, when I worked at Discovery World [braintools.org] we ported the armlib [unc.edu] software from UNC to work on RT-Linux [nmt.edu] with the PHANToM [sensable.com]. Our work is available here [unc.edu] . It was pretty fun work while it lasted. Unfortunately, when I left, the project died. We did develop an extension of the UNC client/server protocol for remote manipulation - we called it TouchU-TouchME, but we never had a chance to present a paper on it. Oh well... I note on the UNC web site that someone has created a scanning-tunneling probe [staticip.cx] using Lego Mindstorms [legomindstorms.com]. This would be an ideal mate to the work that WillWare describes above.
