MIT AI Acts Childish on Purpose

garibald gave us the link to an article in the electronic Telegraph about researchers at MIT who have built an interactive AI robot called "Kismet" that is as cute as any George Lucas character, and is supposed to function on the emotional level of a human two-year-old. The cuteness is not gratuitous. As the article makes clear, there is a sound, scientific reason for it. (For pictures, and more technical depth than the Telegraph story, you may want to go directly to Kismet's Web Page.)
This discussion has been archived. No new comments can be posted.
  • by Anonymous Coward
    RFC, eh? - get some sleep.

    j/k

    Seriously, I don't know what is more unsettling:

    1. expending effort and resources to develop emotive responses in mechanical devices, or

    2. the image of programmers cooing at a robot that looks like a gremlin - "Daddy loves Kismet, does Kismet love daddy?"
  • by Anonymous Coward
    Saw the video, doesn't look too impressive.
    Seems that the algorithm is basically: see a moving object, get excited (or show some other expression), but look disgusted or fall asleep if it goes on for too long. Respond a little differently when a human face is detected. They blather on about 'learning' but I don't see any evidence of that either.
    Looks like a lot of hype for precious little substance.
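The loop the parent describes, such as it is, could be sketched as a simple stimulus-to-expression mapping. This is a guess at the structure from the videos, not Kismet's actual code; the sensor flags and threshold values are invented:

```python
# A sketch of the behavior loop described above, assuming hypothetical
# sensor flags and invented thresholds -- not the actual Kismet code.

def react(stimulus_present, face_detected, exposure_ticks,
          habituate_after=50, sleep_after=100):
    """Map simple visual stimuli to an expression label."""
    if not stimulus_present:
        return "sad"          # nothing to attend to
    if exposure_ticks > sleep_after:
        return "asleep"       # stimulation went on far too long
    if exposure_ticks > habituate_after:
        return "disgusted"    # habituated: too much of the same thing
    # respond a little differently when a human face is detected
    return "interested" if face_detected else "excited"
```

Whether anything this shallow counts as 'learning' is exactly the complaint above: nothing in the loop changes with experience.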
  • You're thinking on the same track they are, except they want skin to make it act, not just look, more human. Dr. Anne Foerst [mit.edu] is a post-doc at the lab working on skin. If you fish around the MIT AI site [mit.edu], you may find a bit about the work. Cog, Kismet's "older brother," has some fake skin on his stomach. Judging from what I've read, developing skin for the robots has proved very difficult.

    I saw Dr. Foerst speak, and she is amazing. She has a joint appointment with MIT AI and Harvard Divinity. Her work on skin has a theological basis from Genesis informed by modern studies of human development. Her exegesis of Genesis, based on the work of a 14th century rabbi, was something totally new to me. She is both a theologian and a computer scientist with formal education in both fields. This article [spiritualityhealth.com] includes a brief background.

  • You should know better than this:

    1) This has *no bearing whatsoever* on her work. And if she were a man, and that man were attractive, then maybe someone would have brought this up. Also, what bearing do this man's opinions have on the field of Computer Science? I don't agree with, say, all of RMS's views, or ESR's, but they're still great coders...

    2) Didn't look at her home page, but professional photos do tend to look better. Actually, that was one of the *intelligent* replies to this troll.

    3) I'm sure 'the woman' is used to it by now. It's not nice to call people ugly, but consider the source. If she hasn't learned by now not to listen to anonymous posters talking trash, perhaps it's time.

    This brings up another point: would we be viciously defending her if she wasn't a woman? Of course not. Grow up, gender matters in society, no matter how much you try to wish it away. Hopefully in the future people will view this in a more positive light, instead of being so hung up about it. Maybe this is America's Puritan roots at work for us, but I could do without it.
  • Cynthia could make a porn video and make people pay for it to raise money. I can see why that robot gets mad when she leaves :)
  • Obviously you have never had children. Babies are weird like that.
    --
  • So you're saying by expressing signals that appear to humans as reflective of internal emotions, Kismet is experiencing emotions?

    Spoofing is a term applied when a process transmits information to another process reflecting some status about itself that is, in fact, not reflective of the true state. For example, IP Spoofing transmits false IP Source Header information. Samba does not spoof SMB, though--it really speaks it.

    It's a bit of a grey line.

    In humans, it turns out that the mere expression of an emotional state is enough to generate some aspects of that state internally. On a personal note, this is why I spent Senior Year of High School smiling like a hyena. Acting happy created happiness--strange but true. But Kismet doesn't act out externally as an expression of internal emotional states, or vice versa--there *are* no emotions to reflect, not in the human sense of the word. It's a spoof.

    But try telling that to someone who is looking at a cute sad little doll that asks not to have its power disconnected.

    What was it that the Star Trek authors argued? That sentience required consciousness, self-awareness, and intelligence? Good values. Nice values. But the ability to convey emotions (and perhaps complex concepts) effectively is probably required for any creature to be recognized as sentient. Call it a bug in the Human OS if you need to. But it's true.

    --Dan

    Once you pull the pin, Mr. Grenade is no longer your friend.
  • by Effugas ( 2378 ) on Saturday July 17, 1999 @02:48AM (#1797968) Homepage
    Wow.

    Sometimes I think computer people Just Don't Get It.

    Then I see this.

    Now, a lot of us geeks might whine and moan. "Oh, it's so cute. Die Jar Jar Die!" "The future of technology lies with...Terminator Furby." Whatever.

    Outside of the cold, sterile world of gigabit routers and tab completion, a convincingly emotive device has been developed--artificially.

    The importance of this is not to be underestimated. Large chunks of government policy are designed to protect animals with emotive properties. As a classic comedy routine went, "Sometimes I think the animals just all got in a line. 'What are you?' 'I'm a seal.' 'You're cute, honk that horn, we'll make sure nobody beats you with a club. Now what are you?' 'Cow. Moo.' 'GET ON THE TRUCK! You're a baseball glove.'"

    Pet Rocks were quite the subversive satire on this maternal instinct.

    Tamagotchi began the trend, and Furbies proved the consumer attraction, but I think it's the progeny of Kismet that will whip up quite a media frenzy. Who wouldn't think that the media has been waiting to report that a machine built by man doesn't want to die? It's a pent up desire; one that will be released at the first credible moment.

    And along comes Kismet.

    The key to Kismet really is that it spoofs emotion. Think about it for a second. The MIT guys "sniffed" humans with cameras, copied the protocol stream down to the transitions from one emotive signal to another, and (most importantly) parsed enough of the incoming emotistream to generate a seemingly interactive experience.

    Shades of Eliza? Obviously. Eliza spoofed "Rogerian Psychology", where a person does nothing but ask the minimum number of questions needed to keep you talking. Eliza took some aspect of human psychology and looped it on itself to create meaning for the user at the least possible computational cost.
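The looping trick described above fits in a few lines. This is a toy illustration, not the real ELIZA (which used a much larger script of ranked rules); the patterns here are invented for the example:

```python
# A toy version of the Rogerian loop: reflect the user's words back as a
# question at minimal computational cost. The patterns are invented for
# illustration; the real ELIZA used a much larger script of ranked rules.

import re

RULES = [
    (re.compile(r"i am (.*)", re.I),   "Why do you say you are {}?"),
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {}?"),
    (re.compile(r".*", re.I),          "Can you tell me more about that?"),
]

def respond(line):
    """Return the first matching canned reflection."""
    for pattern, template in RULES:
        m = pattern.match(line.strip())
        if m:
            return template.format(*m.groups())
    return "Go on."
```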

    Give Eliza and Kismet a love child equipped with text-to-emotive-speech and speech-to-text, and toss in a degree of anger, rage, and fear if the robot "believes" 1) it is about to be deactivated (put a video sensor near the off switch) or 2) it is ignored for excessive periods of time, and a non-zero part of the population will believe it alive and as worth protecting as a cute baby seal.

    Is it just me, or is it scary how much Hackerthink (spoof, parse, etc.) fits so many different situations? If you can talk to a single person in biotech for more than twenty minutes about their job and not realize they're utter hackers, you aren't paying attention. The same applies to psychologists. How many psych papers read like a security bulletin? ("God refuses to patch. F1zRR has released Prozac 1.0 to compensate.")

    If Kismet ever goes mainstream, the psychologists are going to have a field day. The technological revolution eliminated the need for menial workers. Kismet, scarily enough, could make shallow friendships far more awkward. "We don't do much more than talk like Kismet."

    All this will go on until the media decides to flip public opinion around on its ear (thus making everybody tune in), saying "Are we nuts? This Kismet thing is NOTHING compared to humans! Deus Ex Humana!"

    That's just my thoughts. It's late. I'm tired. I have to get up in four hours. Joy.

    Send me comments. Or don't.

    Once you pull the pin, Mr. Grenade is no longer your friend.
  • I'd like to point out 3 things:

    1) I'm quite puzzled by what bearing this has on her work. If she were a man this wouldn't have been brought up. With guys like you I wonder about the sanity of any person wanting to go into Comp Sci or the like.

    2) If you look at the pictures on her home page you should note the poor quality of the photos and setting. She's not bad looking when she makes the effort. The same thing could be said for most of us.

    3) Have you ever considered this woman's feelings? Here she is finally getting some notice, and some Anonymous Coward starts ragging on her looks. Get a life, and a shred of human decency.

  • Share and enjoy
  • Well, don't worry about it too much. She's married. If you scroll down to the end of the MIT page and look at the publications, you'll see her last name changes sometime between 1996 and 1998. ;-)
  • Hiring a professional photographer can do that (note the credit to Sam Ogden at the beginning of the MIT page).

    That was one thing my senior year in high school taught me. See, at our school, the yearbook pictures for seniors would be shot by a professional photography studio, while all other grade levels would be shot in mass-production slaughter pens set up on campus.

    Because I was only in my second year in High School, but on track to graduate that year, I got my picture taken for both grade levels. My senior yearbook picture looks very handsome and mature. The sophomore picture looks childish. Both were of the same person (me) and taken during the same year.

    High School Prom also taught me the same thing. All of the girls who normally had "average" looks looked absolutely stunning at prom. The girls who normally looked stunning at school didn't look any better than normal and some were actually not quite as nice looking as normal -- too overdone.

    A little makeup, time in front of a mirror, and a few helpful hints from a friend who knows what they're doing can do wonders for your appearance.

    I've heard a couple of beers in the beholder can do that too... But I've never tried that one myself. ;-)
  • Hello!!! The PPro was not the main deal here. If an array of Ti DSPs isn't serious CPU throughput, I really would like to know what YOUR definition of throughput is ... :-)

    /dev
  • Perversion exists only in a cultural context. There is no cultural context for "where this is going", because automata that behave like humans have never existed before.

    Consequently, any value judgements that head-in-the-sand universal moralists might want to make as such developments unfold are just complete unmitigated rubbish.
  • > Automata that behave like humans in one way or another have existed hundreds of years

    Not in human culture, they haven't. It takes a large amount of public interplay before an idea germinates and becomes a cultural meme -- most ideas just wither and die without trace except in the minds of researchers. In the area we're discussing, there is one huge big ZERO of cultural heritage to draw upon as a basis for value judgements that have any applicability.

    > I don't see why you can't accept someone's current perception of automaton-human interaction to be valid, just because we have seen so little of it and it is not something society has got used to

    I can't accept it because apples aren't oranges, and the fact that you haven't seen many apples but have seen lots of oranges doesn't entitle you to judge apples to be bad oranges. You have to get used to apples before you can validly say whether they are good or not.

    Every single one of your arguments is laced with the desire to impose your current judgements on a new state of affairs, and there is no arguing with that, so I won't, because you won't entertain the notion that your previously acquired views might not apply.

    The closed mindsets of universal moralists are legion, and it's no surprise to find their perpetual arguing against the open-minded position of making no a-priori judgements even here. If you had been in control, there would be no free Internet because it dismisses most pre-Internet values as irrelevant. Sheesh, I'm glad you're not in control.
  • but the interaction of automata with humans is something that can and has to be judged within a human's value system as well.

    Humans can and will judge anything and everything, of course, but that doesn't mean that the judgement is valid. In this case it's not valid, because there is no cultural context in the automaton+human universe against which a system of values can be built.

    Judging something out of context is only for the brain-dead, just like judging that an apple is a very poor quality orange. Not having such a context available is not a valid excuse for applying a previous inappropriate context in its place. The logical approach is to let a new context develop naturally over time, lead where it may.

    But with 95% of humans happy to follow "moral leaders" like sheep, we're in for a field day of a-priori moralistic universalism, especially when nanotech begins to change the rules built up over millennia.
  • Kismet reminds me less of a gremlin and more of Hector from that 1970's science-fiction movie with Charlton Heston!
    ---
    seumas.com
  • I'm sorry if my post was misunderstood - it was only half-serious, and was referring to the sex bot article some time ago.

    ---
  • There is no cultural context for "where this is going", because automata that behave like humans have never existed before.

    There may be no such thing as perversion in the behaviour of automata by itself, but the interaction of automata with humans is something that can and has to be judged within a human's value system as well.

    Aside from that, I don't like "head-in-the-sand" universal apologists either, who would just as gratuitously apply their selfish value system to the development of weapons of mass destruction.

  • There is no cultural context for "where this is going", because automata that behave like humans have never existed before.

    Automata that behave like humans in one way or another have existed hundreds of years ago (do the names Wolfgang von Kempelen and Jacques de Vaucanson ring a bell?). The idea of social and even sexual interaction of automatons with humans is certainly not new (do you ever read Science Fiction books?).

    The cultural context you seem to have a fixation with develops as new things enter our culture, so it's somewhat useless as a measure of validity of a moral judgement. I don't see why you can't accept someone's current perception of automaton-human interaction to be valid, just because we have seen so little of it and it is not something society has got used to (that would seem to constitute a cultural context for you). Would you say that bestiality may only be considered immoral ("perverted") because it has been known for thousands of years?

    Arguing that moral judgement of interaction of humans with automatons (assuming for the sake of argument that it is relatively new, which it isn't) is not valid because someone improved the technical possibilities for it recently is like arguing that child pornography on the Internet can't be judged because it has only been around for a few years (and the adaptation of the "cultural context" of child abuse, paedophilia etc., which has been in existence for thousands of years as well, would be invalid according to your argumentation).

    But with 95% of humans happy to follow "moral leaders" like sheep, we're in for a field day of a-priori moralistic universalism, ...

    This would be a valid argument if these new developments were in no way part of our culture/universe and therefore inaccessible for our current value systems. Feel free to argue that this is the case, I just hope that your head is stuck deep enough in sand.

    Note that personally, I'm not calling interaction with automatons anything - I am just appalled by your narrow-mindedness and the way you criticise people with other value systems while apparently trying to force the idea that science must be free from the burden of ethics down everyone's throat.


  • Note that personally, I'm not calling interaction with automatons anything - I am just appalled by your narrow-mindedness and the way you criticise people with other value systems while apparently trying to force the idea that science must be free from the burden of ethics down everyone's throat.


    That's just rich. "Who are you to judge, perhaps my booted foot is comfortable on your neck. This iron fist gets awful cramped, it has to be flexed and pounded on people a few times. Shoving these ridiculous notions that I shouldn't control everything according to my principles of what I believe is best for you ... silly children, I know what right and wrong is, and you're better off and you darn well better not be forcing these silly notions on others that there's other ways to think."

    It's unfathomable how moralists can get away with claiming that their opposition forces their viewpoint of free inquiry onto others as some sort of oppressive yoke.
  • Why... yes. Sums up the definition of hypocrisy quite nicely, even if that word does get slung around far too often.
  • > The pictures of her with the robot caught my eye at first...

    I don't think you realize that a thread of comments on someone's appearance is not appropriate in this context. Having to put up with stuff like that can get really old, and discourage women from participating in the "community". In case you hadn't yet realized it, Mrs. Breazeal has damned impressive techie credentials. (And she probably has better aesthetic sense than you do, as well.)

  • I'm afraid that you weren't thinking very clearly when you posted, but I'm sure that you will find it helpful to closely re-read and thoughtfully contemplate the posts in question (especially #53 [slashdot.org] and #60 [slashdot.org]), consulting a dictionary if you encounter any unfamiliar terms.

  • I don't think it actually detects a toy. It seems to just detect motion, which is quite trivial.

    In June I visited the MIT AI Lab and had a chance to talk with Ms. Breazeal and Brian Scassellati, another doctoral student working on Kismet and Cog.

    Kismet detects both faces and motion, and reacts differently to each. Thus waving a toy with a face, say a teddy bear, will cause a different reaction than waving something without a face, say a Slinky.

    Steve M


  • I wonder if they have implemented a 'familiarity' algorithm in the robot,...

    I visited the MIT AI Lab in early June (I took a one week course -- it was fun). I asked Cynthia this question, and the answer is no, the robot does not remember. It identifies facial features and reacts accordingly.

    I believe she said it was on the to do list.

    Steve M
  • It kind of reminds me of the old adage that the more you learn about others, the more you learn about yourself. In this case, Kismet attempts to assign quantitative values to humans' emotional qualities and react to them appropriately.

    I am personally surprised that we have not made more inroads into this field by now. It seems that the converse has happened: we humans have adapted to computers more than we have made them adapt to us. I've written about this before on my website. [geocities.com]

    On the other hand, if computers displayed the emotional complexity of humans, we would not be able to trust them as much. Such is the paradox of reinventing ourselves.

    - Michael
  • Like in mimicry. I came across a factoid about a bird that would feed the carp in a pond; apparently the gaping mouth of the carp was enough to trigger the bird's 'mothering instinct'...
  • Oh fucking give me a break. Academia is, if you haven't noticed, a male dominated institution--meaning we weed the women out when they're younger. Moreover, who the fuck are you to tell us what is and is not appropriate discourse on slashdot? Are you honestly saying that if a Cindy Crawford lookalike came into your lab, the first thing you'd notice would be her curriculum vitae (no, her OTHER vitae, you letches)? Yeah, that's a real nice touch at the end, adding in your analysis of her aesthetic sense. What happened to her vaunted credentials, Einstein? Maybe they melted away in that hypocritical moment, as you waxed poetic about how she lights up a room.

    Obviously you DON'T work in academia, since anyone who does knows that it's ALWAYS important to package grant proposals in the best light, technical merits be damned.
  • This makes me worry that our first consumer-friendly robots are going to be like Marvin, the Paranoid Android.
  • That Cynthia is quite a babe. I'd like to have her try to make ME smile...
    ---
    Put Hemos through English 101!
  • Well, it's a nice toy, I guess. And the emotional displays are fairly realistic, considering that it is a bunch of wiring and metal rods.

    I watched one of the videos, but, like others here, I don't see much, if any, "learning" going on on the robot's part. If anything, the preprogrammed behavior of Kismet is training Cynthia not to wave the toys too wildly in its face.

    Kismet is programmed to manipulate. If you neglect it, it's sad. If you pay it enough attention, it's happy. If you overstimulate it, it gets cranky. I had a Tamagotchi that did the same.

    If the point of this is to gauge *human* responses to Kismet's algorithms, then it's doing fine, but it's a darn expensive way to be doing it.
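The manipulation loop described above can be sketched Tamagotchi-style: one "stimulation" drive pushed up by attention and decaying under neglect. The thresholds and rates are made up for illustration; this is not Kismet's actual control code:

```python
# A sketch of the attention-drive idea described above. All values are
# invented for illustration -- this is not Kismet's control code.

def mood(stimulation, low=20, high=80):
    """Translate a 0-100 stimulation level into an expression."""
    if stimulation < low:
        return "sad"        # neglected
    if stimulation > high:
        return "cranky"     # overstimulated
    return "happy"          # comfortably engaged

def step(stimulation, attended, decay=5, boost=15):
    """One tick: attention raises the drive, neglect lets it decay."""
    new = stimulation + (boost if attended else -decay)
    return max(0, min(100, new))    # clamp to the 0-100 range
```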

  • It reminds me more of a Mogwai robot that is naked (C3PO:why are saying that I am naked?)
  • As much as everyone is fawning over Kismet, Dr. Brooks all but admits that it is a hack when he lets slip that it exploits "human" programming. Much more impressive (and operating at a 5-year-old level, not a 2-year-old one) is CYC, which has been under development in Austin, Texas for about 10 years now by Dr. Douglas Lenat. See Cyc [cyc.com] here. It is based on Dr. Lenat's work going back to the 80's to build a computer that can learn and reason. Dr. Lenat previously built AM and Eurisko. All /.ers should have at least some familiarity with his work.
  • Not head-in-the-sand.

    Head on the tabletop.

  • Did you notice on the Kismet page that you can participate in the experiment [spiritualityhealth.com] by providing feedback as to what various facial expressions mean?

    Personally, I'm going to arrange some Kismet faces to make a "How Do You Feel Today?" montage...


  • Great idea, but DAMN! that thing's ugly. Looks like a cross between a Furby and Short Circuit's Johnny 5.
  • Looks like pretty antiquated hardware on the PC side, I saw a PPro processor and the interface uses ISA slots. On the movs, the reactions looked lagged, coulda been the mov format though. You need some serious processing power to handle true fuzzy logic, maybe a hardware upgrade is in order? :)

    I wonder if they have implemented a 'familiarity' algorithm in the robot, one that uses recognition technology to "remember" facial and structural contexts, so that Cynthia could conceivably be perceived as a 'mother figure', whereas another unfamiliar person in the field of view could possibly cause a negative reaction without any overstimulation. Maybe that's next.

    I remember seeing on some cheesy PC tv show there were some programs floating about that used facial recognition with a camera to verify identity, maybe this could be incorporated in the experiment to make the robot's reactions more 'human'.

    doc.
  • Yep, Cynthia could probably make a furby smile! :)

    They need to check kismet with Janet Reno, to see if he's really just reacting to faces!
  • I believe that nature evolved emotions before consciousness, and that there have been (and almost certainly still are) animals that have emotions but little if any conscious experience of them. Remember that the evolutionary/adaptive power of an emotion is that it makes you react well in a given situation (e.g. fear -> fight/flight, etc). Kismet is like a primitive animal that has evolved to have emotions that benefit him, but has not evolved the brain architecture necessary for consciousness and hence the introspective awareness of his own emotions.

    Kismet is not spoofing emotion - he is exhibiting emotion through his resultant behavior. There is nothing fake about this, or fundamentally different from the way emotions work in ourselves. If a human observer anthropomorphises Kismet and assumes he is experiencing anything, then that is a human shortcoming, and not a reflection on the reality of Kismet's emotions (internal state).

    Ben
  • I disagree that emotion implies experience (i.e. consciousness). From www.dictionary.com: "disturbance or agitation of mind caused by a specific exciting cause and manifested by some sensible effect on the body". If an animal encounters an unusual and/or rapidly changing stimulus (say I jump out in front of you!), then this will cause adrenaline to be produced, which will have a certain effect on its mind and body and short term behaviour. In another situation (say the sensations of grooming), endorphins might be released, which will have the effect of temporarily reducing the animal's aggression level, and making it more apathetic.

    The other poster in this thread brought up the example of a bird feeding a gaping-mouthed carp due to a hard-wired response. My Canadian fiancee saw a moose in their backyard trying to mate with a PLASTIC deer! Things like this make it hard to deny that much behaviour is simply a hard-wired response to certain stimuli. I guess you could make the distinction between emotions (biochemical) and hard-wired neural behaviours, but obviously a robot like Kismet could easily parallel either via global state that is modified by stimuli and affects his behavior (as they have implemented), or via more specific behaviours such as wide eyed and slack jawed as a response to girls like Cynthia (not yet implemented).

    Incidentally, when I said that emotions make animals "react well", I didn't mean that it makes them react optimally all the time, just that it makes them react in ways that are statistically good for the species (i.e. that statistically have survival benefit).

    FWIW, my own hobbyist "artificial animal" research is based on evolution, neurology and child development, rather than AI. Cog/Kismet are more along the right direction than projects like Cyc, but nonetheless still rather misguided.

    Ben
  • Emotions are a function of one of the oldest parts of the brain, and almost certainly are hard-wired evolutionary adaptive "knee-jerk" responses to external stimuli.

    Would it be any more fake if Kismet was programmed to find "baby faces" (big eyes, rounded face, etc) cute, than humans finding Kismet cute?

    This isn't spoofing. This is the way emotions work.

    Ben
  • I suppose that means it won't be long before they train it to post to Slashdot.
  • In the area we're discussing, there is one huge big ZERO of cultural heritage to draw upon as a basis for value judgements that have any applicability.

    I want to make sure I understand your argument: you have a problem with saying that human intercourse with a robot is perverted because we have no experience to use to make this judgement. The problem with this argument is that, although we have never had robots capable of sexually stimulating humans, we have plenty of experience with sexual acts other than human+human. Any of these acts could be called perverted, but the word "perverted" is a very subjective term.

    We have invented an endless list of mechanical devices, the sole purpose of which is sexual stimulation. Porn shops litter the highway selling these devices as well as video tapes of sexual acts. We have an online army of porn peddlers that, combined, probably out-grosses many countries. You can even buy a life sized doll made out of silicone that has an artificial internal skeleton (Check it out) [realdoll.com]. We have personally witnessed the way in which technology has evolved sexuality, and now you say that we can not use that experience to judge what seems like the next step in its evolution, particularly one that seems imminent?

    As a species we do this all the time: use past experience to predict and judge the future. We are doing it right now in relation to the cloning debate. If we can not use history as a guide for the future, then we are lost. Note that I said use history as a guide. No one is saying sex with robots is perverted because sex with animals is perverted. We are using reason, logic, history, and personal experience to make a judgement on a scenario. The fact that we have never actually been in that scenario does not make the judgement wrong, it simply means we have to be more careful in making it.


    -
  • Am I the only one who thought Kismet looked like he could be Crow's ancestor (of mst3k fame :)
  • The hard part is still how the robot learns and applies its knowledge. Are they making robots or the next generation of Furbys? ;)
  • Good point. What I was *trying* to say is "Are they building useful robot learning technology -- or toys?"
  • Which context are you talking about? The original comment was about the fact that someone found Ms. Breazeal attractive. In that context, talking about someone's appearance is perfectly suitable. It isn't as if he said "her research is crap because i don't think she's pretty".

  • Primitive? It's based on a parallel array of signal processors. The PentiumPro is probably little more than a glorified serial chip in this setup, and as for ISA... a lot of scientific HW is still designed around it.
    Guess they coulda put a PIII in Mr Furby here but then again, maybe they wanted to be able to run their robot on less than 50 W of power...
    Antiquated.. hah.
  • You're dissing BEAM? You're a dork. Just because it doesn't have a series of 6 parallel Alphas linked by a superconducting 10GHz system bus and run a mini-OS containing 16.5 billion instructions doesn't mean that the system isn't elegant and functional. Mark's stuff is nothing short of incredible.

    Tilden's assertions about the number of computing units needed to locomote and adapt to the environment seem right on the mark. Have you ever seen or, better yet, built one of these things? It's awesome. I spent 3 years as a hobbyist trying to get half of the robustness that a Tilden design displays. My BEAM droid wandered my house unattended for two weeks, charging itself when needed and avoiding death by any number of means. I even lost it for a couple of hours once. It had made its way under my bed.

    Jeezuz, you break off one of their legs and they can keep on walking with a modified gait! Add in some simple (or not so simple) sensory response to the core neural cascade and you have a VERY insect-like system.. far more "intelligent" than even a Linux machine.

    Brooks and Tilden are probably both right.. if it was a question of programming or processing power, we should HAVE at least a marginally intelligent robot by now, but we DON'T. Intelligence is shaping up to be the result of layers upon layers of communication, potentiation, and redundancy rather than a single processing unit committed to finding the "best" solution to any given problem. Brooks is going after intelligence and Tilden is going after animal behavior. Someday their love child is going to be reading this thread in the late 90's web archives and laughing its ass off.

    You have to know this.. get a disk block out of whack on your super-fast UNIX box and what happens? At least.. error and at worst.. crash. The box will ask you what to do. It's dumb. It's a glorified scientific calculator. And so is mine. But Otis (my dearly departed BEAM droid) was "smart".
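    For readers who haven't met the Brooks side of this argument: his subsumption architecture stacks simple behavior layers so that higher-priority layers override (subsume) lower ones, with no central planner. A minimal toy sketch in Python — the layer names, sensor keys, and thresholds are all made up for illustration, not taken from any project mentioned in this thread:

    ```python
    # Toy Brooks-style subsumption control: independent behavior layers,
    # checked highest-priority first; a layer returning None defers downward.

    def wander(sensors):
        # Lowest layer: always proposes a default action.
        return "move-forward"

    def avoid(sensors):
        # Middle layer: fires only when an obstacle is near.
        if sensors.get("obstacle_cm", 999) < 10:
            return "turn-away"
        return None  # defer to lower layers

    def recharge(sensors):
        # Highest layer: fires when the battery is low.
        if sensors.get("battery", 1.0) < 0.2:
            return "seek-charger"
        return None

    LAYERS = [recharge, avoid, wander]  # highest priority first

    def act(sensors):
        for layer in LAYERS:
            action = layer(sensors)
            if action is not None:
                return action

    print(act({"battery": 0.1, "obstacle_cm": 5}))  # seek-charger (recharge subsumes avoid)
    print(act({"obstacle_cm": 5}))                  # turn-away
    print(act({}))                                  # move-forward
    ```

    The point of the design is the one made above: competence comes from the layering itself, and losing one layer degrades behavior gracefully instead of crashing the whole controller.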
  • You have any idea how much work it is to detect both a human face AND a toy face? You're missing the point.. maybe this wasn't hard. Maybe the author whipped this thing up in a couple months.. in that case, it's completely and utterly amazing that it acts the way it does.
  • Yeah, that was sorta funny...

    (If you're an antisocial retard who's never even talked to a woman)

    It's going to be a long, frustrating life if you can never relate to women any better than that. Have fun spending your Saturday nights DIH, my boy.
  • Linus Torvalds looks like a dork. I'd figured he was at least some skanky, pierced cyberpunk type based on all the hype he's generated but he looks like a home economics teacher. I guess that invalidates his life's work, too...
  • I read both the article and the MIT page, and I still don't have a clear idea of what exactly is so revolutionary/interesting about this thing. It says that it acts "cute" when it's receiving insufficient "attention"...did it learn what "facial expressions" got it more "attention" itself, or is it preprogrammed to assume a specific facial configuration when it's "lonely"?

    Unless it learned how to be "cute" itself, it's just an AIBO. Why would MIT be pushing this when they've almost certainly got more significant projects running?

  • The lines I quoted below are funny yet profound.
    I don't think we understand our own emotions, let alone how to emulate them. I'll go back to the story and see if they have psychology types working with them. I won't eat any commercial meat because of the methods used to raise and harvest it. The animals probably are not healthy to be eating, but even if they are.. they are tortured long before they get the shot to the head. The transportation alone is inhumane. Yet.. I hunt and will eat what I kill myself or what was killed by someone I know. I know the creature wasn't tortured before it was killed and can be somewhat assured of its health and condition. Before you aim the flame throwers.. you are all invited anytime after Jan 1 2000 for some FRESHMEAT on the table... I'll make it look like it came from the grocery store.
    below is quoted //////////////////////////
    The importance of this is not to be underestimated. Large chunks of government policy are designed to protect animals with emotive properties. As a classic comedy routine went, "Sometimes I think the animals just all got in a line. 'What are you?' 'I'm a seal.' 'You're cute, honk that horn, we'll make sure nobody beats you with a club. Now what are you?' 'Cow. Moo.' 'GET ON THE TRUCK! You're a baseball glove.'"
  • I think it's a good excuse - I didn't talk until I was three - I was talking before I was walking. (I was a slow learner.)

    My parents tell me I didn't say a word until I was three, and then I started with full sentences. Before that I merely imitated the sounds of road vehicles. :)

    But then again, I can't see a digital talking ape either. :)
  • No, I should run around hiding my identity because I'm afraid that my nickname will be associated with me and everyone will be out to get me.

    ACs.
  • Yeah really. The AI looks to be even simpler than what is in that Jurassic Park: Lost World for the PC. In that game, dinosaurs balance danger to self, hunger, line-of-sight, territory, and probably some others that I forgot. Not that even that algorithm can come close to any animal behavior -- even those operating purely on instinct.
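    The drive-balancing described above can be sketched as a simple utility comparison: each competing drive scores the current situation and the highest score wins. A hypothetical toy in Python — the state keys and weights are invented for illustration, not taken from the actual game:

    ```python
    def pick_behavior(state):
        # Score each competing drive; names and weights are purely illustrative.
        scores = {
            "flee":   state["danger"],                           # danger to self
            "hunt":   state["hunger"] * state["prey_in_sight"],  # hunger gated by line-of-sight
            "patrol": 0.2,                                       # weak default: hold territory
        }
        return max(scores, key=scores.get)

    # A dinosaur that is starving but in great danger flees first:
    print(pick_behavior({"danger": 0.9, "hunger": 0.8, "prey_in_sight": 1}))  # flee
    print(pick_behavior({"danger": 0.1, "hunger": 0.8, "prey_in_sight": 1}))  # hunt
    print(pick_behavior({"danger": 0.1, "hunger": 0.8, "prey_in_sight": 0}))  # patrol
    ```

    Even this crude winner-take-all scheme produces the "balancing" behavior the poster describes, which is part of why it can look convincing in a game without coming anywhere near real animal behavior.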
  • They are making robots. Those Furbies are examples of robots. Did you think (mistakenly) that robots are defined as sentient beings?
  • > She's not hot

    1) You are in no position to tell that.
    2) You're wrong, she is. (They all are, except mom :-])

    > Don't you hate it when you see a girl from behind and you think she's beautiful, only to have her turn around and reveal that she's ugly?

    You are calling her ugly? Well, that is impolite, bordering on rudeness.

    And that does bother me, because there are not that many female techies around, so we should treat them politely. Traitor you are.

    -Cic

    P.S.: This never happens to me. When they turn around, they always are in fact beautiful. Maybe you don't manage to make them smile at you?
  • Of course, I'd be prepared to fight that poster in a duel of Quake II to defend the lady's honour.

    If only he'd given his name...

    Of course, that would be more knightly than geeky, and perhaps the lady would take more satisfaction out of it if she fragged him herself.

    (Repeatedly. With a blaster. From behind. With eyes shut and one hand bound behind the back :-])

    But still, the offer stands. (En garde!)
