A Robot Learns To Fly
jerkychew writes: "For those of you who read my last post about the robot escaping its captors, there's more news regarding robots and AI. According to this Reuters article, scientists in Sweden created a robot that essentially 'learned to fly' in just three hours. The robot had no preprogrammed instructions on how to achieve lift; it had to deduce everything through trial and error. Very interesting stuff."
Oh great . . . (Score:4, Funny)
Not only do we have to watch out for bird crap raining down on us, we now have robot excrement to worry about as well.
Re:Oh great . . . (Score:2)
With a few genetic algorithms, perhaps it can be trained to make realistic bird-poop also.
Come to think of it, I vaguely remember a story about a robot garden slug-eater that could actually digest slugs for fuel. I am sure it had "byproducts". A little merger here, and voilà, your dream (or nightmare) comes true thanks to modern science.
Interesting, but... (Score:1)
It just seems to me like AI through logical progression, which I'd be tempted not to call AI...
Re:Interesting, but... (Score:3, Insightful)
Sounds like a neural net with real-time recalibration to me..
Re:Interesting, but... (Score:2)
The "pairing up new combinations" could be an anthromorphism.
Re:Interesting, but... (Score:2, Informative)
Somehow... (Score:5, Funny)
The fact that it "cheats" somehow restores my faith in robotkind....
-ajb
Re:Somehow... (Score:2, Funny)
Well.. (Score:5, Funny)
Well. Assuming the birds were TRYING to fly, knew what lift was, and already had the equipment (i.e. wings) to achieve this.
This brings an image of stupid birds sitting around flapping randomly thinking "FUCK - I'm SURE this should fucking WORK! - Bastards - OOps, I just fell over to the left - does that mean my right wing was flapped right???? - Hey - John! WHAT DID I DO THEN????"
Re:Well.. (Score:5, Funny)
Do you think it would have learned faster if they'd taken it up to the roof, and thrown it off?
"Hmm...my sensors indicate that I am falling at a rapid rate. Maybe I ought to do something about that. I'll try flapping this thing. Nope. How about together..that seems to be wor...."
-ajb
Re:Well.. (Score:3, Funny)
Re:Well.. (Score:2)
Chicken Run (Score:2)
Rocky: You see, flying takes three things: Hard work, perseverance and... hard work.
Fowler: You said "hard work" twice!
Rocky: That's because it takes twice as much work as perseverance.
Re:Well.. (Score:2)
I guess that the "Special Creation" theories no longer fly (ah-thankyou).
Seriously... it took _humans_ a pretty long time to figure out flight, heck, even gravity (and for some reason we want AI to be like us?).
While I'm amazed at anything that learns and isn't carbon-based, I wouldn't start comparing this to actual life. When robots actually take over, smelt metals to build more robots, and develop interstellar travel, you'll get a wow from me.
(BTW if this sort of thing scares you remember that the commies want to purify your precious bodily fluids!)
Re:Well.. (Score:3, Insightful)
If the scientists threw together a bunch of spare parts and watched as a robot magically constructed itself, decided a useful thing to do would be learning to fly, and then took off--well, that could be compared to millions of years of evolution. And you know what? It'd never happen. Not without some "divine" intervention on the part of the scientists.
Re:Well.. (Score:2)
Re:Well.. (Score:2)
Re:Well.. (Score:4, Insightful)
However, the robot could not actually fly because it was too heavy for its electrical motor.
This thing didn't even learn to fly, it just flapped its wings. And what kind of evolution did it go through? It didn't pass on different genetic information until a new trait emerged, forming a new race; it just flapped its wings.
Re:Well.. (Score:4, Insightful)
Re:Well.. (Score:3, Insightful)
Re:Well.. (Score:2, Informative)
Hmm (Score:2, Funny)
One small step for robot, one giant leap for robotkind
very interesting (Score:4, Interesting)
Re:very interesting (Score:5, Funny)
Re:very interesting (Score:2, Interesting)
Actually, I think research of this sort has gone a lot further in simulated environments than what these Swedes have done. The different thing about this research is that they have done it with an object in the physical world. This should please those who distrust simulation, but for the average
Re:very interesting (Score:2)
Re:very interesting (Score:2, Interesting)
Re:very interesting (Score:2)
I watch these RobotWars shows sometimes, and I'm always imagining something similar but with cooler weapons and AI instead. I am pretty sure it will come as soon as it's possible...
oh well.. (Score:2)
Re:oh well.. (Score:2)
Cool (Score:3, Funny)
I picture a robot aerobics class.. heh. But if anybody asks, I picture a robot boot camp.
Re:Cool (Score:3, Informative)
Read _The Practice Effect_ by David Brin. Sci-Fi. It's not a deep read, but entertaining. In an alternate universe where the laws of physics are different, the more you do something, the better you get at it. For instance, if you tie a stone to the tip of a stick and pound it against a tree, eventually the stick-stone will turn into a diamond-tipped axe.
It's a stretch, yes, but it's a fun read. You'll love it when the robot (from our world) reappears at the end of the book, after having 'practiced' what it was told to do, unseen, for most of the story.
Sensationalism (Score:5, Insightful)
Ridiculous to compare a prebuilt robot to the evolution from some dinosaur to a flying dinosaur (also known as a bird). This really is tabloid headlining at its purest.
And the robot didn't even fly, just generated some lift!
It's like saying humans can fly when they generate 1 N of lift by flapping their arms.
But it's great to see how self-learning robots and programs will start evolving now. I guess pretty soon computers and robots will be able to evolve faster on their own than when developed by humans.
Re:Sensationalism (Score:2)
Krister Wolff and Peter Nordin of Chalmers University of Technology built a robot with wings and then gave it random instructions through a computer at the rate of 20 per second.
So they built a robot, gave it sensors, then said, which works best, this? how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?how about this?
Big whoop. So they built a robot that can do a bubble sort.
Re:Sensationalism (Score:2)
Learning to fly? (Score:1)
Execute instruction
Lift higher than others?
YES - Remember this instruction
NO - Get next instruction
Repeat until no instructions
keep repeating successful instruction
Seems pretty basic to me and hardly learning; just a new spin on analysing the efficiency of algorithms.
Re:Learning to fly? (Score:3, Insightful)
Well, analysing the efficiency of algorithms and discarding the bad ones seems pretty much like "learning" to me.
Sure, humans aren't built to work efficiently with algorithms like robots do, but we learn from mistakes, which one could call "poor algorithms with an undesired result". Humans don't exactly choose randomly between ways to do things - we do things the way we succeeded at them earlier.
Re:Learning to fly? (Score:2)
Unfortunately, some do.
Re:Learning to fly? (Score:2)
This is how it starts (Score:2, Funny)
You're all doomed, I warned you!
I'll just get to packing my stuff, moving to a remote cabin in Montana and keeping a close eye on my refrigerator (I know it hates me, it keeps melting my ice cream).
article just bloats (Score:2)
Well, at least evolution succeeded in making birds that weren't too heavy for their own wings...
Seems to me that this project was not really as exciting as they would like us to believe...
Re:article just bloats (Score:2)
However, the robot could not actually fly because it was too heavy for its electrical motor.
It merely succeeded in figuring out the best series of motions to get maximum lift. In any case, that's all the robot had to do - try to fly. It didn't have to worry about predator avoidance, finding food, defending a territory, or mating.
Re: article just bloats (Score:2)
It's even worse. It didn't even try to learn to fly; it tried to find the best combination of movements that its creators thought was "flying".
See the difference? A real evolution would consist of a robot that was actually light enough to be able to fly. And then it could measure its own success by how much lift it got.
Since the different instruction lists that were fed to and tested inside this robot weren't checked against "how high will this let it fly" but only against "how close does this look to what we scientists know flying should look like", this experiment is rather worthless.
Maybe the evolutionary algorithm found a better way to fly with its wings than the standard way birds do, and it was just thrown away by the control program because it thought "this doesn't look like flying is supposed to look" - it was never tested in real life with a robot body that could have proved that these new movement combinations actually work.
Really a shame. Could have made a really interesting study.
Re: article just bloats (Score:2)
So how about we just simulate parts of it for now, ok?
Re:article just bloats (Score:2, Funny)
1) doing everything it can to lose weight so as to be able to do what society is asking of it.
2) after long anorexic periods jumping off a bridge, inventing the concept of gliding half way, slowly setting down on the water and then taking digital anti-depressants until it shuts down.
Re:article just bloats (Score:2)
Re:article just bloats (Score:3, Funny)
You can consider the poor bot some kind of turkey
Re:article just bloats (Score:2)
Apart from Penguins, Emus, Ostriches, Kakapos, Cassowaries, Kiwis, etc...
Impressive, but... (Score:5, Insightful)
The robot was physically equipped with all it needed to 'fly'; it was also equipped with all the wires in the right places. The fundamental difference between robots and living organisms is in the thinking: a newborn bird has to forge new synapses in its brain, whereas this robot was designed with the purpose of 'learning to fly' and so was given all the appropriate connections; it was just a matter of working out what sequence of events is required. Robots inherently have some form of co-ordination; birds, on the other hand, just like any other animal, have to develop such skills.
Re:Impressive, but... (Score:3, Informative)
I don't think you are quite correct here. Evolution has done wonders with the brain and pre-wired some instructions. For instance, birds learn very quickly how not to crash! And there must be some pre-wiring describing how to use air currents.
Re:Impressive, but... (Score:2)
Many "lower" animals are born with such knowledge required for their survival.
Re:Impressive, but... (Score:2)
Re:Impressive, but... (Score:2)
Don't forget that when birds hatch they still have a fair amount of physical development (muscle strength, bone strength, flight feathers) before they are physically capable of flying.
Re:Impressive, but... (Score:2)
They do need to learn to fine-tune their flying, however; every bird's body is going to be a bit different, of course.
Either important or a fancy press release (Score:2)
Re:Either important or a fancy press release (Score:2)
I found this myself: Krister Wolff was the other guy mentioned in the Reuters article; here's his homepage [chalmers.se]. It contains some interesting publications, like the one on Sensing and Direction in Locomotion Learning with a Random Morphology Robot [chalmers.se]. Worth reading!!
Imagine the Wright Brothers... (Score:5, Funny)
That would have been something to see.
The robot stands proudly on its wings and tells the scientists, "Look at me, I generated maximum lift, and I don't have to exert any force at all. Oh, and from here, I can see the mouse is climbing over the walls to get to the cheese without going through the maze. You humans are so stupid!"
Work on the cheating algorithm (Score:2, Funny)
Now the cheating - that is the interesting part. When they have the algorithm down so that the bot hobbles out the door and purchases a ticket at the airport, then they will have a winner.
flying robots you say? (Score:2, Informative)
Finally! (Score:5, Funny)
But after three hours the robot discovered a flapping technique
However, the robot could not actually fly because it was too heavy for its electrical motor.
"There's only so much that evolution can do," Bentley said.
Finally we understand the dodo's place in evolution.
Re:Finally! (Score:2)
Ph34r the Tae-Kwon-Dodo!
You don't need hardware to try this at home... (Score:5, Interesting)
Define a very simple stack-based language. The stack only holds boolean values; when empty it pops an endless supply of "false" values, and when full it discards pushes. Choose some control flow opcodes:
NOP, SKIP (pop, if true, skip ahead a fixed amount), REPEAT (pop, if true, skip back a fixed amount), NOT, RESET (clear stack, back to beginning)
and some opcodes related to your environment (mine was a rectangular arena):
GO (try to move forward one step, push boolean success), TURN (90 degrees clockwise), LOOK (push boolean "do I see food ahead?"), EAT (try to eat, push boolean success)
Pick a stack size (this has interesting consequences, as some of my organisms learned to count by filling the stack with TRUE values and consuming them until they hit the endless supply of FALSE when empty) and a code size. Force all organisms to end in your RESET op. Generate them randomly and run them in your simulator (I did 20-50 at once, letting each one run a few hundred instructions in a row). Evaluate fitness (in my case, how well fed they were) and breed them. You can combine the parent programs in lots of ways: randomly choose opcodes (or groups of opcodes) from each, possibly with reordering or shifting. Introduce some mutations.
Once you get something interesting, try to figure out how it works. This can be the hardest part -- my description above produced many variations that were only 8-10 instructions long before an unavoidable RESET opcode, and they could search a grid with obstacles for food!
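If you'd rather start from something concrete, here's a rough Python sketch of the idea (not my original code -- the arena size, stack depth, jump distance and GA settings here are arbitrary choices, and the simulator is deliberately bare-bones):

import random

OPS = ["NOP", "SKIP", "REPEAT", "NOT", "RESET", "GO", "TURN", "LOOK", "EAT"]
STACK_SIZE = 8   # arbitrary choice
CODE_SIZE = 16   # arbitrary choice; the last opcode is forced to RESET
ARENA = 10       # 10x10 arena
JUMP = 3         # fixed SKIP/REPEAT distance

def run(program, steps=300):
    # Run one organism in a fresh arena; fitness is how much food it ate.
    food = {(random.randrange(ARENA), random.randrange(ARENA)) for _ in range(15)}
    x, y, dx, dy = ARENA // 2, ARENA // 2, 0, -1
    stack, pc, eaten = [], 0, 0
    def pop():
        return stack.pop() if stack else False   # empty stack pops False
    def push(v):
        if len(stack) < STACK_SIZE:              # full stack discards pushes
            stack.append(v)
    for _ in range(steps):
        op = program[pc]
        if op == "NOT":
            push(not pop())
        elif op == "SKIP" and pop():
            pc = (pc + JUMP) % len(program)
        elif op == "REPEAT" and pop():
            pc = (pc - JUMP) % len(program)
        elif op == "RESET":
            stack, pc = [], 0                    # clear stack, back to start
            continue
        elif op == "TURN":
            dx, dy = -dy, dx                     # rotate 90 degrees clockwise
        elif op == "GO":
            nx, ny = x + dx, y + dy
            ok = 0 <= nx < ARENA and 0 <= ny < ARENA
            if ok:
                x, y = nx, ny
            push(ok)
        elif op == "LOOK":
            push((x + dx, y + dy) in food)       # food directly ahead?
        elif op == "EAT":
            ok = (x, y) in food
            if ok:
                food.discard((x, y))
                eaten += 1
            push(ok)
        pc = (pc + 1) % len(program)
    return eaten

def random_program():
    return [random.choice(OPS) for _ in range(CODE_SIZE - 1)] + ["RESET"]

def breed(a, b):
    cut = random.randrange(1, CODE_SIZE - 1)     # one-point crossover
    child = a[:cut] + b[cut:]
    if random.random() < 0.3:                    # occasional point mutation
        child[random.randrange(CODE_SIZE - 1)] = random.choice(OPS)
    return child

population = [random_program() for _ in range(40)]
for gen in range(50):
    population.sort(key=run, reverse=True)       # noisy fitness, but it works
    parents = population[:10]
    population = parents + [breed(random.choice(parents), random.choice(parents))
                            for _ in range(30)]
    print("generation", gen, "best fitness", run(parents[0]))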
Re:You don't need hardware to try this at home... (Score:2, Interesting)
Re:You don't need hardware to try this at home... (Score:2)
This program is sort of a hack, as I wrote it quickly. It's not neat OO code, but it does seem to work.
Crap. I can't even post it on Slashdot due to the lame lameness filter which the trolls seem to have no problem circumventing.
I couldn't post it on Slashdot, so I posted it in the Monastery. [perlmonks.org]
Re:Bogus publicity stunt (Score:2)
These guys probably had a lot of fun doing what they did, and I assume they are not up to MIT standards (not many are). Don't blame them for whatever a reporter might blurb out.
Just as impressive... (Score:2, Interesting)
MAIN
{
target = 72;
do
{
guess = rand();
}
while (guess != target);
print "GOT IT!"
}
NEWS HEADLINE:
Artificial Intelligence researcher creates computer program that comes up with the number 72.
Maximum lift != flying (Score:2, Interesting)
Well, I'm impressed even if you aren't (Score:2)
For example:
StarBot: How long before it learns to make a grande latte half-skim/half 30 weight?
Bouncebot: How long before it learns not to turn its back on the loud drunk in the corner?
Lobot: How long before it decides it really doesn't want to learn anything, just sit around and smile.
Congressbot: how long before it learns that working tirelessly for your constituency is its own reward, whereas lying for assorted interest groups is money in the bank? Note: This may be a special case of the Lobot.
But the real question is... (Score:2)
Robot Learns To Fly/Escape/etc. (Score:5, Funny)
Genetic Algorithms? Anybody? (Score:5, Informative)
What the researchers did was to build a robot that had wings and motors for manipulating them. These could be controlled by a computer. But instead of writing an explicit program telling the robot how to fly, they got the robot to learn how to fly. They did this using some sort of Genetic Algorithm [cmu.edu].
Basically, what a GA does is to generate a large population of possible solutions to the problem, then evaluate how good each one is (i.e. measure the lift each one creates in this example) and then to breed good solutions to create successive generations of possible solutions which are (hopefully) better than the previous generations.
Then, once some criterion is met (for example, once the average fitness of your population doesn't change much for several generations), you then select the best solution found so far as being your answer.
In mathematical terms, GAs are stochastic methods of optimising a function; they are typically used when solving the problem with an analytic method would be problematic (e.g. it would take too long).
So it's not really surprising the robot learned to 'fly' -- the researchers just managed to find an optimal sequence of instructions to send to the wings.
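In toy form, that search looks something like this (a rough Python sketch, not the researchers' code -- the command set, sequence length, GA settings and the stand-in fitness function are all invented for illustration; on the real robot the fitness number would come from the lift sensors):

import random

COMMANDS = list(range(8))   # assumption: 8 discrete wing-motor commands
SEQ_LEN = 20                # length of each candidate flapping sequence

def measured_lift(seq):
    # Toy stand-in fitness: rewards big alternating strokes.
    return sum(abs(a - b) for a, b in zip(seq, seq[1:]))

def breed(a, b, rate=0.05):
    cut = random.randrange(1, SEQ_LEN)              # one-point crossover
    child = a[:cut] + b[cut:]
    return [random.choice(COMMANDS) if random.random() < rate else c
            for c in child]                         # plus a little mutation

population = [[random.choice(COMMANDS) for _ in range(SEQ_LEN)]
              for _ in range(50)]
for generation in range(100):
    population.sort(key=measured_lift, reverse=True)
    parents = population[:10]                       # keep the fittest sequences
    population = parents + [breed(random.choice(parents), random.choice(parents))
                            for _ in range(40)]
best = max(population, key=measured_lift)
print("best flapping sequence:", best, "lift score:", measured_lift(best))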
The next step would be to get a robot to learn how to hover without the aid of the stabilising poles; then fly from one location to the other; then fly in a straight line in the presence of varying wind etc.
What the research does do is to lend credence to the argument that insects and birds could have evolved, rather than having been 'designed' by some sort of a God.
Re:Genetic Algorithms? Anybody? (Score:2, Insightful)
And if the robot were to have built its wings from available parts, would that not count? Even we humans learned to work around the limitations of our bodies.
Trial and error is an excellent learning tool; look at how much toddlers rely on it... I cry, I get food, etc.
Re:Genetic Algorithms? Anybody? (Score:2)
This is impressive because it demonstrates a particularly successful marriage between a design and a fitness function for the design.
Finally, re: evolution vs. intelligent design: who specified the fitness function?
Genetic Programming instead.. (Score:2)
There's a difference: GA is much easier to program than GP and is usually much faster. An example of a good candidate problem to solve with GA would be the travelling salesman problem, while GP would find a method to solve a problem.
While fundamentally pretty similar, GP is slightly more complicated. You have to deal with issues such as program bloat during evolution, which doesn't happen in GA.
http://www.genetic-programming.org/ [genetic-programming.org] is a good source to learn more about GP.
I took a class on GA and GP while in college, very interesting stuff.
Other search algorithms (Score:2)
In biological evolution, the figure of merit is approximately reproductive success. Evolution works by favoring genes that get themselves copied a lot.
Exhaustive search would be the obvious way to find a true global maximum in this sort of problem, but often the cost-per-tuple of evaluating the fitness function is non-trivial. In cryptographic key searches, the fitness function is a Dirac impulse which is one at the correct key and zero everywhere else, so near-misses don't help you to find the global maximum. (This isn't strictly true; many crypto algorithms have classes of weak keys, but that's a diversion for another time.)
Another partial search algorithm, aside from GAs, is simulated annealing. Yet another is the backpropagation algorithm used to find good sets of weights for neural nets.
As the crypto example illustrates, these partial search algorithms rely on gradients of the fitness function, where near-misses have higher fitnesses than wild-ass misses. One of the things one does when using a GA as a design tool is therefore to try to select fitness functions with gently sloping gradients throughout the space of solution tuples.
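For the curious, a bare-bones simulated annealing loop looks something like this in Python (a toy example with a made-up one-dimensional cost function; the cooling schedule and neighbourhood are arbitrary choices):

import math, random

def cost(x):
    # Made-up bumpy cost function, standing in for an expensive real evaluation.
    return x * x + 10 * math.sin(3 * x)

x = random.uniform(-10, 10)                 # start from a random solution
best = x
temperature = 10.0
while temperature > 1e-3:
    candidate = x + random.gauss(0, 1)      # propose a nearby solution
    delta = cost(candidate) - cost(x)
    # Always accept improvements; sometimes accept worse moves while the
    # temperature is high, so the search can escape local minima.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
        if cost(x) < cost(best):
            best = x
    temperature *= 0.999                    # slow cooling schedule
print("best x found:", best, "cost:", cost(best))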
Re:Other search algorithms (Score:2)
It's a Moore's Law thing. Some computations that were infeasible ten years ago are feasible today, but not all, and the range of which ones have become feasible is larger for the NSA than it is for your high school computer club. Sometimes you can design your hardware or software to exploit unique patterns in the problem domain, like the EFF DES cracker from a few years back. They got a lot of mileage by designing problem-specific chips that wouldn't have been available with general-purpose processors.
Re:Other search algorithms (Score:2)
But in some cases, these are legitimately quite difficult problems. Remember you're searching a many-dimensional space (R^n for n large) for the point that maximizes some function. The hassle is when evaluating the function is itself an expensive thing. Every tuple might represent a costly experiment.
For instance, suppose you're designing airplane wings. You've come up with a generalized wing design that has twelve parameters. You want the 12-tuple that gives the best wing. In the days before computers, you would have run experiments in a wind tunnel. But with a peak-Cold-War black budget, you couldn't have made 10^12 wing prototypes and tested them all. Nowadays we can skip the wind tunnel and simulate the aerodynamics of a wing on a computer, but it's still a non-trivial effort. If each 12-tuple involves one CPU-hour, those 10^12 experiments will still take 114 years on your million-processor parallel computer.
Evolution is a partial search algorithm for the genome that, within its environment, reproduces the most rapidly. Every individual genome is an experiment that involves the entire lifespan of at least one organism. If you're talking about giant Sequoia trees with multi-century lifespans, that's a slow process no matter what you do.
Throwing a lot of computation at these kinds of things is a good idea. I laud John Koza's effort [genetic-programming.com] in that direction. But even if we use the masses of Jupiter and Saturn to build networked petahertz nanocomputers, there will still be interesting problems for which exhaustive search remains infeasible.
i have a screensaver that's learning to walk... (Score:2)
this is especially cool, in that they've not only done this in a simulation, but with a real nuts and bolts 'bot. how easy it seems to me now to ship out robots with very little programming, but a quick learning curve. the owner puts the 'bot in its home, punches in a few things it would like the bot to do, and lets it explore a little. after some training, it's perfectly suited to its new job and new environment...
oh yeah. i grabbed a screensaver a while back from this guy [spiderland.org] that simulates a simple creature learning to walk. pretty spiffy, and you don't have to worry about it ambling off to the parking lot...
New thinking, no AI, is important here... (Score:2)
A system that evaluates the effectiveness of a solution and refines it from there, using real-world fitness, ends up including factors that an engineer wouldn't have thought of - perhaps because those variables are unknown, or seemingly insignificant.
A while back, you'll remember,
Man, I love this stuff.
Re: (Score:2)
Re:New thinking, no AI, is important here... (Score:2)
Yes, but look what I can do! (Score:2)
Birdwatchers around the world were SHOCKED at this finding.
However, the robot could not actually fly because it was too heavy for its electrical motor.
"There's only so much that evolution can do," Bentley said.
Using these definitions, this robot's achievement pales in comparison to my evolutionary process, which led my body to ingest a jelly donut covered in sprinkles this morning. Since I had never had one of those before, the only explanation is that I naturally evolved to the point where I wanted to attempt the ingestion of such an object.
Seriously, guys, this is nothing but cheap heat for a worthless techie. Move on.
Laugh sensor (Score:2)
reminds me of the work of Karl Sims (Score:2)
This article is interesting, however, because it moves the agents into the physical world where it isn't possible to obtain the same kind of idealized environments that are possible in silico.
Re:reminds me of the work of Karl Sims (Score:2)
mark my words (Score:2)
This story published as the preface to... (Score:2)
Electronic edition, of course.
Not Evolution of Flight (Score:2, Interesting)
1. Equipping a test subject with wings short-circuits the most intriguing part of the experiment.
2. Equipping a winged test subject with a motor too heavy for it to stay aloft is stupidity at work.
3. Thrust is not lift. Flight requires both, but this was thrust. The robot recreated 19th and 20th century flying machines. They didn't work either.
4. Horizontal stabilizers (the vertical rods) are not considered to have been available during the evolution of flight.
The test is intriguing, for sure. But to bill this as AI learned flight is either poor press coverage, or a scientist seeking funding through an uninformed press.
Learning to swim has been done (Score:2)
It's not really that hard to learn locomotion in a continuous environment, where you can improve a little bit at a time. Swimming and crawling were done years ago. Basically, you formulate the problem in terms of a measurement of how well you're doing, and provide a control algorithm with lots of scalar parameters that can be tweaked. You then apply a learning algorithm which tweaks them, trying to increase the success metric. Any of the algorithms for solving hill-climbing problems in moderately bumpy spaces (genetic algorithms, neural nets, simulated annealing, adaptive fuzzy control, etc.) will work.
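In its simplest form, that tweak-and-keep loop looks something like this (Python; the controller and its "distance travelled" score here are made-up stand-ins, not anyone's actual swimming code):

import random

def distance_travelled(params):
    # Stand-in for the real measurement: run the swimming/crawling controller
    # with these parameters in a simulator (or on hardware) and return how far
    # the robot got. Here it's just a made-up smooth function for illustration.
    return -sum((p - 0.1 * i) ** 2 for i, p in enumerate(params))

params = [random.uniform(-1, 1) for _ in range(12)]      # 12 tunable gait parameters
best = distance_travelled(params)
for step in range(2000):
    trial = [p + random.gauss(0, 0.05) for p in params]  # small random tweak
    score = distance_travelled(trial)
    if score > best:                                     # keep tweaks that help
        params, best = trial, score
print("tuned parameters:", [round(p, 2) for p in params])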
There are limits to how far you can go with this technique, and they're fairly low. Gaits that require balance, such as biped walking and running, require more powerful techniques. Any task that requires even a small amount of prediction needs a different approach. Rod Brooks at MIT has explored the no-prediction no-model approach to control thoroughly, so we have a good idea now what its limits are.
What about the rest of the bird? (Score:2)
If I succeed, do I get a slashdot story?
This story is a follow-up... (Score:2)
~Philly
Re:Not so... (Score:2)
Actually, the trick to landing is to let gravity pull you onto a surface.
As they say in the pilot-world: "Any landing you can walk away from is a good landing."
Re:Not so... (Score:2)
Re:Not so... (Score:2)
Re:What a load of bollox (Score:2)
Yes - as usual the tabloids exaggerate the truth. Their mistake this time was to compare it to the whole of *evolution*.
However, I still find the achievement quite impressive, since it was not given explicit instructions on how to overcome the obstacle to start with.
Re:Evolution??? (Score:3, Informative)
Likely a similar thing happened with dinosaurs turning into birds. The more webbed ones could jump farther and fall farther without getting hurt, and eventually one of them decided to flap its webs and they became wings. Feathers are just longer, more flexible scales that make flying even easier.
I do agree that what the robot in the article did was not evolution, it was learning. It wasn't even learning a particularly useful form of 'flying', either; it was attached to vertical poles!
Re:Very interesting (Score:2)
"Attacking healthy red blood cells is Bad."
"Cutting through your neck is Bad."
And so forth...
Re:Learning to fly by trial and error (Score:2)
Re:Douglas Adams (Score:2)
Re:"Flapping Wings" isn't the same as "Flying" (Score:2)
(* Come on... Learning how to flap your wings isn't the same thing as flying... you should change the name of the story to something "Robot learns how to flap its wings". *)
Better yet, "Robot learns to generate lift by flapping its wings".
You are right in that there is much more to flying than simply generating lift.
But the article stated that it couldn't fly because the motor was too heavy. There are bug-bots that *can* fly by flapping, IIRC. Perhaps the two teams should get together.