Digital Biology
author | Peter J. Bentley
pages | 272
publisher | Simon & Schuster
rating | 7.5
reviewer | Peter Wayner
ISBN | 0-7432-0447-6
summary | Does a good job of bridging the analogical gap between the worlds of computers and biology; may not be deep, but will probably enlighten readers with an interest in either or both of these fields.
It should come as no surprise that the infatuation is requited, because some biologists are just as fascinated with the bits that live in computers. They love to wonder whether the software crosses the line and becomes a sentient being, whatever that may be. They want to know whether a programmer can play Dr. Frankenstein and create life, or at least an indistinguishable imitation. They are entranced with the computer's ability to boil vast amounts of data down into a coherent answer, and they want to harness this power to solve problems about truly organic creatures.
Peter J. Bentley's new book, Digital Biology, is a lively tour through some of the research that joins both of these worlds. It's a quickly paced, colorful examination of how computer scientists and biologists can share metaphors like "the immune system" or "growth." If both groups sit down and compare metaphors, computer scientists may learn something about building robust, self-healing, self-reproducing software from looking at carbon-based creatures, while biologists will learn something about creatures by studying them with silicon-based software.
The book is aimed at the same market that embraced the meme of "Chaos" through reading James Gleick's book. It is light on equations and heavy on showmanship. In many cases, this is more than satisfying. One passage on digital flocks of birds explains how three simple rules can keep the birds floating and swarming with all of the coordinated rolling and swooping. There's no need to invoke numbers or distance measurements to convey what's happening.
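(For the curious, the three rules are separation, alignment, and cohesion, and a toy version fits in a page of code. The sketch below is a generic Reynolds-style boids step in Python; the weights and neighborhood radius are arbitrary illustrative choices, not numbers from the book.)

    # Minimal sketch of Reynolds-style boids: each bird steers by three local
    # rules (separation, alignment, cohesion). Weights and radius are
    # arbitrary illustrative values, not taken from the book.
    import numpy as np

    def boids_step(pos, vel, radius=1.0, w_sep=0.05, w_ali=0.05, w_coh=0.01, dt=0.1):
        """Advance all birds one time step; pos and vel are (N, 2) arrays."""
        new_vel = vel.copy()
        for i in range(len(pos)):
            offsets = pos - pos[i]
            dists = np.linalg.norm(offsets, axis=1)
            neighbors = (dists < radius) & (dists > 0)
            if not neighbors.any():
                continue
            # Rule 1: separation -- steer away from birds that are too close.
            sep = -offsets[neighbors].sum(axis=0)
            # Rule 2: alignment -- match the average heading of neighbors.
            ali = vel[neighbors].mean(axis=0) - vel[i]
            # Rule 3: cohesion -- drift toward the neighbors' center of mass.
            coh = pos[neighbors].mean(axis=0) - pos[i]
            new_vel[i] += w_sep * sep + w_ali * ali + w_coh * coh
        return pos + new_vel * dt, new_vel

    # Example: 50 birds with random positions and velocities.
    rng = np.random.default_rng(0)
    pos, vel = rng.uniform(-5, 5, (50, 2)), rng.uniform(-1, 1, (50, 2))
    for _ in range(100):
        pos, vel = boids_step(pos, vel)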
At other times, the examples can be so tantalizing that the lack of depth is a bit frustrating. Bentley promises, "The number of different applications that we have successfully used evolution for is immense." To illustrate, he offers the example of a coffee table designed by a computer program that mixes, matches, and cross-breeds varieties. After each generation, the computer cuts some tables apart, creates new combinations, and then uses an equation to find the most fit and desirable tables. Eventually, a reasonable candidate emerges. But after explaining that genetic algorithms may find patterns of credit card fraud and help us design better jet turbine blades, he has no space to tell us the finer details. We do learn that stunning results can emerge when computer programmers mix the three principles of inheritance, variation and selection. But no book can include everything.
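(To make those three principles concrete, here is a minimal genetic-algorithm loop in Python. The "table" genome of four leg lengths plus a top size, and the fitness function, are made-up stand-ins for illustration, not Bentley's actual furniture-evolving system.)

    # Minimal genetic-algorithm sketch of inheritance, variation, and selection.
    # The "table" genome (four leg lengths plus a top size) and the fitness
    # function are made-up stand-ins, not Bentley's actual coffee-table system.
    import random

    def fitness(table):
        legs, top = table[:4], table[4]
        wobble = max(legs) - min(legs)          # uneven legs are undesirable
        return top - wobble                     # prefer big, level tables

    def evolve(pop_size=30, genes=5, generations=50):
        pop = [[random.uniform(0.5, 2.0) for _ in range(genes)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            survivors = pop[:pop_size // 2]               # selection
            children = []
            while len(survivors) + len(children) < pop_size:
                mom, dad = random.sample(survivors, 2)
                cut = random.randrange(1, genes)
                child = mom[:cut] + dad[cut:]             # inheritance (crossover)
                if random.random() < 0.2:                 # variation (mutation)
                    i = random.randrange(genes)
                    child[i] += random.gauss(0, 0.1)
                children.append(child)
            pop = survivors + children
        return max(pop, key=fitness)

    print(evolve())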
While the book is aimed at a broad market, it does not come with many of the traditional flourishes of journalism. Bentley is a research fellow at University College London, not a newspaper hack who churns out stories for a living. So when he introduces other researchers and colleagues, he doesn't bother dressing them up with details about their homes, their wives, or the usual chestnuts journalists offer in the hope of humanizing their subjects. The book focuses on the ideas and metaphors themselves and skips the window dressing. The names are just incidental markers to give credit and a pointer for further research. Scientists will love the lack of distraction, but casual readers looking for colorful anecdotes about wacky geniuses in labcoats will need to look elsewhere.
The book, as expected, is generally enthusiastic and heavily invested in the field. Software modeled on biological systems, we are told, will "detect crime for us, identify faults, ... design new products for us, create art, and compose music."
Despite this partisan flavor, the book shines in the few paragraphs where Bentley pauses to discuss some of the limitations of the systems. "We cannot prove that evolution will find us a good solution -- but it almost invariably does. And we certainly cannot predict the solutions that evolution generates," he notes as a caveat to everyone planning to use genetic programming to solve world peace.
At one point, he discusses one of the principal criticisms of the entire area. After describing flourishing digital forests filled with fractal ferns, problem-solving viruses, and swooping swarms of evolving birds and insects, he pauses and offers this quote from another biologist: "Where's the experiment?" He notes that most of these creatures are flights of our imagination, untested in the lab against real ferns, viruses or birds. Nor is there any real way to test a fern hypothesis. The digital versions look real, but there's little gritty lab work to establish them as true metaphors for sussing out the secret laws of nature. Is looking real enough? Can you measure verisimilitude? Do any traditional experiments measure anything better than the quality of a simulacrum? Is appearance enough, or is it only skin deep? After a pause, though, the book is on to more talk of big payoffs and grand promises.
At its heart, the book is a document of the evolution of problem-solving techniques. If you want the sales pitch from the computational biology world, you can turn to this book. When there were no machines, scientists used symbols, algebra, calculus and other mathematics to describe the world. Biologists have long employed differential equations to describe the booms and busts in ecologies of predators and prey. Now that we have computers capable of billions of operations a second, we don't need the old school of mathematics to provide a closed-form solution. The computers can simply simulate the world itself. There's no need to struggle for a set of equations that is both easy to solve and appropriate. We can just use little worlds of sim creatures, sim fronds, sim viruses, and sim antibodies.
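(A concrete example of that old-school/new-school shift: the classic Lotka-Volterra predator-prey equations can be studied analytically, or you can just step them forward numerically, which is all a little sim world really is. The parameter values in this Python sketch are arbitrary, chosen only to show the shape of the idea.)

    # Sketch of the predator-prey idea: the classic Lotka-Volterra equations
    #   dx/dt = a*x - b*x*y   (prey)
    #   dy/dt = d*x*y - c*y   (predators)
    # stepped forward with crude forward Euler instead of solved in closed
    # form. Parameter values are arbitrary illustrative choices.
    a, b, c, d = 1.0, 0.1, 1.5, 0.075
    x, y, dt = 10.0, 5.0, 0.001          # initial prey, predators, time step

    history = []
    for step in range(100000):
        dx = (a * x - b * x * y) * dt
        dy = (d * x * y - c * y) * dt
        x, y = x + dx, y + dy
        if step % 10000 == 0:
            history.append((round(x, 2), round(y, 2)))

    print(history)                        # the familiar boom-and-bust cycles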
Bentley's book is an ideal way to learn just how and why some biologists are absolutely enraptured with the new powers discovered by these computer simulations of genetics, growth, flocking and other natural phenomena. These models don't offer the kind of concrete certainty of mathematical models, but there's no denying that something is somehow there. Is it as much a breakthrough as Bentley believes? Well, maybe you, the reader, can create a genetic experiment to cross-fertilize the ideas from the book with the ideas in your experience. After a few generations of thought, perhaps a few generations of beer, an answer might evolve.
Peter Wayner is the author of Free for All, a book on the open source software movement, and Disappearing Cryptography, the second edition of a book on steganography expected to appear later this spring. He is also the author of several articles on simulation, including studies of the relationship between sex and AIDS, segregation, and the length of baseball games. (Each of these links includes a Java applet so you can run the simulator from your browser.) You can purchase Digital Biology from Fatbrain. Want to see your own review here? Just read the book review guidelines, then use Slashdot's handy submission form.
Digital? (Score:1)
yeah right (Score:1)
1-0-1
Re:yeah right (Score:1)
Warning (Score:1, Flamebait)
Re:Warning (Score:1)
Re:Warning (Score:1)
Re:Warning (Score:3, Insightful)
Unless, say, an Apple IIe in the wild mated and birthed a mutant Apple IIgs, which due to advantages in the environment lived to mate more and more with other machines, that's not evolution. Just because something is advancing doesn't mean it's "evolution".
Re:Warning (Score:1)
But that is exactly what it means. Not necessarily evolution due to natural selection, but still evolution.
Re:Warning (Score:1)
Re:Warning (Score:1)
Re:Warning (Score:2)
It's not biological evolution. Then again, that ought to be pretty obvious, since it's not biology.
I think it is safe to say that the Apple IIgs had advantages in the business, economic, and academic environments, which enabled it to survive (for a time) - while the Apple IIe went extinct (at least, extinct in the sense that no more are being made).
Sure, you can be a purist and say this has nothing to do with evolution, but this discussion is *all about* drawing analogies between biology and computing.
Re:Warning (Score:1)
The opinions expressed above do not represent the opinions of the author.
Re:Warning (Score:3, Interesting)
Indeed. For those familiar with Artificial Intelligence, Genetic Algorithms and Genetic Programming, this should already be familiar... but to enlighten the rest:
When talking about AI you have to make a differentiation between the "body" and the "brain". In a computer simulation you can say that the simulation environment is the body and the "genome" (genotype) is the brain. Intelligence does not lie in either, but in the cooperation between them. Rather simple... how much is your brain worth without eyes, arms, legs, nerves, etc.? And the other way around: what do you do with your body if you cannot process the data?
So, back to the comment above: most people tend to say that humans evolve through the brain, while it is more accurate to say that the brain and the body coevolve. The same goes for computers: software and hardware coevolve, each getting better and matching the other.
The interesting part here is that if we can understand the body (not that hard, just molecular biology and stuff), then we have one of the two keys to human intelligence. That is why the interaction of biology and computers is so interesting: if we can simulate biology (the body) on a computer, then we have an increased ability to learn about the brain. A few months ago, an Israeli company successfully performed calculations on human cells. This is the reverse approach, using biology as the "body" and computer algorithms as the "brain". Very interesting results, and very promising. This generation or the next should have a fairly good chance of screwing up this planet totally
Re:Warning (Score:1)
Please do not feed the hypocrites (Score:3, Insightful)
People with open minds may want to avoid this book
Re:Please do not feed the hypocrites (Score:2)
How precisely is belief in evolution 'damaging'? Compared to, say, war, plague, famine, injustice, murder, disloyalty, dishonesty, etc., from matters great to little, I struggle to see how I am damaged by my belief (as a best working hypothesis) in evolution, or how my belief in evolution could damage others. Even if you disagree with it, I'd suggest that you seriously need a sense of proportion. (This from a man who's posting on Slashdot when he should be writing a 2000-word essay on comparative genomics...)
Boiling Frogs (Score:2, Insightful)
Saying that the frog jumps out immediately from the boiling water assumes: a) the water is sufficiently shallow for the frog to push against the bottom of the pot (I don't know if the force exerted against the water would be enough to propel a frog out of a pot); b) that the difference in height between the water level and the top of the pot is small enough for said amphibious hopper to get out; c) that being submerged in boiling water does not immediately disable the frog's jumping capacities.
I've fried crickets before (yes, I eat strange things), and when you toss them onto a hot pan with some oil (mmm.... butter), they simply don't have time to react before the proteins in their muscles are hydrolyzed. Not to be morbid about it, but I really don't think our frog has a chance in the boiling water.
Conversely, how dumb do we really think frogs are?? I mean, come on-- if you feel your legs scalding, don't you generally get out of the tub? Admittedly, when the temperature is raised gradually your heat tolerance increases. Indeed, people get so comfortable in saunas that they post warnings about brain damage from staying in there too long. But come on. Is the frog really going to sit there and pass blissfully on to oblivion? A fish, I can understand. As the water gets hotter, fewer gases can be dissolved in it. Since the fish breathes the dissolved gases, it gradually suffocates. Which is beside the point, since the fish can't jump out of the pot in the first place, but you get the idea.
Somebody, please! Clear up this confusion! In the name of all that is analogous! In the meantime, I'm going to get back to my cricket stir-fry.
p.s. True science and true religion never conflict. To have a complete understanding of science is to understand the universe as it is. True religion is the same. Religion covers the why, science covers the how. Since our understanding of both is imperfect at best, it's pointless to argue about frivolous details that don't pertain to our salvation. One way or another, when we're all dead and sitting around in the waiting room, maybe there will be a documentary video playing in the VCR (DVD? What format do celestial beings use?). Then we can all nod our heads and say, "Oh, duh! Of course." Until then, deal with the fact that currently neither science nor religion has a monopoly on the full truth of "how" things came into being. Let science debate the how of the universe, and let religion inspire us with the why, and what our purpose in it is.
Re:Boiling Frogs (Score:1)
Frogs are cold-blooded, and so they don't have a measuring stick to compare temperature against. We can tell water is boiling by comparing it to our core temperature - frogs can't - as the water increases in temperature, the frog's core temperature does too. Frogs can't detect specific temperatures (nor can we, but we're certainly better at it), only changes in temperature.
Oh, and your cricket thing: water is never hotter than 100C, while fat can get to much higher temperatures. I don't think "boiling" water is even necessary, because the frog would likely be dead from sitting in water at 50C, let alone 100 - hot water would be good enough.
Re:Boiling Frogs (Score:1)
Next Time, What Say We Boil a Consultant [fastcompany.com]
What is that I hear? Another "Ask Slashdot" question being typed in? Up up and away!
Re:Please do not feed the hypocrites (Score:2)
Hey, guess what? God just spoke to me, and told me that evolution is real. Feel better?
Re:Warning : Ignorance in the name of piety (Score:3, Insightful)
I cannot understand how seemingly intelligent people can ignore overwhelming scientific evidence. Evolution is the most widely accepted explanation for how we came to be. I do not see any inconsistencies with the Genesis *metaphor* for the creation of life. The Bible was written by humans, not God. They may have had divine inspiration, but it was not God's pen in the inkwell. Why do you think there are four "Gospel according to XXXX" books?
BTW, God is omniscient. Don't you think He can understand and use a metaphor?
Of all the types of ignorance in the world, those perpetuated under the guise of religion are the most virulent and dangerous.
Re:Warning : Ignorance in the name of piety (Score:2)
Because to most people Science is just as mysterious and magical as Religion.
Millions of children are enrolled in Sunday school or full-time religious school learning "You are not supposed to understand this". Truth has nothing to do with logic or understanding. "Proof" of truth is not merely meaningless, but rejected as misleading.
Another goal of religious training is rejecting competing religions. Science seems like just another religion to fight off - a bunch of ideas and beliefs that they don't expect to understand.
-
Re:Your sig. (Score:2)
Re:Warning : Ignorance in the name of piety (Score:1)
I thought even the creationists were abandoning this piece of "evidence". See this site [aol.com] for details.
Besides, even if you proved dinosaurs and man did have some overlap in the chronology of life on Earth, it certainly doesn't prove a six-day creation, or a 6000-year-old Earth. Once again, Creationists show their lack of comprehension not only of the scientific process, but also of simple logic.
Re:Warning (Score:1)
You must be a beginner. Flame bait has to be a little more subtle than that.
... blah blah blah
Re:Warning (Score:2)
Of course, 'believing' in the current scientific orthodoxy would be wrong too, in terms of having faith in it being 100% correct. Almost everybody who can call themselves a scientist would feel quite certain that they cannot be certain of its accuracy, and shouldn't try to be. Science works on scepticism and guesswork based on the data available, not faith.
Re:Warning - to the wrong book! (Score:2, Informative)
Creationists only take issue with the scientific theory that Darwinian evolution can explain ALL of the biological phenomena. They cannot deny that evolution exists and works. They have only made arguments that it works too slowly to explain everything. Thus, this warning is extremely misguided.
Re:Warning - to the wrong book! (Score:2)
Sure they can. They do it all the time, hehe.
-
Re:Warning (Score:1, Insightful)
Re:elitism versus morality (Score:2)
I still have difficulty with the thought of a god to whom it really matters. If he/she/it has the power to create a universe in which one small part (that we know of) has free will and the capacity to vaguely guess at His nature and receive messages from Him, would He honestly care what we think of Him? Surely it would be gross egoism to imagine that your opinion on its existence could rank so highly with something so vast and incomprehensible?
Re:Warning (Score:1)
Wait... (Score:1)
The Solution to all (Score:1)
It will also take out the trash, make your bed, screen calls from your annoying ex-girlfriend, make sure your milk is still good, tell you you're looking skinnier, and reprogram your TV to get all the good channels.
Re:The Solution to all (Score:1)
Software IS a living thing (Score:2, Insightful)
At least to some programmers....
Writing a living, breathing program would be the goal of many of us, not just AI programmers
Self Organizing Systems (Score:1)
Lots of examples from this book came up in our practicums. There are nice links [cs.vu.nl] to sites about this subject on the page, and the complete course is online for you to download (not sure if my professor is going to be happy about this, but who cares, I passed :)
Biologists and Psychologists Abuse this... (Score:4, Insightful)
My psych professor explained our language lecture using layman's computer terminology instead of psychology. I wanted to strangle him the entire time. "So... you've got this memory stuff... and it gets accessed - that is - processed, by this other bit over here, right, this area of the brain... let's just call that the "software"."
It was enough to make any techie of any note sick. He actually used Microsoft as a language. Talk about wanting to shoot someone.
But what can we do? Everyone thinks they're a programmer or a techie these days, and everyone thinks that because kids use IM they must have some other association with the grey box.
Sorry fellers, that's wrong. Most kids today don't know jack about computing, much less are they able to relate better when you babble incessantly about things in your half-tech, half-psychologist manner. Stick to the psychology or the biology, instead of using computer terms to explain simple concepts. It's just more confusing and more hellish.
Re:Biologists and Psychologists Abuse this... (Score:2)
My psych professor explained our language lecture using layman's computer terminology, instead of psychology.
My understanding is that psychology has always chased the latest technology in its efforts to explain the mechanics of the brain. The brain has been compared to steam engines and grain mills, in their time (so I hear).
The biggest irony is when psychologists describe the brain as a neural network (the kind that's been modeled in computers), because the origin of the idea for the neural network was the workings of neurons in the brain!
For this reason, many people insist that computer neural networks should be called artificial neural networks. Indeed, the artificial neural network is an interesting mathematical algorithm that takes its inspiration from "real" neural networks. It was never meant to be a model of the human brain by any stretch of the imagination.
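(For anyone who hasn't seen one, this is roughly all an "artificial" neural network is: weighted sums pushed through a squashing function. A minimal Python forward pass with random weights, purely for illustration; it isn't tied to any particular package and is certainly not a brain model.)

    # What an "artificial" neural network boils down to: weighted sums pushed
    # through a squashing function, loosely inspired by neurons but in no way
    # a model of the brain. Weights here are random, purely for illustration.
    import math, random

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def layer(inputs, weights, biases):
        # Each "neuron" sums its weighted inputs, adds a bias, and squashes.
        return [sigmoid(sum(w * i for w, i in zip(ws, inputs)) + b)
                for ws, b in zip(weights, biases)]

    random.seed(0)
    inputs = [0.5, -1.2, 3.0]
    hidden_w = [[random.gauss(0, 1) for _ in inputs] for _ in range(4)]
    hidden_b = [random.gauss(0, 1) for _ in range(4)]
    print(layer(inputs, hidden_w, hidden_b))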
You have to admit, though, the analogies are getting better. The brain is definitely more like a computer than a steam engine.
Re:Biologists and Psychologists Abuse this... (Score:2)
Okay, I'll bite. I don't think that's the case at all. Steam engines take in stored energy, release it, and move down a track. So do humans. The cells take in ATP, release it, and move down some track. We don't know much about how these turn into decisions about whom to marry, which beer to drink, or how to mix the two together, but we know that energy is going in, and decisions are coming out.
A computer, on the other hand, is filled with logical gates that make straightforward, well-defined decisions like AND, OR, or NOR. I hate to remind you, but there are many people who don't seem to have any connection with logic. They're really out to lunch. But they do take in energy and move down some track.
So just for the sake of argument, I think that the computer metaphor is moving in the wrong direction. Your track isn't pointed in the right way. You took in that energy, but it's not helping us at all.
Re:Biologists and Psychologists Abuse this... (Score:2)
Far be it from me not to bite back. :-)
Steam engines take in stored energy, release it, and move down a track. [...] The cells take in ATP, release it, and move down some track.
Well, I think that's a rather superficial similarity, and it's not quite comparing apples to apples. If the brain converted sugars into mechanical energy and chugged its way along the spine, I would be more inclined to agree with you.
A computer, on the other hand, is filled with logical gates that make straight-forward, well-defined decisions like AND, OR, or NOR. I hate to remind you, but there are many people that don't seem to have any connection with logic.
Just because the base components are logic gates doesn't mean that the final output of the whole system needs to be logical. Why, there are probably huge numbers of people who would already describe computers as being unpredictable, irrational, and self-destructive... and this is when they weren't even programmed to be that way! (Okay, that's a half-joke.) (But only half.)
So just for the sake of argument, I think that the computer metaphor is moving in the wrong direction. Your track isn't pointed in the right way. You took in that energy, but it's not helping us at all.
I'll have to agree that the computer analogy doesn't help that much, and it's a point of argument whether it gives us any more insight into human psychology than a steam engine metaphor.
However, I didn't say the analogy was good; I just said it's better. And I would maintain that the computer, which, after all, helps us to make decisions (with varying degrees of perceived and real effectiveness), is closer to the brain than a steam engine, whose purpose is not related to decision making.
Re:Biologists and Psychologists Abuse this... (Score:2)
Later, about the time of Ben Franklin and Mary Shelley, they began to talk about thought as electricity. This really was a lot closer.
After another few decades, your brain became an internal telegraph-and-railroad system, and then a telephone exchange; and that brings us up to the era of the Giant Computing Machines that have afflicted the analogies of non-techies for the last 50 years.
Re:Biologists and Psychologists Abuse this... (Score:5, Insightful)
I'm tired of the comparison of viruses to computer viruses, as well as DNA to computer code. Every time an article on neural/silicon interactions comes up - here come the stupid Neuromancer "jacking-in" references! Every time a genetic engineering article shows up here, people whip out "Jurassic Park" and "Chaos theory" to explain why they don't consider GM a good idea!
Mixing some computer and biological metaphors on a very BASE level has its uses, but people on both sides all too often become overly enamoured with these simple comparisons and forget the very REAL and often subtle differences that invalidate the metaphors.
A lot of coders I've met need to learn as much REAL (not popular) biology as the biologists you are complaining about need to learn about computers! Basically - I thought your comment was more than a bit one-sided and somewhat condescending - knowing a lot about bits, pointers and registers doesn't make you any more qualified to mix metaphors than knowing a lot about neurons, genes, and molecules does!
Sincerely,
Kevin Christie
Neuroscience Program
University of Illinois at Urbana-Champaign
crispiewm@hotmail.com
Re:Biologists and Psychologists Abuse this... (Score:4, Insightful)
Excuse me? Surely both examples you give are excellent analogies of each other. Viruses parasitically use the machinery of their hosts to spread themselves... and so do computer viruses (well, worms at least.)
And DNA is a digital series of instructions that are interpreted to express something... and so is computer code. Has anyone proved you can build a Turing machine in DNA yet? Admittedly, DNA is processed in a rather more analogue fashion than most computer code, but as an analogy, it's better than most; (for example) the old one about breaking into computer systems / breaking into a house.
Re:Biologists and Psychologists Abuse this... (Score:1)
Re:Biologists and Psychologists Abuse this... (Score:1)
Re:Biologists and Psychologists Abuse this... (Score:3, Insightful)
Surely both examples you give are excellent analogies of each other. Viruses parasitically use the machinery of their hosts to spread themselves... and so do computer viruses (well, worms at least.)
Computer viruses do not physically disassemble the host computer (or its OS), chopping it up into pieces that are re-assembled to form new infectious computer viruses. A big difference.
And DNA is a digital series of instructions that are interpreted to express something... and so is computer code.
Digital series indeed!! DNA is more like a recipe, as S.J. Gould and others never tire of pointing out (evidently for a good reason). If it were a program, it would be the buggiest, crappiest program, one that had been maintained for years with different compatibility layers added to it. If it were a program, it would be as though COBOL had been kept and had new libraries added to it, some of which worked sort-of, and others were completely b0rken.
The point of this is that the analogies/metaphors/comparisons are not really useful beyond a simple level. Interesting analogies or metaphors are ones that reveal _unexpected_ details about the analogised subject. The code/DNA one does not. It is just two cool things lumped together with superficial similarity.
Re:Biologists and Psychologists Abuse this... (Score:2)
Digital series indeed!! DNA is more like a recipe, as S.J. Gould and others never tire of pointing out (evidently for a good reason).
I'm sorry, DNA can obviously be perceived as a digital sequence. There are four distinct states encoding the information. ("Recipe", whatever.) I hope it's clear that it's not analog, at least.
And as for recipe vs. program, they're the same thing! A sequence of instructions describing how to perform some action. Computer code is usually laid out in a more deliberate and structured form because the "operator" is so much simpler and more precise, but that doesn't really change the core nature of the thing. In the case of DNA, things are different yet again.
Of course if you don't understand anything about a recipe, comparing recipies and computer code is useless as well. The analogy to computer code is valid, but not perfect. You can't really use it unless you know where it doesn't work. Just because you can over-extend it doesn't mean it's complete trash.
Re:Biologists and Psychologists Abuse this... (Score:1)
This statement is formally correct, but highly misleading. The whole digital vs. analog paradigm implies that DNA simply contains a signal as a function of position, but this is not the case. It's an actual molecule in the real world; the conformation of the molecule matters for important things like transcriptional regulation. This is more easily illustrated with protein sequences. They are also "digital," with twenty states instead of four. However, the behavior of protein molecules of known sequence is not ab initio predictable in practice for sequences of any useful length.
Re:Biologists and Psychologists Abuse this... (Score:1)
Take a ~1 MB (source code) computer program, written in a language you don't understand. Try predicting what it will do without actually compiling and running it. Same problem...so I'm not surprised this is the case. OTOH, very small sequences can be predicted, it's just that the sizes which we can predict don't happen to be usably long.
Re:Biologists and Psychologists Abuse this... (Score:2)
the behavior of protein molecules of known sequence is not ab initio predictable in practice for sequences of any useful length.
Exactly. Even if one is able to use a method like threading, or multiple-alignment to find similarities or shared motifs the behaviour of the protein depends upon the complement of other molecules in its vicinity. If anything the whole deal is much closer to an analog system than a digital one.
Re:Biologists and Psychologists Abuse this... (Score:1)
Re:Biologists and Psychologists Abuse this... (Score:2)
Depends on what you mean by 'building a Turing Machine in DNA'. Your question is rather analogous to saying 'has anyone proved you can build a Turing machine with SDRAM yet?'. DNA is a storage medium and requires external mechanisms to operate on it for computation.
If you take DNA and the DNA-processing mechanisms from stichotrichous ciliates, then yes, you *can* build a Turing Machine. See:
Kari, L., Kari, J., and Landweber, L.F., "Reversible Molecular Computation in Ciliates," in _Jewels are Forever_ (Karhumaki, J. et al., eds.), Springer-Verlag, 1999, pp. 353-363.
That's particularly interesting because it's _in vivo_ computing, but there are also tens (probably hundreds) of proposals for _in vitro_ DNA computing. Do a Google search on 'DNA Computing' and look for the proceedings of the International Conferences on DNA-Based Computers (I believe the one this summer in Japan is number 8)... IIRC they're published in a DIMACS series. If you want a 'canonical' paper for _in vitro_ methods, I guess starting at the beginning would be best:
Adleman, L., "Molecular Computation of Solutions to Combinatorial Problems," Science, Vol. 266, 11 November 1994, pp. 1021-1023.
Re:Biologists and Psychologists Abuse this... (Score:2)
Re:Biologists and Psychologists Abuse this... (Score:1)
It's not infinitely more complex. Ignoring computational power, I see nothing there that can't be reproduced in code. Just because the code we usually write is quite straightforward doesn't mean it has to be. Self-modifying code, code based on chaotic equations, etc. are all possible. In fact I'm working on a research project at the moment looking at how programs can be evolved in similar representations to biological genomes.
Incidentally, if it was infinitely less understood, or infinitely complex then we wouldn't be able to understand any of it. Perhaps the mathematicians should take you to task for mangling their terminology?
The former has an insanely complex web of interactions with promoters, enhancer regions, transposons, developmental effects, odd things like RNAs which can code for proteins as well as act as the catalytic subunits of enzyme systems themselves!
So? Just because the systems are complex doesn't make your argument better. It certainly didn't warrant an exclamation mark. Try interpreting Windows XP in binary sometime. Besides, I think most (or all) of the things you mentioned have been used in GAs/GP.
Interestingly, I have heard more than one geneticist call the genome 'essentially a computer program'.
Re:Biologists and Psychologists Abuse this... (Score:1)
Re:Biologists and Psychologists Abuse this... (Score:1)
Re:Biologists and Psychologists Abuse this... (Score:2)
No, I have to disagree. There is no difference philosophically between a computer program and the data accessed by a computer program. Any interpreted code (Perl scripts, etc.), for example, is data that is processed by a program. A Turing machine doesn't make the distinction on its infinite roll of tape.
Neurons and muscle cells have the same "program," but the "computer" is different.
The computer is exactly the same in both cases. The genome is interpreted differently in muscle cells and neurons because the genome itself turns on or off parts of itself in different places in the body.
Or, more precisely, the transcriptional machinery the previous poster mentioned, in addition to signal transduction networks in the cytoplasm, epigenetic influences, and a few other things are part of the computer that runs the program of the genome. But where are these things in the digital computer implementation you speak of? They're in SOFTWARE.
All that transcriptional machinery is built up from instructions in the genome: ie in SOFTWARE.
The geneticists you've been talking to almost certainly know better and should be more careful what they say.
I haven't got time to try to find a quote from someone authoritative on it, but Matt Ridley in Genome, which seems to be looked upon favourably by most geneticists, calls the program/genome comparison something like a perfect analogy.
Obviously the structure of a program and a biological genome are typically very different, but I think from an abstract sense a program is very similar to a genome.
Re:Biologists and Psychologists Abuse this... (Score:1)
I think I agree with everything except one point, and some lack of clarity in my previous post is getting in the way. The only thing (besides the genome itself) that has not changed between neurons and myocytes is the underlying laws of physics. Whether or not it is your intention, I think this is the computer you're referring to, which is what I ultimately agree with. When I said the computer has changed, I was referring to the idea that the things that actually act directly on the genome were the computer, i.e. the transcription factors and RNA polymerase. Since both of us think that's incorrect, we can stick to the idea that the computer is analogous to underlying physical laws. What you're saying is that the different transcription factors and so forth that mediate differentiation are also in software, which I agree with too. I think we actually agree on all of that. The one difference is that you're arguing that there's no difference between a program and data accessed by a program, so the genome is like a program written in an interpreted language. Actually, I can agree with that too, in a formal sense. The problem is that this argument can be extrapolated all the way out to any deterministic system, and almost anything can be analogous to a computer program. So what's the point of the analogy?
I think the issue is less whether the analogy is formally correct at some arbitrarily abstract level and more whether it's a "good" analogy or not. A human being and a donut are topologically equivalent (donut hole:mouth->anus), but I'm not aware of any analogy based on that fact that most people would call "good." If you say that the genome is analogous to a program written in a very high-level interpreted language that partially codes for its own interpreter (e.g. photoshop macro language with extensions), then I would agree that this could be formally valid. Nevertheless, it's a bad analogy. First, it's not specific to the genome--almost anything can be said to be analogous to a computer program if you apply that level of abstraction. More importantly, it's misleading. Unless you already understand a significant amount about molecular biology and computer science, the analogy is going to mislead you. The genome:computer program analogy is bad, and no number of pop science books is going to change that. Of course, it may be much better than the genome:immortal soul or genome:collective unconscious analogies...
Re:Biologists and Psychologists Abuse this... (Score:1)
The genome:computer program analogy is bad, and no number of pop science books is going to change that
A cheap shot
A human being and a donut are topologically equivalent (donut hole:mouth->anus), but I'm not aware of any analogy based on that fact that most people would call "good."
Oddly enough I remember a biology lesson in school where a teacher used exactly that analogy. I found it useful as a way of understanding that the alimentary canal was a tube through the body and not actually "inside" the body.
The problem is that this argument can be extrapolated all the way out to any deterministic system, and almost anything can be analogous to a computer program.
To counter that statement properly would probably require moving into the realms of information theory and lots of statements about entropy. I don't think I'm up to that, but IMHO there are plenty of similarities between the two that few other systems would share; a finite alphabet, a modular structure, variables, mechanisms to alter their interpretation. Both are linear, abstract lists of instructions that can repeatably influence the macroscopic world when encapsulated in the right medium.
So what is the point of the analogy?
An analogy doesn't have to have a point. It can just be a similarity between two things. (Look up the definition.) But if you mean what use is it, then I can think of two:
Going back to the original thread, I was listening to a speaker today about how he was developing artificial immune systems to counter (computer) viruses. Another example where a biological analogy helps.
Re:Biologists and Psychologists Abuse this... (Score:1)
Re:Biologists and Psychologists Abuse this... (Score:1)
As a neuroscientist and former CS major in college (and long time Slashdot reader) I can also assert that programmers abuse biology metaphors just as badly!
amen!
i was very amused for example by the incorrect parallel drawn by linus & friends on the LKML lately, regarding linux development as an evolutionary process. linux development is directed by a (group of) person(s), which can hardly be compared to the way nature randomly applies selective pressure onto a living organism.
i could take this argument further, but i don't want to be filtered out as a page-lengthening post
nevertheless, the parallel was instrumental in the sense that it got the discussion going. and let's face it, this particular discussion even made it to slashdot, so it must have been important...
Re:Biologists and Psychologists Abuse this... (Score:1)
DNA isn't code, because DNA doesn't *do* anything. DNA defines the total set of processes that can be expressed in a cell for an organism. This can be seen by the fact that the roughly 200 cell types in a human only express 10 percent or less of the available DNA.
Thus DNA is the set of definitions available to a system. Just like that pile of installation disks that an MIS department holds for configuring a computer system.
When DNA is expressed, it results in RNA (to really simplify the transcription process). RNA is the "program store" for a cell. RNA defines the processes that are active in a cell. This isn't at all unlike the programs you have installed on your computer system. Which processes a cell will perform, and how it is configured, are driven by the particular genes that have been expressed (copied into RNA) for that cell. These are selected in such a way that the cell can effectively perform its role.
Yet actually carrying out these processes requires yet another step, the translation of RNA into the proteins that drive biological processes. Just like loading a program into memory and executing it is required to have something happen in a computer, building proteins is required to make a process occur in a cell.
A program store serves a very useful purpose. In both computer systems and biological cells, it allows a cell to be instantly ready to perform the tasks required of it. If another protein is needed, the RNA is ready, and another copy can be translated. Or on a computer system, if you need to run that spreadsheet, the program is ready for another copy to be loaded into memory for your use.
The Central Dogma of Molecular Biology states:
DNA -> RNA -> Proteins
More generally, this can be understood as:
definition -> expression -> function
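(A toy Python caricature of that definition -> expression -> function pipeline, to make the analogy concrete. The gene names and "proteins" are made up, and real gene expression is vastly messier than a dictionary lookup.)

    # Caricature of definition -> expression -> function in the installation-
    # disk analogy above. Gene names and "proteins" are made up, and real
    # expression is vastly messier than a dictionary lookup.
    GENOME = {                      # DNA: every process the cell *could* run
        "hemoglobin": "carry oxygen",
        "insulin": "regulate sugar",
        "keratin": "build hair",
    }

    def transcribe(genome, expressed_genes):
        # RNA: the subset of definitions this cell type has switched on.
        return {g: genome[g] for g in expressed_genes}

    def translate(rna):
        # Protein: the expressed definitions actually doing the work.
        return [f"protein that can {job}" for job in rna.values()]

    skin_cell_rna = transcribe(GENOME, ["keratin"])
    print(translate(skin_cell_rna))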
What we need to learn to do in computer systems is understand how the architecture in biological systems results in self-configuring distributed processes. We can do it the hard way (where we fight the mechanical metaphors, such that every positive step toward more configurable computer systems requires a non-intuitive step against the mechanical metaphor), or we can do it the easy way (where we recognize that biological systems have already *solved* the configuration and distribution problems).
The really amazing thing to me about biological systems and computer systems is how long this simple observation has escaped everyone. The mixed-up metaphors (like Microsoft's DNA) are simply painful.
--Paul
http://groups.yahoo.com/group/SoftwareG
Re:Biologists and Psychologists Abuse this... (Score:2)
Metaphor is a difficult thing, because you are attempting to illustrate an independent reality using something that is familiar to the listener. The danger is that the listener may mistake the analogy for formal equivalency. I tend to open my lectures with a bold metaphor (the brain is a swiss army knife...) to try to grab students' attention and give them a conceptual framework, and then often spend the rest of the lecture (term?) fighting this phantom notion of formal equivalency (no, a ligand is not a key, and a receptor is not a lock). Formal concepts are difficult to absorb; just think of metaphor as the vehicle that carries ideas (whoops, there I go again). Of course, if your professor does not pick apart his/her own metaphor by the end of the term, then perhaps he/she is doing a disservice to the less engaged students.
What I find really fascinating is how teaching and general discussion are limited by metaphor. When discussing things among colleagues there is little elaboration of many things, because they grok the independent reality of a phenomenon as well as I do (probably better). The metaphors come out when you need to discuss/explain something to the less-expert. I don't think that anyone ever believed that working memory was a "scratch-pad" or that long-term memory was a "tape-recorder". While many of my colleagues might argue for the brain as universal Turing machine in the formal conception, I don't think any of us believe that the brain is a computer like the one on your desk (we are analog). All of us in our technical capacities just lack the language to express our ideas to the less technically adept.
Re:Biologists and Psychologists Abuse this... (Score:2)
Wacky scientists (Score:2)
If you want that kind of thing, this book is amazing for presenting both sides (ie, the science & the people) of the stories:
http://www.amazon.com/exec/obidos/ASIN/06718723
It's called Complexity. It is a kind of answer to 'Chaos', and it has much info on the kind of biological software that the Santa Fe Institute crowd was working on a few years ago. A very highly recommended read.
Information wants to be Anthropomorphized... (Score:3, Insightful)
Case in point. When I was helping my mother restore her computer after she was infected with Code Red, she was infuriated at the worm. While she is a computer professional, she is not a coder and has no understanding of... say... how machine code executes a loop or a goto. She talked about Code Red as if it really was a living thing despite the fact that she knew better. One of the things she said that stuck in my head was 'Why would it do that to me?'
Re:Information wants to be Anthropomorphized... (Score:1, Insightful)
Can we get beyond this? I don't think so. We're humans, after all. We only know human things. Maybe licking your fur all day changes your perception of the world. Maybe sniffing butts changes the mind of dogs. Projecting human thoughts may not just be the best way to try to understand this, it may be the only way. (Well, save from licking yourself all day and sniffing butts.)
Abstraction? (Score:1)
No one speaks of subroutines that cp themselves through undocumented remote procedure calls because talk of 'computer viruses' carries all of the portent and weight of polio, anthrax, German Measles and tuberculosis.
Yeah, no one speaks of the exact way all these illness-causing viruses work either, since it's easy to abstract it out to the simple term 'virus'.
Computers and biology have already merged (Score:1)
nouns? verbs? help! (Score:1)
---No one speaks of subroutines that cp themselves through undocumented remote procedure calls because talk of 'computer viruses' carries all of the portent and weight of polio, anthrax, German Measles and tuberculosis.---
Re:nouns? verbs? help! (Score:1)
Re:nouns? verbs? help! (Score:1)
Yes, there are too many syllables. Scully and Mulder need short phrases like "DAT tape." The biologists are just more poetic. Three letter acronyms sound stupid, but programmers use them all of the time. Programmers love words with plenty of consonants. Biologists love vowels. Vowels are sexier and more ominous too.
Here's another great link (Score:4, Insightful)
Plenty of good stuff. Anyone have other good links?
Re:Here's another great link (Score:3, Insightful)
This one [red3d.com] is my favorite. You can watch the flocking boids, and it explains flocking algorithms very clearly and easily.
Plus, it's got links to about 50 other really interesting biological modeling and application sites.
Be careful not to take this too far. (Score:4, Insightful)
For example, from the review above:
genetic algorithms may find patterns of credit card fraud and help us find better jet turbine blades
The genetic algorithm is a great algorithm for optimization problems. However, it's not significantly more effective than the simulated annealing [sciam.com] algorithm or the lesser-known controlled random search [dl.ac.uk] algorithm.
Each has its advantages and disadvantages, but getting too caught up in the metaphors these algorithms and techniques are based on will unnecessarily shackle your thinking. Of course, the opposite is also true. Refusing to embrace metaphors at all will leave you without the insights that we use metaphors to see, so don't take me too seriously
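(For comparison with the GA example in the review, here's a bare-bones simulated annealing loop in Python on a toy one-dimensional objective. The objective and cooling schedule are arbitrary choices that just show the shape of the algorithm, not anything from the linked article.)

    # Bare-bones simulated annealing on a toy 1-D objective. The objective
    # and cooling schedule are arbitrary choices that just show the shape of
    # the algorithm.
    import math, random

    def objective(x):
        # Bumpy function with many local minima.
        return x * x + 10 * math.sin(3 * x)

    x = random.uniform(-10, 10)
    best, temp = x, 10.0
    for _ in range(10000):
        candidate = x + random.gauss(0, 0.5)          # propose a nearby move
        delta = objective(candidate) - objective(x)
        # Always accept improvements; accept worse moves with probability
        # exp(-delta/temp), which shrinks as the "temperature" cools.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
            if objective(x) < objective(best):
                best = x
        temp *= 0.999                                  # cooling schedule

    print(best, objective(best))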
Re:Be careful not to take this too far. (Score:1)
Re:Be careful not to take this too far. (Score:2)
Could you post a better link, or explain the difference yourself? Thanx.
-
Re:Be careful not to take this too far. (Score:2)
The CRS is based on the 'Nelder-Mead' simplex method. Here's a better description of that. [enst-bretagne.fr]
It starts with n + 1 points, where n is the number of parameters you're optimizing. That's 3 points forming a triangle in 2D space or 4 points forming a tetrahedron in 3D space (the space being the values you're optimizing). It's easiest to think of the 2D situation with the triangle.
Each corner of the triangle is some set of parameters, each of which will have a different 'fitness'. The fitness is the value that you're trying to minimize. Evaluate the fitness at each vertex of the triangle. Take the least fit vertex (the one with the largest value), and 'step the triangle downhill' by reflecting it through the midpoint between the other two points.
This should create a reflected triangle closer to the fitness minimum that you are trying to find. Repeat until you get so close to the minimum that you're going in circles.
Now, with the CRS, you use the simplex to take all your steps, but in this case you create a large pool of initial candidates at random, just like you do for the genetic or simulated annealing algorithms. Then you create a new simplex by selecting n+1 elements from your initial pool. Step the simplex downhill, and see if your new value is better. If so, throw away the worst element of the initial population, and replace it with the new one. Then select a new simplex at random from your pool of candidates and repeat the procedure.
This way, you're always producing random steps, so you can't easily get caught in a local minimum, and it's a pretty efficient solution. It works well with linear constraints, which seemed to be an advantage over GA and SA when I was working on this myself. I should put a disclaimer on here that I'm a geophysicist, not a computer scientist, so I may not use the lingo the same way your average
Hope that wasn't too confusing. I'm trying to write this without my boss knowing I'm not working >:)
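(In case the geometry is hard to picture, here's a rough Python sketch of the reflect-and-replace step described above. It's a simplified toy with a made-up quadratic objective, not the full CRS from the linked papers.)

    # Rough sketch of the controlled-random-search step described above: pick
    # n+1 points from a random population, reflect the worst one through the
    # centroid of the rest, and keep the new point if it beats the worst
    # member of the whole population. Simplified toy, not the full CRS from
    # the linked papers.
    import random

    def objective(p):
        x, y = p
        return (x - 3) ** 2 + (y + 1) ** 2      # toy fitness to minimize

    def crs(pop_size=30, n=2, iterations=2000):
        pop = [[random.uniform(-10, 10) for _ in range(n)] for _ in range(pop_size)]
        for _ in range(iterations):
            simplex = random.sample(pop, n + 1)
            simplex.sort(key=objective)
            worst = simplex[-1]
            centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
            # Reflect the worst vertex through the centroid of the others.
            trial = [2 * centroid[i] - worst[i] for i in range(n)]
            pop_worst = max(pop, key=objective)
            if objective(trial) < objective(pop_worst):
                pop.remove(pop_worst)
                pop.append(trial)
        return min(pop, key=objective)

    print(crs())                                 # should end up near (3, -1)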
Re:Be careful not to take this too far. (Score:2)
It seems like it would only work on a well-behaved search space. SA and CRS are faster with some search spaces, but I still prefer GA. It can attack any problem that SA and CRS can, plus it works on ugly search spaces. In particular, CRS and SA seem almost useless for creating a program, one of the coolest tasks for GA.
-
Re:Be careful not to take this too far. (Score:1)
We need to take a few ideas from the natural world, but there's no reason to imitate it slavishly. Why not have strange mating rituals for these digital memes that are supposedly evolving? Why not be a bit more mathematically precise? We might actually prove something for a change instead of talking about how cool it all is.
Biology vs. Comp. Sci. (Score:3, Insightful)
The first thing that strikes me when biology and computer science are brought together is that although we try to apply principles of the former to the latter, we really have a much firmer grasp of computer science than we do of biology. What we're really doing, I think, is taking some theories and concepts from biology -- evolution and immunology seem to be the big ones -- and adapting those theories to suit digital computers; we're not modelling life per se. It's important to remember, too, that although we can model evolutionary processes like variation and selection in a computer system and produce the anticipated results, we can't thereby prove that evolution applies to life. (I happen to believe that it does, but I have to admit that we have yet to irrefutably prove it). All we're doing is nicely illustrating the theory.
Someone mentioned earlier that everyone claims to be some sort of computer expert these days, and that biologists and psychologists routinely misapply computer concepts. From my perspective, the reverse is true. There seems to be a misconception that biology is straightforward and well-understood, and I just don't know where that comes from. I'm sure I'm not the only biologist who grimaces when "virus" is used to describe software. And the most gaping errors in science fiction always seem to be ones of biology. Computer scientists use words like "genotype" and "phenotype", but genetic algorithms seem to me to be so far removed from the actual complexities of gene expression as to be at best distant cousins. It's more a matter of biology lending ideas and inspiration to computer science than it is some direct translation of life processes to software processes.
Re:Biology vs. Comp. Sci. (Score:2)
But "illustrating the theory" is really everything that ever can be done in science. In math (and CS is really applied math, despite the name) things can be proved. In science, theories can never be proved because it is always possible that someone tomorrow could perform an experiment disproving the theory.
Re:Biology vs. Comp. Sci. (Score:1)
Think about it: "Survival of the fittest." That means survival of those that are able to survive.
You can't argue with this.
Re:Biology vs. Comp. Sci. (Score:1)
Actually, I revise this for my own personal reference as "Destruction of the unfit", since "Survival of the fittest" implies that ONLY the fittest will survive. Your general point, however, is still perfectly valid :-)
And Then There's Physics... (Score:1)
Re:And Then There's Physics... (Score:1)
What about differential equations? (Score:1)
So are biological metaphors just as suspect? Perhaps. Digital evolution is cool, but I don't see why it is better than any of the other optimization techniques. If anything, the digital bio metaphor forces you to mimic creatures and all of their semi-monogamous, one-on-one reproduction. Equations don't have to conform to such a binary vision.
Why we can't model biology with computers (Score:1)
Biological systems are sensitive at the quantum level, and computers certainly cannot be.
Re:Why we can't model biology with computers (Score:2)
So computers must be able to measure single photons, otherwise how did the physicists know that they were emitting a single photon? And to go from single-photon-detection to whole-organism-response only requires a long series of amplification cascades. Why is such a setup so hard to envision in a computer system?
Re:Why we can't model biology with computers (Score:1)
Re:Why we can't model biology with computers (Score:2)
When you really get down to it, most biological processes aren't analog. Instead, they're regulated by molecules that can take on a finite number of states. Granted, the number of molecules involved is fantastically large, and the number of states they can take is almost always more than 2 (especially since you have to take the effect of things like protein misfolding due to mutation into account).
So yes, it's relatively simple (heh) to produce a computer-based system that's as complicated as a biological one. But to replicate a biological system we'd have to know every X molecule, and all of the resultant Z triggers that can result from Y concentration of X. Then, we'd have to already know how all of the different X molecules connect to each other (in ways as subtle as "you can't make any more X1 because all of the zinc was used to make X2").
However, while we can't replicate biological systems (and probably never will be able to), we certainly can model them. This is much easier, since we interweave a bunch of different functions in an attempt to arrive at something that generally makes sense. Then we try to model some situations where the result is already known. If your model matches reality in almost every case, then you've probably got a winner. Otherwise, Do Not Pass Go.
Give people more credit. (Score:2)
Let's say I'm trying to explain a concept in molecular biology to a computer scientist. Is it really so bad if I make an analogy connecting something the computer scientist already knows (programming, for example) and something he or she does not know (MAPK pathways, for example)? As long as the analogy holds up on the level that I explain it at, things should work fine.
But because neither the computer scientist nor the biologist are stupid, they won't take the analogy too far. The computer scientist won't immediately think, "I bet obscure programming fact XXXX holds for this biological system he's explaining to me, because he just used programming language YYYY in his metaphor." This won't happen because the computer scientist is a rational person, who knows what a metaphor is and its probable limits.
Yes, it's true that if everyone takes metaphors literally, then we'll run into problems. But the entire reason we can use metaphors for something useful is that we can also understand that a metaphor will break down at some point.
I'll admit, I get pissed when popular culture misquotes some arcane (or even general) biological principle. However, that's a totally different thing from using some other subject as a metaphor. Without metaphors, those involved would have to learn these things from scratch, without drawing upon what they already understand. I think it's totally valid to dispense snippets of information through metaphor, since the alternative is working one's way up from ground zero without using metaphor. And that's way too much to ask, considering in biology it takes a PhD for anyone to consider you above zero level.
Viruses, biological vs not comparison (Score:2, Informative)
Please use metaphors that make sense (Score:2, Informative)
viruses?? (Score:1)
viruses named:
polio
common cold
German Measles
bacteria passed off as viruses:
anthrax
tuberculosis
black plague
*sigh*.