Digital Biology

Peter Wayner writes: "Metaphors drawn from biology have always fascinated computer scientists. No one speaks of subroutines that cp themselves through undocumented remote procedure calls because talk of 'computer viruses' carries all of the portent and weight of polio, anthrax, German Measles and tuberculosis. Invoking these mysterious and deadly images is more colorful than tech speak, even if most of the so-called viruses are closer to the common cold than the black plague. Why use a three-letter acronym when a biological metaphor is available?" Wayner wrote the following review of Peter J. Bentley's book Digital Biology, which may just answer that question.
Digital Biology
author Peter J. Bentley
pages 272
publisher Simon & Schuster
rating 7.5
reviewer Peter Wayner
ISBN 0-7432-0447-6
summary Does a good job of bridging the analogical gap between the worlds of computers and biology; may not be deep but will probably enlighten readers with an interest in either or both of these fields.

It should come as no surprise that the infatuation is requited, because some biologists are just as fascinated with the bits that live in computers. They love to wonder whether the software crosses the line and becomes a sentient being, whatever that may be. They want to know whether a programmer can play Dr. Frankenstein and create life, or at least an indistinguishable imitation. They are entranced with the computer's ability to boil vast amounts of data into a coherent answer, and they want to harness this power to solve problems about truly organic creatures.

Peter J. Bentley's new book, Digital Biology, is a lively tour through some of the research that joins both of these worlds. It's a quickly paced, colorful examination of how computer scientists and biologists can share metaphors like "the immune system" or "growth." If both groups sit down and compare metaphors, computer scientists may learn something about building robust, self-healing, self-reproducing software from looking at carbon-based creatures, while biologists will learn something about creatures by studying them with silicon-based software.

The book is aimed at the same market that embraced the meme of "Chaos" through reading James Gleick's book. It's light on equations and heavy on showmanship. In many cases, this is more than satisfying. One passage describes how three simple rules can keep digital flocks of birds floating and swarming with all of the coordinated rolling and swooping of the real thing. There's no need to invoke numbers or distance measurements to convey what's happening.
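
The three rules in question are the classic "boids" rules: steer apart when crowded, match your neighbors' heading, and drift toward the center of the local flock. A minimal Python sketch of the idea (the parameter names and values here are illustrative guesses, not anything taken from the book):

    import random

    def dist(a, b):
        return ((a.pos[0] - b.pos[0]) ** 2 + (a.pos[1] - b.pos[1]) ** 2) ** 0.5

    class Boid:
        def __init__(self):
            self.pos = [random.uniform(0, 100), random.uniform(0, 100)]
            self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

    def step(flock, radius=15.0, crowd=3.0, cohere=0.01, align=0.05, repel=0.05):
        for b in flock:
            near = [o for o in flock if o is not b and dist(b, o) < radius]
            if not near:
                continue
            # Rule 1: cohesion -- drift toward the average position of nearby birds.
            cx = sum(o.pos[0] for o in near) / len(near)
            cy = sum(o.pos[1] for o in near) / len(near)
            b.vel[0] += (cx - b.pos[0]) * cohere
            b.vel[1] += (cy - b.pos[1]) * cohere
            # Rule 2: alignment -- match the average velocity of nearby birds.
            vx = sum(o.vel[0] for o in near) / len(near)
            vy = sum(o.vel[1] for o in near) / len(near)
            b.vel[0] += (vx - b.vel[0]) * align
            b.vel[1] += (vy - b.vel[1]) * align
            # Rule 3: separation -- push away from any bird that crowds too close.
            for o in near:
                if dist(b, o) < crowd:
                    b.vel[0] += (b.pos[0] - o.pos[0]) * repel
                    b.vel[1] += (b.pos[1] - o.pos[1]) * repel
        for b in flock:
            b.pos[0] += b.vel[0]
            b.pos[1] += b.vel[1]

    flock = [Boid() for _ in range(50)]
    for _ in range(500):
        step(flock)

Run a few hundred steps and the dots roll and swoop with no central choreography at all, which is exactly the point Bentley is making.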

At other times, the examples can be so tantalizing that the lack of depth is a bit frustrating. Bentley promises, "The number of different applications that we have successfully used evolution for is immense." To illustrate this, he offers the example of a coffee table designed by a computer program mixing, matching and cross-breeding varieties. After each generation, the computer cuts some tables apart, creates new combinations and then uses an equation to find the most fit and desirable designs. Eventually, a reasonable candidate emerges. Bentley explains that genetic algorithms may find patterns of credit card fraud and help us find better jet turbine blades, but there's no space to tell us the finer details. We do learn that stunning results can emerge when computer programmers mix the three principles of inheritance, variation and selection. But no book can include everything.
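
That recipe of inheritance, variation and selection fits in a surprisingly small amount of code. The sketch below is not Bentley's; it is a generic bit-string genetic algorithm in Python, where the fitness function, population size and mutation rate are placeholders you would tailor to the real problem (table shapes, fraud patterns, turbine blades):

    import random

    def evolve(fitness, length=20, pop_size=50, generations=200, mutation_rate=0.05):
        # Start from a random population of bit strings.
        pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
        for _ in range(generations):
            # Selection: rank by fitness and keep the better half.
            pop.sort(key=fitness, reverse=True)
            survivors = pop[:pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                # Inheritance: cross-breed two surviving parents at a random cut point.
                mom, dad = random.sample(survivors, 2)
                cut = random.randrange(1, length)
                child = mom[:cut] + dad[cut:]
                # Variation: occasionally flip a bit.
                child = [b ^ 1 if random.random() < mutation_rate else b for b in child]
                children.append(child)
            pop = survivors + children
        return max(pop, key=fitness)

    # Toy fitness function ("one-max"): the more ones in the string, the fitter it is.
    print(evolve(fitness=sum))

Swap in a fitness function that scores furniture designs or flags suspicious transactions and the loop stays the same; that interchangeability is much of the technique's appeal.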

While the book is aimed at a broad market, it does not come with many of the traditional flourishes of journalism. Bentley is a research fellow at University College London, not a newspaper hack who churns out stories for a living. So when he introduces other researchers and colleagues, he doesn't bother dressing them up with details about their homes, their wives, or the usual chestnuts journalists offer in the hope of humanizing the subjects. The book focuses on the ideas and metaphors themselves and skips the window dressing. The names are just incidental markers to give credit and a pointer for further research. Scientists will love the lack of distraction, but casual readers looking for colorful anecdotes about the wacky geniuses in labcoats will need to look elsewhere.

The book, as expected, is generally enthusiastic and heavily invested in the field. Software modeled on biological systems, we are told, will, "detect crime for us, identify faults, ... design new products for us, create art, and compose music."

Despite this partisan flavor, the book shines in the few paragraphs where Bentley pauses to discuss some of the limitations of the systems. "We cannot prove that evolution will find us a good solution -- but it almost invariably does. And we certainly cannot predict the solutions that evolution generates," he notes as a caveat to everyone planning to use genetic programming to solve world peace.

At one point, he discusses one of the principal criticisms of the entire area. After describing flourishing digital forests filled with fractal ferns, problem-solving viruses, and swooping swarms of evolving birds and insects, he pauses and offers this quote from another biologist: "Where's the experiment?" He notes that most of these creatures are flights of our imagination untested in the lab against real ferns, viruses or birds. Nor is there any real way to test a fern hypothesis. The digital versions look real, but there's little gritty lab work to establish them as true metaphors for sussing out the secret laws of nature. Is looking real enough? Can you measure verisimilitude? Do any traditional experiments measure anything better than the quality of a simulacrum? Is appearance enough or is it only skin deep? After a pause, though, the book is on to more talk of big payoffs and grand promises.

At its heart, the book is a document of the evolution of problem-solving techniques. If you want the sales pitch from the computational biology world, you can turn to this book. When there were no machines, scientists used symbols, algebra, calculus and other mathematics to describe the world. Biologists have long employed differential equations to describe the booms and busts in ecologies of predators and prey. Now that we have computers capable of billions of operations a second, we don't need the old school of mathematics to provide a closed-form solution. The computers can just simulate the world itself. There's no need to struggle for a set of equations that is both easy to solve and appropriate. We can just use little worlds of sim creatures, sim fronds, sim viruses, and sim antibodies.
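
The predator-prey case makes the shift concrete. The old school writes down the Lotka-Volterra equations, dx/dt = ax - bxy and dy/dt = cxy - dy, and studies them analytically; the new school just steps them (or an agent-based version of them) forward in time. A rough sketch, with coefficients picked purely for illustration:

    # Lotka-Volterra predator-prey model stepped forward with simple Euler integration.
    # The coefficients are arbitrary illustrative values, not taken from the book.
    a, b, c, d = 1.1, 0.4, 0.1, 0.4   # prey growth, predation, predator gain, predator death
    x, y = 10.0, 5.0                  # initial prey and predator populations
    dt = 0.001

    for step in range(100000):
        x, y = x + (a * x - b * x * y) * dt, y + (c * x * y - d * y) * dt
        if step % 10000 == 0:
            print(f"t = {step * dt:5.1f}   prey = {x:7.2f}   predators = {y:7.2f}")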

Bentley's book is an ideal way to learn just how and why some biologists are absolutely enraptured with the new powers discovered by these computer simulations of genetics, growth, flocking and other natural phenomena. These models don't offer the concrete certainty of mathematical models, but there's no denying that something is somehow there. Is it as much a breakthrough as Bentley believes? Well, maybe you the reader can create a genetic experiment to cross-fertilize the ideas from the book with the ideas in your experience. After a few generations of thought, perhaps a few generations of beer, an answer might evolve.


Peter Wayner is the author of Free for All, a book on the open source software movement, and Disappearing Cryptography, the second edition of a book on steganography expected to appear later this spring. He is also the author of several articles on simulation, including studies of the relationship between sex and AIDS, segregation, and the length of baseball games. (Each of these links includes a Java applet so you can run the simulator from your browser.) You can purchase Digital Biology from Fatbrain. Want to see your own review here? Just read the book review guidelines, then use Slashdot's handy submission form.

This discussion has been archived. No new comments can be posted.

  • Does anybody remember when analog biology was good enough?
  • I know what they mean:

    1-0-1

  • Warning (Score:1, Flamebait)

    by Anonymous Coward
    This book, like so many other supposedly balanced books on biology, assumes that evolution is true and pushes it at every possible opportunity. People with open minds may want to avoid this book, and members of the moral community will definitely not want to read this. "Analog" biology is a controversial enough subject; there is no need to read about digital biology.
    • by PD ( 9577 )
      It's easy to see why evolution is a hard concept for you to grasp. Trust me, after you go through it once, things will be much easier for you.
    • I love this comment

      People with open minds may want to avoid this book

    • I would probably not respond to the above and disregard it if it were not for a recent Sci Am article that showed ~40% of Americans believe in creationism over evolution.

      I cannot understand how seemingly intelligent people can ignore overwhelming scientific evidence. Evolution is the most widely accepted explanation for how we came to be. I do not see any inconsistencies with the Genesis *metaphor* for the creation of life. The Bible was written by humans, not God. They may have had divine inspiration, but it was not God's pen in the inkwell. Why do you think there are four "Gospel according to XXXX"?

      BTW, God is omniscient. Don't you think He can understand and use a metaphor?

      Of all the types of ignorance in the world, those perpetuated under the guise of religion are the most virulent and dangerous.
      • I cannot understand how seemingly intelligent people can ignore overwhelming scientific evidence.

        Because to most people Science is just as mysterious and magical as Religion.

        Millions of children are enrolled in Sunday school or full-time religious school learning "You are not supposed to understand this". Truth has nothing to do with logic or understanding. "Proof" of truth is not merely meaningless, but rejected as misleading.

        Another goal of religious training is rejecting competing religions. Science seems like just another religion to fight off - a bunch of ideas and beliefs that they don't expect to understand.

        -
    • > ... evolution bad ... moral = stupid ... etc...

      You must be a beginner. Flame bait has to be a little more subtle than that.

      ... blah blah blah ... abortion ... blah blah blah ... racism ... blah blah blah ... religion ... blah blah blah ... nazi ... blah blah blah ...

    • In saying that this work believes in the reality of evolution, you are saying that it works on the assumption that the current scientific orthodoxy is accurate. And believe me, as a biologist I can tell you that we are just as certain about evolution as rocket engineers are that their designs are propelled by fuel etc. rather than the will of God.

      Of course, 'believing' in the current scientific orthodoxy would be wrong too, in terms of having faith in it being 100% correct. Almost everybody who can call themselves a scientist would feel quite certain that they cannot be certain of its accuracy, and shouldn't try to be. Science works on scepticism and guesswork based on the data available, not faith.

    • Even ardent Creationists cannot deny the 'fact' of hereditary mutation, selection, and hence, evolution. Christians have manipulated the genetic stocks of plants and animals for centuries. This is EXACTLY the same thing that Genetic or Evolutionary Programming is doing.

      Creationists only take issue with the scientific theory that Darwinian evolution can explain ALL of the biological phenomena. They cannot deny that evolution exists and works. They have only made arguments that it works too slowly to explain everything. Thus, this warning is extremely misguided.
      • Even ardent Creationists cannot deny the 'fact' of hereditary mutation, selection, and hence, evolution... They cannot deny that evolution exists and works.

        Sure they can. They do it all the time, hehe.

        -
    • Re:Warning (Score:1, Insightful)

      The "moral community"? Gee, there's no unwarranted elitism in that, is there? From my life experiences, people who accept evolution as true are generally of a higher moral calibre than those who believe in the "theory" of creationism. Funny how that works...
    • I can't believe the numbers that bit on this. Very funny.
  • the review will answer the question... ? =D
  • Software modeled on biological systems, we are told, will, "detect crime for us, identify faults, ... design new products for us, create art, and compose music."

    It will also take out the trash, make your bed, screen calls from your annoying ex-girlfriend, make sure your milk is still good, tell you you're looking skinnier, and reprogram your TV to get all the good channels.

    • With a little bit of ingenuity and good old-fashioned coding you can already make a box screen calls from your ex (caller id), monitor expiry dates (simple database, possibly + barcode reader), and 'reprogram' your TV (TiVo?). Add to that a complicated mess of actuators and hydraulics (robotic arms) and it can easily make your bed and take out the trash. With a soundcard you can make the computer say anything you ask it to, and leaps and bounds have been made in the field of simulated AI chat programs.


  • At least to some programmers....

    Writing a living, breathing program would be the goal of many of us, not just AI programmers.
  • This is one of the courses [cs.vu.nl] I followed at the university during my artificial intelligence studies.
    Lots of examples from this book came back in our practicums. There are nice links [cs.vu.nl] to sites about this subject on the page, and the complete course is online for you to download (not sure if my professor is going to be happy about this, but who cares, I passed the grade :)
  • To the worst extent possible.

    My psych professor explained our language lecture using layman's computer terminology, instead of psychology. I wanted to strangle him the entire time. "So... you've got this memory stuff... and it get accessed - that is - processed, by this other bit over here, right, this area of the brain... let's just call that the "software"."

    It was enough to make any techie of any note sick. He actually used Microsoft as a language. Talk about wanting to shoot someone.

    But what can we do? Everyone thinks they're a programmer or a techie these days, and everyone thinks that because kids use IM they must have some other association with the grey box.

    Sorry fellers, that's wrong. Most kids today don't know jack about computing, much less are able to relate better when you babble incessantly about things in your half-tech, half-psychologist manner. Stick to the psychology or the biology, instead of using computer terms to explain simple concepts. It's just more confusing and more hellish. :(
    • My psych professor explained our language lecture using layman's computer terminology, instead of psychology.

      My understanding is that psychology has always chased the latest technology in its efforts to explain the mechanics of the brain. The brain has been compared to steam engines and grain mills, in their time (so I hear).

      The biggest irony is when psychologists describe the brain as a neural network (the kind that's been modeled in computers), because the origin of the idea for the neural network was the workings of neurons in the brain!

      For this reason, many people insist that computer neural networks should be called artificial neural networks. Indeed, the artificial neural network is an interesting mathematical algorithm that takes its inspiration from "real" neural networks. It was never meant to be a model of the human brain by any stretch of the imagination.

      You have to admit, though, the analogies are getting better. The brain is definitely more like a computer than a steam engine.

      • You have to admit, though, the analogies are getting better. The brain is definitely more like a computer than a steam engine.

        Okay, I'll bite. I don't think that's the case at all. Steam engines take in stored energy, release it, and move down a track. So do humans. The cells take in ATP, release it, and move down some track. We don't know much about how these turn into decisions about whom to marry, which beer to drink, or how to mix the two together, but we know that energy is going in, and decisions are coming out.
        A computer, on the other hand, is filled with logical gates that make straight-forward, well-defined decisions like AND, OR, or NOR. I hate to remind you, but there are many people that don't seem to have any connection with logic. They're really out to lunch. But they do take in energy and move down some track.
        So just for the sake of argument, I think that the computer metaphor is moving in the wrong direction. Your track isn't pointed in the right way. You took in that energy, but it's not helping us at all.
        • Far be it from me not to bite back. :-)

          Steam engines take in stored energy, release it, and move down a track. [...] The cells take in ATP, release it, and move down some track.

          Well, I think that's a rather superficial similarity, and it's not quite comparing apples to apples. If the brain converted sugars into mechanical energy and chugged its way along the spine, I would be more inclined to agree with you.

          A computer, on the other hand, is filled with logical gates that make straight-forward, well-defined decisions like AND, OR, or NOR. I hate to remind you, but there are many people that don't seem to have any connection with logic.

          Just because the base components are logic gates doesn't mean that the final output of the whole system needs to be logical. Why, there are probably huge numbers of people who would already describe computers as being unpredictable, irrational, and self-destructive... and this is when they weren't even programmed to be that way! (Okay, that's a half-joke.) (But only half.)

          So just for the sake of argument, I think that the computer metaphor is moving in the wrong direction. Your track isn't pointed in the right way. You took in that energy, but it's not helping us at all.

          I'll have to agree that the computer analogy doesn't help that much, and it's a point of argument whether it gives us any more insight into human psychology than a steam engine metaphor.

          However, I didn't say the analogy was good; I just said it's better. And I would maintain that the computer, which, after all, helps us to make decisions (with varying degrees of perceived and real effectiveness), is closer to the brain than a steam engine, whose purpose is not related to decision making.

      • Yes. DV is right. For instance, natural philosophers compared thought to a system of air pumps when air-pumps were new (can't find a reference). You can also see the constellation "Air-Pump" [dibonsmith.com] in the southern hemisphere as a result of the awe generated by this new technology.

        Later, about the time of Ben Franklin and Mary Shelley, they began to talk about thought as electricity. This really was a lot closer.

        After another few decades, your brain became an internal telegraph-and-railroad system, and then a telephone exchange; and that brings us up to the era of the Giant Computing Machines that have afflicted the analogies of non-techies for the last 50 years.
    • by Anonymous Coward on Monday March 11, 2002 @12:11PM (#3143211)
      As a neuroscientist and former CS major in college (and long time Slashdot reader) I can also assert that programmers abuse biology metaphors just as badly!

      I'm tired of the comparison of viruses to computer viruses, as well as DNA to computer code. Every time an article on neural/silicon interactions comes on - here come the stupid Neuromancer "jacking-in" references! Every time a genetic engineering article comes here, people whip out "Jurassic Park" and "Chaos theory" to explain why they don't consider GM a good idea!
      Mixing some computer and biological metaphors on a very BASE level has its uses, but people on both sides all too often become overly enamoured with these simple comparisons and forget the very REAL and often subtle differences that invalidate the metaphors.

      A lot of coders I've met need to learn as much REAL (not popular) biology as the biologists you are complaining about need to learn about computers! Basically - I thought your comment was more than a bit one-sided and somewhat condescending - knowing a lot about bits, pointers and registers doesn't make you any more qualified to mix metaphors than knowing a lot about neurons, genes, and molecules does!

      Sincerely,
      Kevin Christie
      Neuroscience Program
      University of Illinois at Urbana-Champaign
      crispiewm@hotmail.com
      • by Pentagram ( 40862 ) on Monday March 11, 2002 @12:55PM (#3143518) Homepage
        I'm tired of the comparison of viruses to computer viruses, as well as DNA to computer code.

        Excuse me? Surely both examples you give are excellent analogies of each other. Viruses parasitically use the machinery of their hosts to spread themselves... and so do computer viruses (well, worms at least.)

        And DNA is a digital series of instructions that are interpreted to express something... and so is computer code. Has anyone proved you can build a Turing machine in DNA yet? Admittedly, DNA is processed in a rather more analogue fashion than most computer code, but as an analogy, it's better than most; (for example) the old one about breaking into computer systems/breaking into a house.
        • On some superficial level computer virus and DNA code analogies are good, but the problem is when people take them too far. What makes these analogies bad is that you don't have to take them very far to go too far. The computer virus analogy, to have any value, would require that programs are like cells, for example. They're not, at least in the important ways.

          The DNA code analogy is a very, very dangerous one that is emerging. Perhaps the best comparison I can make is to the brain-computer analogy. There are very few things more damaging to understanding of the brain than this intuitively appealing piece of idiocy. The DNA code analogy is already starting to lead people astray -- I've seen papers from CS people making evolutionary arguments about coding in base 2 vs. base 4 (ACTG = 4 bases) and that kind of thing, but the arguments are completely irrelevant because they take one detail and ignore everything else of what's known about the biochemistry of nucleic acids and the molecular biology of cells.

          Bad analogies are dangerous because they poison the way people think. Unfortunately, since biological systems are invariably orders of magnitude more complex than the systems they're being compared to, it's usually the conceptual understanding of the biological issues that suffers and not the other way around.
        • DNA and computer code have one very important thing in common -- 90% of the code is junk!
        • Surely both examples you give are excellent analogies of each other. Viruses parasitically use the machinery of their hosts to spread themselves... and so do computer viruses (well, worms at least).

          Computer viruses do not physically dis-assemble the host computer (or its OS), chopping it up into pieces that are re-assembled to form new infectious computer viruses. A big difference.

          And DNA is a digital series of instructions that are interpreted to express something... and so is computer code.

          Digital series indeed!! DNA is more like a recipe as S.J.Gould and others never tire of pointing out (evidently for a good reason). If it were a program it would be the buggiest, crappiest program that had been maintained for years with different compatibility layers added to it. If it were a program it would be as though COBOL had been kept and had new libraries added to it, some of which worked sort-of, and others were completely b0rken.

          The point of this is that the analogies/metaphors/comparisons are not really useful beyond a simple level. Interesting analogies or metaphors are ones that reveal _unexpected_ details about the analogised subject. The code/DNA one does not. It is just two cool things lumped together with superficial similarity.

          • Digital series indeed!! DNA is more like a recipe as S.J.Gould and others never tire of pointing out (evidently for a good reason).

            I'm sorry, DNA can obviously be perceived as a digital sequence. There are four distinct states encoding the information. ("recipe", whatever) I hope it's clear that it's not analog at least.

            And as for recipe vs. program, they're the same thing! A sequence of instructions describing how to perform some action. Computer code is usually laid out in a more deliberate and structured form because the "operator" is so much simpler and more precise, but that doesn't really change the core nature of the thing. In the case of DNA, things are different yet again.

            Of course if you don't understand anything about a recipe, comparing recipes and computer code is useless as well. The analogy to computer code is valid, but not perfect. You can't really use it unless you know where it doesn't work. Just because you can over-extend it doesn't mean it's complete trash.

              I'm sorry, DNA can obviously be perceived as a digital sequence. There are four distinct states encoding the information. ("recipe", whatever) I hope it's clear that it's not analog at least.

              This statement is formally correct, but highly misleading. The whole digital vs. analog paradigm implies that DNA simply contains a signal as a function of position, but this is not the case. It's an actual molecule in the real world; the conformation of the molecule matters for important things like transcriptional regulation. This is more easily illustrated with protein sequences. They are also "digital," with twenty states instead of four. However, the behavior of protein molecules of known sequence is not ab initio predictable in practice for sequences of any useful length.
              • However, the behavior of protein molecules of known sequence is not ab initio predictable in practice for sequences of any useful length.

                Take a ~1 MB (source code) computer program, written in a language you don't understand. Try predicting what it will do without actually compiling and running it. Same problem...so I'm not surprised this is the case. OTOH, very small sequences can be predicted, it's just that the sizes which we can predict don't happen to be usably long.
              • the behavior of protein molecules of known sequence is not ab initio predictable in practice for sequences of any useful length.

                Exactly. Even if one is able to use a method like threading, or multiple-alignment to find similarities or shared motifs the behaviour of the protein depends upon the complement of other molecules in its vicinity. If anything the whole deal is much closer to an analog system than a digital one.

        • Actually, computer viruses are programs that were written by someone with intent. Real viruses and their ancestors evolved from nothing (read: non-organic matter) and are an intrinsic feature of life on this planet... unless a computer virus spontaneously organizes itself out of nowhere and spreads throughout the networks, the analogy really isn't that tight. My linux box at least isn't running around, writing its own code yet.
        • >Has anyone proved you can build a Turing machine in DNA yet?

          Depends on what you mean by 'building a Turing Machine in DNA'. Your question is rather analogous to saying 'has anyone proved you can build a Turing machine with SDRAM yet?'. DNA is a storage medium and requires external mechanisms to operate on it for computation.

          If you take DNA and the DNA-processing mechanisms from stichotrichous ciliates, then yes, you *can* build a Turing Machine. See:

          Reversible Molecular Computation in Ciliates. Kari, L., Kari, J., Landweber, L.F., _Jewels are Forever_ (Karhumaki, J. et al., eds.), Springer-Verlag, 1999, pp. 353-363

          That's particularly interesting because it's _in vivo_ computing, but there are also tens (probably hundreds) of proposals for _in vitro_ DNA computing. Do a google search on 'DNA Computing' and look for the proceedings of the International conferences on DNA-based Computers (I believe the one this summer in Japan is number 8)... IIRC they're published under a DIMACS series. If you want a 'canonical' paper for _in vitro_ methods, I guess starting at the beginning would be the best:

          Adleman, L., "Molecular Computation of Solutions to Combinatorial Problems," Science, Vol. 266, 11 November 1994, pp. 1021-1023.
      • [quote]
        As a neuroscientist and former CS major in college (and long time Slashdot reader) I can also assert that programmers abuse biology metaphors just as badly!
        [/quote]

        amen!

        i was very amused for example by the incorrect parallel drawn by linus & friends on the LKML lately, regarding linux development as an evolutionary process. linux development is directed by a (group of) person(s), which can hardly be compared to the way nature randomly applies selective pressure onto a living organism.

        i could take this argument further, but i don't want to be filtered out as a page-lengthening post ;-)

        nevertheless, the parallel was instrumental in the sense that it got the discussion going. and let's face it, this particular discussion even made it to slashdot, so it must have been important... ;p
      • Tired of the comparisons? You should be, for the very reasons you cite. There *are* comparisons, but they have been missed by almost everyone, biologists and programmers alike.

        DNA isn't code, because DNA doesn't *do* anything. DNA defines the total set of processes that can be expressed in a cell for an organism. This can be seen by the fact that the roughly 200 cell types in a human only express 10 percent or less of the available DNA.

        Thus DNA is the definitions available for a system. Just like that pile of installation disks that an MIS department holds for configuring a computer system.

        When DNA is expressed, it results in RNA (to really simplify the transcription process). RNA is the "program store" for a cell. RNA defines the processes that are active in a cell. This isn't at all unlike the programs you have installed on your computer system. What processes a cell will perform and its configuration is driven by the particular genes that have been expressed (copied into RNA) for that cell. These are selected in such a way that the cell can effectively perform its role.

        Yet actually carrying out these processes requires yet another step, the translation of RNA into the proteins that drive biological processes. Just like loading a program into memory and executing it is required to have something happen in a computer, building proteins is required to make a process occur in a cell.

        A program store serves a very useful purpose. Both in computer systems and biological cells, it allows a cell to be instantly ready to perform the tasks required of it. If another protein is needed, the RNA is ready and able to translate another copy. Or on a computer system, if you need to run that spreadsheet, that program is ready to load another copy into memory for your use.

        The Central Dogma of Molecular Biology states:

        DNA -> RNA -> Proteins

        More generally, this can be understood as:

        definition -> expression -> function

        What we need to learn to do in computer systems is understand how the architecture in biological systems results in self configuring distributed processes. We can do it the hard way (where we fight the mechanical metaphors such that every positive step to more configurable computer systems requires a non-intuitive step against the mechanical metaphor), or we can do it the easy way (where we recognize that biological systems have already *solved* the configuration and distribution problems).

        The really amazing thing to me about biological systems and computer systems is how long this simple observation has escaped everyone. The mixed-up metaphors (like Microsoft's DNA) are simply painful.
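
        To make the mapping concrete, here is a toy Python rendering of the three stages (the codon table is only a fragment, and the transcription step is the coding-strand shortcut rather than the real template-strand chemistry):

        # A fragment of the real codon table; the full table has 64 entries.
        CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

        def transcribe(gene):
            # Expression: copy one gene out of the full set of definitions (DNA -> mRNA).
            return gene.replace("T", "U")

        def translate(mrna):
            # Function: build a working protein from the program store (mRNA -> protein).
            protein = []
            for i in range(0, len(mrna) - 2, 3):
                amino = CODON_TABLE.get(mrna[i:i + 3], "???")
                if amino == "STOP":
                    break
                protein.append(amino)
            return protein

        gene = "ATGTTTGGCTAA"                # one definition selected for this cell
        print(translate(transcribe(gene)))   # ['Met', 'Phe', 'Gly']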

        --Paul
        http://groups.yahoo.com/group/SoftwareGenetics
    • Even worse, the metaphor begins to fall apart fairly rapidly the moment you poke deeper than the metaphor's skin.

      Metaphor is a difficult thing, because you are attempting to illustrate an independent reality using something that is familiar to the listener. The danger is that the listener may mistake the analogy for formal equivalency. I tend to open my lectures with bold metaphor (the brain is a swiss army knife...) to try to grab students' attention and give them a conceptual framework, and then often spend the rest of the lecture (term?) fighting this phantom notion of formal equivalency (no, a ligand is not a key, and a receptor is not a lock). Formal concepts are difficult to absorb; just think of metaphor as the vehicle that carries ideas (whoops, there I go again). Of course if your professor does not pick apart his/her own metaphor by the end of the term, then perhaps he/she is doing a disservice to the less engaged students.

      What I find really fascinating is how teaching and general discussion is limited by metaphor. When discussing things among colleagues there is little elaboration of many things because they grok the independent reality of a phenomenon as well as I do (probably better). The metaphors come out when you need to discuss/explain something to the less-expert. I don't think that anyone ever believed that working memory was a "scratch-pad" or that long-term memory was a "tape-recorder". While many of my colleagues might argue for the brain as universal-Turing-machine in the formal conception, I don't think any of us believe that the brain is a computer like the one on your desk (we are analog). All of us in our technical capacities just lack the language to express our ideas to the less technically adept.
    • Welcome to the hell that we physicists have been dealing with when philosophers start talking about the Heisenberg Uncertainty Principle. Trying to explain that it doesn't mean "nothing is true" is a bit like explaining the entire history of the East India company to a tea leaf.
  • > Scientists will love the lack of distraction, but casual readers looking for colorful anecdotes about the wacky geniuses in labcoats will need to look elsewhere.

    If you want that kind of thing, this book is amazing for presenting both sides (ie, the science & the people) of the stories:

    http://www.amazon.com/exec/obidos/ASIN/0671872346/qid=1015865367/sr=8-1/ref=sr_8_67_1/103-9949968-6391849

    It's called Complexity. It is a kind of answer to 'Chaos', and it has much info on the kind of biological software that the Santa Fe Institute crowd was working on a few years ago. A very highly recommended read.
  • by Bonker ( 243350 ) on Monday March 11, 2002 @12:00PM (#3143154)
    Humans have a tendency to cast biological, and even human, behaviors on anything that is outside their ken.

    Case in point. When I was helping my mother restore her computer after she was infected with Code Red, she was infuriated at the worm. While she is a computer professional, she is not a coder and has no understanding of... say... how machine code executes a loop or a goto. She talked about Code Red as if it really was a living thing despite the fact that she knew better. One of the things she said that stuck in my head was 'Why would it do that to me?'
    • by Anonymous Coward
      We do this to dogs, cats, and animals too. We think of them as little humans with little human feelings. I'm sure concepts like pain or hunger apply to many mammals, but I'm not so sure about heartache and suffering. My cat is sure fussy, but I'm not sure if bitchy is the right word. Maybe it's just a human concept?

      Can we get beyond this? I don't think so. We're humans, after all. We only know human things. Maybe licking your fur all day changes your perception of the world. Maybe sniffing butts changes the mind of dogs. Projecting human thoughts may not just be the best way to try to understand this, it may be the only way. (Well, save from licking yourself all day and sniffing butts.)

  • I get the author's point, but I don't think he makes it well...

    No one speaks of subroutines that cp themselves through undocumented remote procedure calls because talk of 'computer viruses' carries all of the portent and weight of polio, anthrax, German Measles and tuberculosis.

    Yeah, no-one speaks of the exact way all these illness viruses work either, since it's easy to abstract it out to a simple term, 'virus'.
  • IIRC it happened a year ago, upon the implementation of RFC 1149 [linux.no].
  • can anyone navigate the grammatical maze that is this sentence!?:
    ---No one speaks of subroutines that cp themselves through undocumented remote procedure calls because talk of 'computer viruses' carries all of the portent and weight of polio, anthrax, German Measles and tuberculosis.---
    • duh, brain fart! i get it now.... the thing that screwed me up was "...cp themselves through...", it didn't even occur to me that cp was the command for copy, so i was thinking of that as a noun, and the word "through" was just sort of hanging out there by itself then.... thanks
  • by westfirst ( 222247 ) on Monday March 11, 2002 @12:20PM (#3143262)
    http://www.digitalbiology.com/ [digitalbiology.com]

    Plenty of good stuff. Anyone have other good links?
    • This one [red3d.com] is my favorite. You can watch the flocking boids, and it explains flocking algorithms very clearly and easily.

      Plus, it's got links to about 50 other really interesting biological modeling and application sites.

  • by Yet Another Smith ( 42377 ) on Monday March 11, 2002 @12:21PM (#3143269)
    I'm a little worried that if this gets too 'faddy' people could start looking for biological metaphors and ignore other equally effective, or perhaps more effective, solutions.

    For example, from the review above:

    genetic algorithms may find patterns of credit card fraud and help us find better jet turbine blades

    The genetic algorithm is a great algorithm for optimization problems. However, it's not significantly more effective than the simulated annealing [sciam.com] algorithm or the less-known controlled random search [dl.ac.uk] algorithm.

    Each has its advantages and disadvantages, but getting too caught up in the metaphors these algorithms and techniques are based on will unnecessarily shackle your thinking. Of course, the opposite is also true. Refusing to embrace metaphors at all will leave you without the insights that we use metaphors to see, so don't take me too seriously :).
    • One comment that complements the previous post is that it is important to distinguish between computational biology and what the book is calling "digital biology." Computational biology is doing biology with computer models. "Digital biology" is doing CS using certain techniques inspired by biological systems. That does not make it biology. The book would be more appropriately titled "biological digitalism."
    • I'm familiar with genetic algorithms and simulated annealing, but not with controlled random search. I read the link you gave for it, but it didn't really say how it differed from a genetic algorithm. I spent a while searching on google but still couldn't find the difference.

      Could you post a better link, or explain the difference yourself? Thanx.

      -
      • sorry, that was the best link I found in a quick google search of my own :P It's not used as much as GA and SA. I'll try to explain, although it's hard without pictures.

        The CRS is based on the 'Nelder-Mead' simplex method. Here's a better description of that. [enst-bretagne.fr]

        It starts with n + 1 points where n is the number of parameters you're optimizing. That's 3 points forming a triangle in 2D space or 4 points forming a tetrahedron in 3D space (the space being the values you're optimizing). It's easiest to think of the 2D situation with the triangle.

        Each corner of the triangle is some set of parameters, each of which will have a different 'fitness'. The fitness is the value that you're trying to minimize. Evaluate the fitness at each vertex of the triangle. Take the 'least fit' vertex (the one with the largest value), and 'step the triangle downhill' by reflecting it through the midpoint between the other two points.

        This should create a reflected triangle closer to the fitness minimum that you are trying to find. Repeat until you get so close to the minimum that you're going in circles.

        Now, with the CRS, you use the simplex to take all your steps, but in this case you create a large pool of initial candidates at random, just like you do for the Genetic or Simulated Annealing algorithms. Then you create a new simplex by selecting n+1 elements from your initial pool. Step the simplex downhill, and see if your new value is better. If so, throw away the worst element of the initial population, and replace it with the new one. Then select a new simplex at random from your pool of candidates and repeat the procedure.

        This way, you're always producing random steps, so you can't easily get caught in a local minimum, and it's a pretty efficient solution. It works well with linear constraints, which seemed to be an advantage over GA and SA when I was working on this myself. I should put a disclaimer on here that I'm a geophysicist, not a computer scientist, so I may not use the lingo the same way your average /.er would.

        Hope that wasn't too confusing. I'm trying to write this without my boss knowing I'm not working >:)
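
        In rough Python the whole loop is short. This is a sketch of the procedure described above, from memory rather than from any paper, so the names and the in-bounds check are mine:

        import random

        def crs_minimize(f, bounds, pool_size=50, iterations=2000):
            n = len(bounds)
            # Large random pool of candidate parameter vectors.
            pool = [[random.uniform(lo, hi) for lo, hi in bounds]
                    for _ in range(pool_size)]
            for _ in range(iterations):
                # Pick n+1 pool members at random to form a simplex.
                simplex = sorted(random.sample(pool, n + 1), key=f)
                worst = simplex[-1]
                # Centroid of the n better vertices.
                centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
                # Step downhill: reflect the worst vertex through the centroid.
                trial = [2 * centroid[i] - worst[i] for i in range(n)]
                in_bounds = all(lo <= t <= hi for t, (lo, hi) in zip(trial, bounds))
                # If the new point beats the worst member of the whole pool, swap it in.
                if in_bounds:
                    pool_worst = max(pool, key=f)
                    if f(trial) < f(pool_worst):
                        pool.remove(pool_worst)
                        pool.append(trial)
            return min(pool, key=f)

        # Toy use: find the bottom of a 2-D bowl centered at (3, -1).
        print(crs_minimize(lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2,
                           bounds=[(-10, 10), (-10, 10)]))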
        • Ah, thanx for your explanation. With some thought I was able to picture exactly what you were trying to describe.

          It seems like it would only work on a well-behaved search space. SA and CRS are faster with some search spaces, but I still prefer GA. It can attack any problem that SA and CRS can, plus it works on ugly search spaces. In particular, CRS and SA seem almost useless for creating a program, one of the coolest tasks for GA.

          -
  • by Mannerism ( 188292 ) <keith-slashdotNO@SPAMspotsoftware.com> on Monday March 11, 2002 @12:59PM (#3143548)
    This area has always interested me because I did my undergraduate degree in molecular biology, and my professional career has been in software engineering.

    The first thing that strikes me when biology and computer science are brought together is that although we try to apply principles of the former to the latter, we really have a much firmer grasp of computer science than we do of biology. What we're really doing, I think, is taking some theories and concepts from biology -- evolution and immunology seem to be the big ones -- and adapting those theories to suit digital computers; we're not modelling life per se. It's important to remember, too, that although we can model evolutionary processes like variation and selection in a computer system and produce the anticipated results, we can't thereby prove that evolution applies to life. (I happen to believe that it does, but I have to admit that we have yet to irrefutably prove it). All we're doing is nicely illustrating the theory.

    Someone mentioned earlier that everyone claims to be some sort of computer expert these days, and that biologists and psychologists routinely misapply computer concepts. From my perspective, the reverse is true. There seems to be a misconception that biology is straightforward and well-understood, and I just don't know where that comes from. I'm sure I'm not the only biologist who grimaces when "virus" is used to describe software. And the most gaping errors in science fiction always seem to be ones of biology. Computer scientists use words like "genotype" and "phenotype", but genetic algorithms seem to me to be so far removed from the actual complexities of gene expression as to be at best distant cousins. It's more a matter of biology lending ideas and inspiration to computer science than it is some direct translation of life processes to software processes.
    • It's important to remember, too, that although we can model evolutionary processes like variation and selection in a computer system and produce the anticipated results, we can't thereby prove that evolution applies to life. (I happen to believe that it does, but I have to admit that we have yet to irrefutably prove it). All we're doing is nicely illustrating the theory.

      But "illustrating the theory" is really everything that ever can be done in science. In math (and CS is really applied math, despite the name) things can be proved. In science, theories can never be proved because it is always possible that someone tomorrow could perform an experiment disproving the theory.

      • I'm not so sure that you cannot prove things in biology as easily and heartily as you prove them in mathematics. Evolution is a truism. A truism in the sense of philosophy -- and math is just symbolic philosophy.

        Think about it: "Survival of the fittest." That means survival of those that are able to survive.

        You can't argue with this.

        • Think about it: "Survival of the fittest." That means survival of those that are able to survive.

          Actually, I revise this for my own personal reference as: "Destruction of the unfit", since the "Survival of the fittest" implies that ONLY the fittest will survive. Your general point, however, is still perfectly valid :-)

  • Anybody ever notice that the kernel is the core part of a UNIX system, similar to the kernel of an atom, and that the shell is the outer part of each? In each instance the shell is changeable, and also the 'interface', as it were, to the whole unit. Seems like this might be a good example of somebody getting cross-field terminology right.
  • I think the reviewer's points about differential equations are good ones. If you check out the link in the bio, one article on AIDS points to the mistakes you can make with choosing the wrong rules for your "metaphors." Differential equations, for instance, require you to be able to take derivatives of the functions and fit the derivatives to some equation. That's great if the functions have derivatives, but it can be misleading if they don't. In one example, an economist hell-bent on using differential equations decides that AIDS can be curtailed if we all have more sex. (I kid you not!).
    So are biological metaphors just as suspect? Perhaps. Digital evolution is cool, but I don't see why it is better than any of the other optimization techniques. If anything, the digital bio metaphor forces you to mimic creatures and all of their semi-monogamous, one-on-one reproduction. Equations don't have to conform to such a binary vision.
  • The fundamental reason why we cannot model biology with computers is that biological systems are chaotic. They respond to extremely small fluctuations no floating point processor can handle. In fact the human eye can respond to a single photon. See: http://math.ucr.edu/home/baez/physics/ParticleAndNuclear/see_a_photon.html

    Biological systems are sensitive at the quantum level and computers certainly cannot be.
    • Never say never. When you really get down to it, the only thing that allows organisms to register a single photon is that photon tripping some chromophore into a different conformation. So it's just one particle making an atom switch, thereby making an amino acid twist, thereby making an entire protein move. The subsequent amplification steps all rely on principles of signal transduction.

      So computers must be able to measure single photons, otherwise how did the physicists know that they were emitting a single photon? And to go from single-photon-detection to whole-organism-response only requires a long series of amplification cascades. Why is such a setup so hard to envision in a computer system?
      • Yes, and no. One could design a computer system (electro-optical system) to measure a single photon, and we (not me personally) have. The single photon example was simply an example of how sensitive biological systems are to just about everything. To reproduce them, the computer system would have to be as sensitive, and thus would have to be analog for a start. Secondly, any, repeat, any rounding off error would result in different behaviour. I am not saying that we cannot produce a system as complicated as a biological one in a computer, I'm saying we cannot replicate the biological one to any degree of accuracy.
        • Well, I'd personally say that point is arguable. Let's say we're making a system that is capable of responses as complicated as those exhibited by a cell, for example. It's really just a matter of adding in billions of responses. So we'd model a receptor-ligand system by saying "if stimulus X in Y amount then trigger Z." Things get really complicated when you stipulate that stimulus X in Y amount will ONLY trigger Z if your "receptor" is present in the right amount.

          When you really get down to it, most biological processes aren't analog. Instead, they're regulated by molecules that can take on a finite number of states. Granted, the number of molecules involved is fantastically large, and the number of states they can take is almost always more than 2 (especially since you have to take the effect of things like protein misfolding due to mutation into account).

          So yes, it's relatively simple (heh) to produce a computer-based system that's as complicated as a biological one. But to replicate a biological system we'd have to know every X molecule, and all of the resultant Z triggers that can result from Y concentration of X. Then, we'd have to already know how all of the different X molecules connect to each other (in ways as subtle as "you can't make any more X1 because all of the zinc was used to make X2").

          However, while we can't replicate biological systems (and probably never will be able to), we certainly can model them. This is much easier, since we interweave a bunch of different functions in an attempt to arrive at something that generally makes sense. Then try to model some situations where the result is already known. If your model matches reality in almost every case, then you've probably got a winner. Otherwise, Do Not Pass Go.
  • What's wrong with the liberal use of a metaphor here and there? The people we're talking about here (biologists and technologists) aren't idiots...they're highly trained and intelligent individuals. As such, most of them can tell when a metaphor is being taken too far.

    Let's say I'm trying to explain a concept in molecular biology to a computer scientist. Is it really so bad if I make an analogy connecting something the computer scientist already knows (programming, for example) and something he or she does not know (MAPK pathways, for example)? As long as the analogy holds up on the level that I explain it at, things should work fine.

    But because neither the computer scientist nor the biologist are stupid, they won't take the analogy too far. The computer scientist won't immediately think, "I bet obscure programming fact XXXX holds for this biological system he's explaining to me, because he just used programming language YYYY in his metaphor." This won't happen because the computer scientist is a rational person, who knows what a metaphor is and its probable limits.

    Yes, it's true that if everyone takes metaphors literally, then we'll run into problems. But the entire reason we can use metaphors for something useful is that we can also understand that a metaphor can break down at some point.

    I'll admit, I get pissed when popular culture misquotes some arcane (or even general) biological principle. However, that's a totally different thing than using some other subject as a metaphor. Without metaphors, those involved would have to learn these things from scratch, without drawing upon what one already understands. I think it's totally valid to dispense snippets of information through metaphor, since the alternative is working one's way up from ground zero without using metaphor. And that's way too much to ask, considering in biology it takes a PhD for anyone to consider you above zero level.
  • Folks interested in the book might also be interested in a letter to the editor published in the latest issue of the Centers for Disease Control and Prevention (CDC) journal Emerging Infectious Diseases. The journal is a scholarly source about biological diseases. The letter, Contagion on the Internet [cdc.gov], compares the biology and evolution of biological viruses to computer viruses.
  • Two of the four diseases the author mentions, ie anthrax and tuberculosis, are caused by bacteria and not viruses.
  • talk of 'computer viruses' carries all of the portent and weight of polio, anthrax, German Measles and tuberculosis. Invoking these mysterious and deadly images is more colorful than tech speak, even if most of the so-called viruses are closer to the common cold than the black plague.

    viruses named:
    polio
    common cold
    German Measles

    bacteria passed off as viruses:
    anthrax
    tuberculosis
    black plague

    *sigh*.
