Convergence of Biology and Computers?
Pankaj Arora asks: "This summer I am working on both Bioinformatics and Molecular Biology research projects at the Mayo Clinic Rochester. Being an MIS major with a heavy CS background, I've been learning about biochemistry, performing polymerase chain reactions (PCRs) and RNA retranslation, among other things. I've learned biology works a lot like computers; binary has 1s and 0s, DNA has nucleotides: A, T, C, and G. Binary has 8 bits to a byte, DNA has 3 nucleotides to a codon. Computers and biology seem to have a natural fit; information is encoded and represented 'digitally' in a sense. I was wondering what people thought about the future of biology-based and genetics-based computing, given the immense efficiencies that lie in nature. This has been discussed to an extent here, but there were some specific aspects that I feel are quite important and were not discussed thoroughly; thus I have a few questions to pose to the Slashdot community."
"The aspects I would like discussed are as follows:
- In the long run, will biology rewrite computing or will modern day technology concepts and theory be integrated into biology? If both are true, which will have the greater effect? I understand long run is ambiguous in this question, but I'm interested in all thoughts using any applicable definition.
- Tied to the first question: How will the nature of computing, and how we perceive it, change due to biology integration? More to the point, how much of the theory we learn today may change?
- What will be the biggest issue determining the success of the adoption of biology-integrated computing? Will it be technology factors or will it be societal factors (e.g., rebellion by the Right Wing), or something else? What things must hold true to make the idea succeed?
- And perhaps the hottest issue of all: Is there anything inherently wrong with pursuing this avenue? What may be some of the consequences?
On point one, a happy medium. (Score:3, Insightful)
What I say (Score:3, Funny)
Modern day technology concepts and biology will both one day become so advanced that they are... indistinguishable.
Re:What I say (Score:2, Insightful)
Yes, it will be called _Physics_.
Speaking as a cyborg (Score:5, Interesting)
Re:Speaking as a cyborg (Score:2, Interesting)
Re:Speaking as a cyborg (Score:2, Informative)
Re:Speaking as a cyborg (Score:2, Interesting)
Re:Speaking as a cyborg (Score:2)
And then there are the 'active' ones, like PhotoGray lenses, which change light transmission depending on ambient conditions.
Basically, they are an enhancement/replacement for the lenses in your eye.
equation (Score:5, Interesting)
binary + DNA = phi
(try and figure that one out)
Re:equation (Score:2, Funny)
binary + DNA = pi
Somehow, I envisioned a future where we all took electric drills to our heads.
Re:equation (Score:2)
Re:equation (Score:5, Informative)
Re:equation (Score:3, Interesting)
If binary makes so much sense for representing information and doing useful work with it, why is it that the fundamental building block in our body uses four bases?
Re:equation (Score:5, Informative)
Binary isn't THAT efficient if you want to store information in a small space. Quaternary systems (like DNA) are more efficient space-wise.
(Simplifying wildly) DNA stores three-base 'words' called codons. Each codon either codes for an amino acid (most amino acids are coded by more than one codon) - such as the sequence ATG, which codes for the amino acid methionine - or represents a 'start gene' or 'stop gene' switch.
With three-letter sequences for a codon and four possible letters for each position you end up with 64 possible codons (there are just 20 amino acids); to store the same amount in binary you would need six bits.
So DNA is actually very efficient at what it does.
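If you want to convince yourself of the arithmetic, here's a throwaway Python sketch (nothing biological about it, just counting; the 64, 6 and 20 are the same numbers as above):

    import math
    from itertools import product

    BASES = "ACGT"

    # All possible three-base codons: 4^3 = 64.
    codons = ["".join(c) for c in product(BASES, repeat=3)]
    print(len(codons))                        # 64

    # Bits needed to distinguish 64 codons if you encoded them in binary.
    print(math.ceil(math.log2(len(codons))))  # 6

    # Bits needed to distinguish the 20 standard amino acids.
    print(math.ceil(math.log2(20)))           # 5 -- so the codon code is redundant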
Best wishes,
Mike.
Speed (Score:4, Interesting)
Re:Speed (Score:5, Interesting)
So what biological computing has to offer in speed is basically countered by the difficulty in gaining access to the information, unless MAJOR advances are made. And for simple math-type computing problems, biological processing would probably never catch up to what we have now in electronic computers.
Just my 2 bits worth.
DNA Computers (Score:4, Informative)
Simple Guide to DNA Computers [dna2z.com]
How Stuff Works - DNA Computers [howstuffworks.com]
No ground-breaking crypto solving or Beowulfs yet, but some solid calculations going on.
Re:Speed (Score:2)
Gain speed, lose space (Score:4, Informative)
It was Leonard Adleman (of RSA fame) who first proposed the idea of using DNA to perform simple computations in a 1994 paper entitled "Molecular computation of solutions to combinatorial problems" (you can find it here [usc.edu]).
Adleman's DNA computer computed the answer to the Hamiltonian Path problem for a small graph. The Hamiltonian Path problem is solvable on a conventional computer; however, it is an "NP-complete" problem, which means that all known deterministic algorithms have a running time which is exponential with respect to the problem size.
Adleman's solution was to encode random paths through the graph in billions of DNA strands, then use custom-engineered enzymes to eliminate those strands that were not a Hamiltonian path. Essentially, he simulated a non-deterministic machine through massive parallelism.
While this is incredibly clever, and very interesting, it isn't necessarily practical; at least, not for NP-complete problems. Adleman achieved linear execution time for an NP-complete problem, but he did so at the expense of requiring an exponential number of DNA strands with respect to his problem size. A small graph with only a few hundred nodes would require more strands of DNA than there are atoms in the universe.
This is not to say that DNA computers are of purely academic interest; Adleman's computer was merely a "proof of concept". I'm sure there exist problems in P which would benefit immensely from massively parallel computing. It's just a question of finding problems which are cost-effective to implement.
Perhaps many of these "distributed" computing efforts that are underway now would better be served by a DNA computer.
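If you want to see the generate-and-filter idea without any wet chemistry, here's a toy Python sketch of the same strategy on a made-up five-node graph (random sampling stands in, very weakly, for the billions of strands; the chemistry, not the algorithm, is the interesting part):

    import random

    # Hypothetical directed graph; Adleman's real instance had seven nodes.
    edges = {(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (1, 3), (2, 4)}
    n, start, end = 5, 0, 4

    def random_strand():
        # Analogue of a random DNA strand: the start node plus n-1 random steps.
        return [start] + [random.randrange(n) for _ in range(n - 1)]

    def survives_filtering(path):
        # The 'enzyme' steps: right endpoints, every hop is an edge, each node once.
        return (path[0] == start and path[-1] == end
                and all((a, b) in edges for a, b in zip(path, path[1:]))
                and len(set(path)) == n)

    hits = [p for p in (random_strand() for _ in range(200000)) if survives_filtering(p)]
    print(hits[0] if hits else "no Hamiltonian path found in this sample")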
GEB chats all about the overlaps (Score:5, Interesting)
Re:GEB chats all about the overlaps (Score:2)
Re:GEB chats all about the overlaps (Score:4, Funny)
Re:GEB chats all about the overlaps (Score:2)
Wow ! (Score:5, Funny)
You must be new around here...
45 of 245 reporting in (Score:3, Funny)
If we can finally assimilate that pesky planet at sector 001, then we will consider ourselves to be a success.
Re:45 of 245 reporting in (Score:3, Funny)
I didn't know that Star Trek was translated into 1337 until I saw this. I looked it up, and it seems that the whole Star Trek series was originally written in 1337, and only later translated. Here is an example:
The scene: on the bridge of the Enterprise as they battle the Borg
Data: OMG!! their firing on us!!!!11
Picard: w0rf, AWP they're ass
Worf: OMG!!! thye h4v3 a sheild hack...3y3 can't get through
Picard: i thought they fixed that in teh last patch.
Crusher: we must
Already there (Score:5, Informative)
Well anyways, the travelling salesman problem was solved using specially crafted DNA sequences.
The problem with DNA computing (Score:3, Interesting)
It's still a practical application, despite its triviality.
Yes, maybe a travelling salesperson problem with something on the order of a million possible answers would be solvable using DNA. Right now, it's probably 100 times more capable (speed- and memory-wise) than our conventional computers.
However, DNA doesn't get any smaller or more efficient. It simply cannot advance. As problems get more complex the margin of error gets too large to ignore, and reactions take too long. In the long run (10-
The convergence has already happened (Score:4, Funny)
Mmmm.. (Score:4, Funny)
I got a big codon while I was reading the linux kernel source.
Two distinct fields (Score:5, Insightful)
There's using computing to forward and analyze biological questions; that's one field (and the one I'm in).
The other is using biology to build things like nanotech and other molecular circuitry.
Both of these are using one as a tool to forward the other; it's not a straight integration like putting chocolate and peanut butter together, and never will be.
Each field will simply adapt and use tools from other fields, just as in molecular biology, where physics and chemistry concepts are used to help understand biological mechanisms. Don't look for a Unifying Theory for all these fields.
Anyhow, that's my opinion; my boss will probably say I'm completely wrong.
Re:Two distinct fields (Score:2)
Re:Two distinct fields (Score:5, Insightful)
Both of these are using one as a tool to forward the other; it's not a straight integration like putting chocolate and peanut butter together, and never will be.
Each field will simply adapt and use tools from other fields, just as in molecular biology, where physics and chemistry concepts are used to help understand biological mechanisms. Don't look for a Unifying Theory for all these fields.
That seems rather short-sighted... never is an awfully long time.
Maybe not a Unifying Theory, but a blurring of the lines until they no longer exist? I think so.
For instance, one current bleeding edge, analyzing genomes, hasn't yet resulted in a lot of building of completely new gene structures, because we don't understand them very well, and because our tools for assembling genes and creating new organisms based on our created genes are still very crude.
Jump forward 50 years (or 150 years, or a thousand years!), and I'm willing to bet that won't be the case any more.
At that point, I think you will see a complete meshing of information technology and biology.
And certainly the two issues you mention (analyzing vs. building) will have long since integrated into one... much like, long ago, the study of the phenomenon of electricity (think flying kites in thunderstorms) integrated into the building of useful devices using electricity. There is no need to think of them as two separate issues.
digital media (Score:2, Insightful)
This forces some processes to be essentially digital, but most of biology is an unbelievably messy analogue nightmare for anybody trying to figure out what's going on.
Re:digital media (Score:3, Informative)
Mutation is the grist for the mill of natural selection. Were it not for mutation, Earth would be a swamp of highly advanced algae right now. What you want is a balance between mutation and error correction - enough correction so that organisms can survive and breed, but enough mutation so that you can have variation that will allow adaptation to new niches.
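You can watch that balance in a completely artificial setting with a toy simulation like the Python sketch below (made-up bitstring 'genomes', made-up fitness, arbitrary rates; it only illustrates the qualitative point that zero mutation can't adapt and heavy mutation can't hold on to anything):

    import random

    TARGET = [1] * 20    # a hypothetical new niche: fitness = number of matching bits

    def evolve(mutation_rate, generations=200, pop_size=50):
        def fitness(g):
            return sum(a == b for a, b in zip(g, TARGET))
        pop = [[0] * 20 for _ in range(pop_size)]     # population starts far from the niche
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)       # selection: the best matchers breed
            survivors = pop[: pop_size // 2]
            # reproduction with imperfect copying: each bit flips with prob = mutation_rate
            pop = [[(1 - b) if random.random() < mutation_rate else b for b in parent]
                   for parent in survivors for _ in (0, 1)]
        return max(fitness(g) for g in pop)

    for rate in (0.0, 0.01, 0.5):
        print(rate, evolve(rate))   # no mutation stays stuck; moderate adapts; heavy is noise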
Re:digital media (Score:2, Informative)
DNA encoding (Score:2, Informative)
It certainly forces humans to think of them as essentially digital; thus the digital model of the output of a couple of million years of iterating one or more biological learning algorithms.
We didn't understand the evolved FPGA pattern implementing an XOR either, although we think of it as a bit patt
Jumping the gun here, buddy (Score:5, Interesting)
So how can we restructure our current computing system to a model that is based upon something that we understand only at a basic level? We can't. While I agree that a biologically-derived computing architecture could be quite powerful indeed, we are still a LONG way off from the level of understanding needed to even put this idea on the drawing board.
Question tho... (Score:3, Informative)
The first things that come to mind are, "What time frames are you speaking in for this technology?" and "What application are you talking about?" Each of these is very important.
If you are talking raw number crunching, it might end up having some problems with competition from rival technologies. The High Productivity Computing Systems effort @ DARPA is intended to bridge the gap between current supercomputers and quantum computers in capability. If the realistic expectations for quantum computers are realized, and not the hype, then it might end up making the biological tech a case of an 'also-ran', much like gallium arsenide seems to have become. Unless there is something that biotech processors do better than the traditional architectures and the projected quantums, it might remain a lab curiosity.
On the other hand, if you mean something else, like revolutionary computer-human interfaces, or AI work, or something I'm not thinking of, then we might see something generated from this indeed.
If you could be more specific about what you have intended this technology applied to...
Michael Caudy on biology... (Score:5, Interesting)
Blue Goop Of Death (Score:2, Funny)
"I'm sorry, your DNA has just crashed. You're experiencing the blue goop of death."
Of course, all the geeks would run their DNA on Linux. They'd be capable of doing many things faster, they'd live forever compared to their Microsoft brethren, and the vast majority of society would never, ever, want to interact with them. So no change there then.
Re:Blue Goop Of Death (Score:2)
> Linux. They'd be capable of doing many things
> faster, they'd live forever compared to their
> Microsoft brethren and the vast majority of
> society would never, ever, want to interact with
> them. So no change there then.
Except that the Linux geeks would forget everything they learned as soon as they went to sleep. To get around this problem, they'd start keeping "Journals" that they'd have to read before they started their day. Eventually
Re:Blue Goop Of Death (Score:2)
People running their DNA from Microsoft will experience periodic downtime, intentional or not, and will occasionally have to have a friend or relative come by, wipe everything clean and replace it with a new copy.
People running their Linux brand DNA would have much less downtime than those using Microsoft's brand; however, they would have to go extinct and re-evolve from the primordial soup every so of
An answer from a different perspective (Score:5, Interesting)
Re:An answer from a different perspective (Score:2)
Blazingly fast? WTF? I think you meant to say that silicon conduction is blazingly fast.
Nerve impulses can be measured in tens or hundreds of miles per hour; pulses over wire or silicon are measured in tens of thousands of miles per second.
This page [washington.edu] is aimed at kids but happens to have a good chart of various speeds of various nerves; the top speed they show is about 225 mph, and they compare
computational ecology and techniques (Score:5, Interesting)
I don't think biology will rewrite CS. It will influence it, for sure, but there isn't anything fundamentally different between a biological solution and a technological one. I think as we learn more of the bigger picture in various biological fields, when we truly understand it, we will integrate that knowledge into applied CS. We've been reading the book for some time now, but we really don't know enough about the subject matter to really apply it.
I think there is a lot of use for biomimicry in computing. I think integration of biological elements into our computers is quite far off, and perhaps a bit sci-fi-ish for now, but taking ideas ('algorithm' would often be an understatement) that work well in biological systems and using them in computing is something we can do now with some success.
Some thoughts (Score:5, Interesting)
Re:Some thoughts (Score:2)
Re:Some thoughts (Score:2)
Binary, Genetics, and the ARMY (Score:5, Funny)
The ARMY has live soldiers and dead soldiers
Binary has 8 bits to a byte, DNA has 3 nucleotides to a codon.
The ARMY has 8 to 10 soldiers to a squad.
Computers and biology seem to have a natural fit;
The ARMY also seems to fit the computer model using the same criteria. Does that make it a computer?
models of biocomputing (Score:2, Insightful)
IANACS (I'm a biologist), but most of today's models of computing still go back to Turing. New models of computing will most likely spring up in the future that are based on the function of biomolecules, metabolic pathways, et al. Work has already been done showing how gene regulation is analogous to Boolean expressi
Biology rewrites computing...the ultimate monitor (Score:2, Interesting)
Everything in technology attempts to mimic something natural; take your monitor, for example. It is a visual representation of the world and the information therein.
Predicting Technology (Score:5, Insightful)
Technology is not morally or societally neutral, despite what we would like to think. A very simple example of this is the car: cars, in order to maximize their utility, require a vast network of roads, parking spaces, and gas stations. This network is expensive to society for environmental reasons and has definite social and economic effects (such as time lost in commuting and traffic jams). These are unavoidable if we wish to use the technology of cars.
I have an essay in progress on this topic: The Analysis of Technologies [berteig.org] - it's got some stuff that is quite out of date, since I started working on the essay eight years ago :-)
A really great book on the subject of analyzing the future effects of technology is "In the Absence of the Sacred" by Jerry Mander. This book is very much slanted politically to the "small/simple is beautiful" outlook, but provides a very substantial wealth of logical arguments and academic studies to demonstrate some of the necessary principles of analyzing technologies.
As for your specific questions, one obvious effect will be that in our commercial environment, not everyone will have equal access to the benefits that may be provided by the integration of computational and biological technologies.
Since it will not be genetic engineering in the "traditional" sense, this technology may be used as a backdoor for creating designer babies without actually modifying a zygote's genetic material.
Obligatory Comment? (Score:4, Funny)
This is old news... (Score:3, Funny)
Tech isn't there (Score:5, Insightful)
Of course, there is a downside. Massive parallelism means that programming will become orders of magnitude more difficult. People today can barely wrap their heads around out-of-order instructions and code that works well with superscalar architectures. What happens when we increase this complexity a millionfold?! I'm thinking that bio computing could produce some rather interesting advances in the way we communicate with and program computers.
Bioinformatics (Score:2)
To answer part of your question, there are many parallels between biology and computers; however, some biological systems are much more complex and can only be modeled to a limited extent right now. Some systems are more easily examined in terms of circuitry, but we are still only halfway to knowing
Some applications already being considered (Score:2)
Could be really bad... (Score:2)
course the pr0n industry will love the crossover applications...
Two scales of the same issue (Score:2)
Mask? (Score:2)
My computer is ill! (Score:2, Interesting)
Well, lifeforms have certain weaknesses that rocks and electrons alone do not. Among them are:
-A lifespan
-Virus vulnerability (no pun intended)
-Nutrition requirements (your typical cell needs things that are harder to mass-transport than electrons; water comes to mind)
"Junk DNA" == Data stashes? (Score:5, Insightful)
The first hour or two of disassembling was figuring out where the code was, and where the data was.
The next day or so of poring over those printouts was spent mapping out where the entry/exit points for subroutines were.
I got to the point where I could guess where game graphics were, just by looking for oddly repeating patterns in the "data" areas. (Yup, in binary, those 8-byte sequences make up the bitmaps for the characters "A", "B", "C"...")
"Oh, XX AA XX BB XX CC, somewhere near XXDD in memory space. Must be a list of pointers to something."
"Oh, XXAA, XXBB and XXCC all start with the same byte, and that byte is XXAA minus XXBB (or XXBB minus XXCC). Now I know how big each element in the structure is."
And so on. The first day or two of hacking would result in me figuring out about 10% of what the data was for.
The other 90% was the hard part, typically requiring running some code, poking at the data, and running the code again to see what changed. "Maze wall moved here, then things crashed when I tried to walk through it."
Sure, 99% of our genome might be junk. There were plenty of areas of address space that contained "data" that was never accessed, even with the tight code written in the 8-bit days.
But when I found a string of bytes I didn't understand, the working assumption that usually went better for me was that "I don't know what this stuff does", not "these bytes are random".
I'll bet that 90% of the genome is never executed nor referenced as data. (Evolution's a messier programmer, and there's 4.5 billion years of cruft!) But I'll bet that a lot of that "junk" is just code we haven't reverse-engineered yet.
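If I were doing this today I'd script that kind of poking. Here's a totally hypothetical Python sketch of one such heuristic: scan a byte blob for runs of big-endian 16-bit values that share a high byte, which is the classic look of a pointer table in an 8-bit-era binary.

    def find_pointer_tables(blob, min_run=4):
        # Hypothetical helper, not tied to any particular ROM format: collect runs
        # of 16-bit big-endian values whose high bytes all match.
        runs, i = [], 0
        while i + 2 * min_run <= len(blob):
            hi, n = blob[i], 0
            while i + 2 * (n + 1) <= len(blob) and blob[i + 2 * n] == hi:
                n += 1
            if n >= min_run:
                runs.append((i, [int.from_bytes(blob[i + 2 * k:i + 2 * k + 2], "big")
                                 for k in range(n)]))
                i += 2 * n
            else:
                i += 1
        return runs

    # e.g. 20AA 20BB 20CC 20DD looks like four pointers into page 0x20
    print(find_pointer_tables(bytes([0x20, 0xAA, 0x20, 0xBB, 0x20, 0xCC, 0x20, 0xDD])))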
Ramblings over - to the poster, all of the ideas in this post are probably ancient history (and poorly-written at that - you can tell I have no bio background), but it's nice to see I'm not completely off my rocker.
I went the CS route because when they taught biology in high school, it was seen as preparation for "become a doctor". Nothing wrong with doctors, but I was more interested in hacking, and figured it would be a long time, if ever, before we could manipulate DNA the way I could manipulate bits on a machine. (I've been pleasantly surprised with the way things turned out, though! :)
CS grads are a dime a dozen in the job market; I like my job, but career-wise, the field's been played out. If you're about to go into college, and especially if you like to reverse-engineer stuff "because it's fun", get into bioinformatics, computational biology, and do your CS as a minor. At least, that's what I'd do if I were gonna start over.
Re:"Junk DNA" == Data stashes? (Score:2)
Just like women, 90% of the places I poke at cause me to crash face first into the ground. Amazing!
Re:"Junk DNA" == Data stashes? (Score:4, Informative)
The equivalent in computer science would be if you plotted every possible route through a program and some code was still never conceivably executed; that would be the equivalent of "junk DNA". Even if you went into the machine language code and replaced it with random values, the program would still never crash, because that code never executes.
In the computer world we tend to call that "dead code".
Thus, we do know that the "junk" is truly junk. The debate on its usefulness centers around the other physical implications of the existence of such DNA, and where it might have come from, but "computationally" (in biological terms, "is it ever used to produce a protein?") it is indeed junk.
Please consult any elementary (but up to date... the understanding of junk DNA has progressed a lot in the last decade) textbook on genetics.
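The dead-code analogy is easy to make concrete; here's a minimal Python sketch of "dead code" as unreachability on a call graph (the graph is entirely made up, and real genome annotation obviously works nothing like this):

    # Toy call graph: which functions call which (entirely invented).
    calls = {
        "main":       ["parse", "run"],
        "parse":      ["lex"],
        "run":        ["step"],
        "lex":        [],
        "step":       [],
        "legacy":     ["old_helper"],   # never reached from main
        "old_helper": [],
    }

    def reachable(entry, graph):
        # Everything transitively callable from the entry point.
        seen, stack = set(), [entry]
        while stack:
            f = stack.pop()
            if f not in seen:
                seen.add(f)
                stack.extend(graph[f])
        return seen

    live = reachable("main", calls)
    dead = set(calls) - live
    print(sorted(dead))   # ['legacy', 'old_helper'] -- the "junk" by this definition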
Re:"Junk DNA" == Data stashes? (Score:2)
Cool. I didn't know we had that level of ability when it came to profiling DNA code. (In fact, I almost exposed more of my biotech non-cluefulness by saying "It'd be way cool if someone wrote a DNA profiler, so that we knew what chunks of code were/weren't accessed during a typical organism's lifespan".)
I retract my "Junk ain't necessarily junk" statement.
I also agree that the other interesting thing is
Wisdom is understanding (Score:3, Insightful)
I agree with the parent post. We don't actually know what this "junk" is for. In my own projects, I have certain things turned off and on by dummy variables and seemingly unnecessary if-then statements. You could very well go into my stuff and say, "Ten percent of this stuff is unnecessary. You could erase it or replace it and nothing will go wrong."
And you would be right. Except that you killed off some functions not meant for today. You killed
Re:Wisdom is understanding (Score:3, Interesting)
Nothing you say contradicts that.
Also, don't forget Nature is not purposeful. Putting useful stuff in the junk is not useful, because you're no more likely to mutate such that the formerly useful code is expressed than you are to mutate such
Re:"Junk DNA" == Data stashes? (Score:2)
Re:"Junk DNA" == Data stashes? (Score:2)
Answers to questions (Score:4, Interesting)
Biology will (extremely slowly) be integrated into modern day technology. There will be some technology ---> biology transition too. However, biology is far more adaptable. It's not a case of rewriting - it's just a case of historical progression.
In answer to your second question - technology concepts, computing, etc., as they're designed by biology, are already in mainstream use, e.g.:
computer
phone
automobile etc
Biology affecting technology has had less of an effect (e.g. Velcro); however, the balance will change over the next few decades. Biotech is already advancing in great strides.
There isn't any definition as such - predicting the future is all guesswork. You can use statistics - all kinds of methods - in the end it comes down to a gut reaction.
Q. How will the nature of computing, and how we perceive it, change due to biology integration?
It'll become easier for biology to use, e.g.:
handwriting recognition
voice recognition
etc etc etc (all fifth-generation tasks - read up on sixth-generation if you like)
This is due to technology "evolving" to become more like biology, though. The change'll happen too slowly to perceive.
Q. More to the point, how much of the theory we learn today may change?
The fundamentals still remain the same - like mathematics though - it just gets more complicated.
Q. What will be the biggest issue determining the success of the adoption of biology-integrated computing?
Economics. When computers cost millions of dollars only governments and large organisations could afford them. The second problem is marketing (read persuading people they need them). It'd take years though - look at the computer mouse as an example.
Q. Will it be technology factors or will it be societal factors (e.g., rebellion by the Right Wing), or something else?
It'll just happen - although factors will influence how slowly/ quickly certain parts of it do. Technology in the end comes down to ideas + money.
Q. What things must hold true to make the idea succeed?
That we can understand biology & manipulate it to serve us (probably other things too).
Q. And perhaps the hottest issue of all: Is there anything inherently wrong with pursuing this avenue?
Not in my opinion - although all technological advances bring ethical dilemmas - who do you sell it to, etc.? What (out of many) uses do you put it to?
Q. What may be some of the consequences?
A lot of them have already happened or are in the process of happening.
A society that suffers from greater obesity, global communication, increasing reliance on power production etc etc etc
Related reading (Score:4, Informative)
I/O Speed Critical... (Score:3, Interesting)
I have to think that both technologies will come to a point where they can't advance without the other, at least in the medium-term. We know (or think we know) that silicon will reach barriers it can't overcome. And at this point, we don't have a way to create complex biological computers without using existing complex organisms and therefore shooting ourselves in the foot politically. Before real-world interfaces to biological computers can be developed, we need an efficient way to interface with the biology at a low level. Traditional computers will have to provide this for us.
We may even see a true, permanent mesh of the technologies. Silicon is extremely good at some things (communications; providing an interface to mechanical items -- keyboards & mice, monitors, speakers, solar panels, servos, etc.), while it's hard to imagine really good natural language processing, learning, and nonlinear problem solving, much less a modicum of emotion to enhance usability, occurring without biology.
Who knows? My prediction is as follows:
Just a little fantastic speculation...
--Jasin Natael
Cells aren't simply complex computers (Score:4, Interesting)
I think that biology will push computing into interesting directions, not through application of any biological principles we discover, but through the demands of biological investigation. Biological systems are too interconnected to be adapted to building software or computers. I take that back: the details of biological systems are too interconnected to be adapted to building software or computers, but the gross principles (e.g. the immune system: T-cells, B-cells, etc.) will be increasingly copied in software and computer design.
I believe that eventually we will be able to write complex organisms from scratch. These may not be as robust as what nature produces, but they will be useful to us in many fields, starting with the medical and spreading through the agricultural and even industrial areas. I dream of trees which produce a sap which is easily refined into methane or natural gas. But it's going to take much longer than most people seem to think.
Your mistake & my Thesis paper (Score:3, Interesting)
As a medical student with an undergraduate degree in Mathematics, I'm really pleased to see that other scientists are getting excited about the convergence of Mathematics/Computation and molecular genetics.
First let me correct the slight error in your Ask Slashdot submission: we say that there are three nucleotide bases in an mRNA codon (not DNA codon). If you want a review of how DNA becomes RNA becomes proteins, you can check out the intro to my undergraduate thesis paper (link below).
In fact, I would encourage you to read through my paper in any case, as it may stimulate your thinking or open you up to new areas of bioinformatics research. The paper focuses mostly on a survey of analytic techniques of gene-expression microarrays, but is highly accessible to well-read / intelligent persons (it is light on technical mathematics by design).
Please let me know what you think of it (my email address should be easily inferrable from my website address), and you get a high-five from me if you can find the glaring mathematical error that I didn't get fixed before my defense.
http://blachly.net/james/documents/thesis.html [blachly.net]
The best,
James
To hell with the luddites. Hack the genome. (Score:3, Interesting)
To hell with the luddites. Hack the genome.
With apologies to Steven Levy:
1) Access to the genome, and anything which might teach you something about the way life works, should be unlimited and total. Always yield to the Hands-On Imperative.
2) All information should be free.
3) Mistrust authority - promote decentralization.
4) Hackers should be judged by their Hacking, not bogus criteria such as degrees, age, race, or position.
5) You can create art, beauty and even life by hacking DNA.
6) Genetic hacking can change your life for the better.
here's some answers... (Score:2, Interesting)
--In the long run, will biology rewrite computing or will modern day technology concepts and theory be integrated into biology? If both are true, which will have the greater effect? I understand long run is ambiguous in this question, but I'm
DNA computing and bioinformatics (Score:4, Informative)
I myself am in the field of bioinformatics/molecular biology [uwp.edu], with my primary interest being RNA regulation and regulatory elements. I am trying to find and figure out how RNA regulation works in model systems.
Now for your questions...
>In the long run, will biology rewrite computing or will modern day technology concepts and theory be integrated into biology?
Both will happen...
>If both are true, which will have the greater effect?
I don't know about biology rewriting computing. First, yes, DNA encodes information 'like' binary 1's and 0's, but we are still figuring out how the system works. We know how to find some genes by just looking at the sequences, but we still have problems with predicting genes in a sequence (e.g. gene splicing, post-transcriptional events, etc.).
I think it would be more sane to use the modern day technological concepts and theory, but with an emphasis on parallel computing.
>I understand long run is ambiguous in this question, but I'm interested in all thoughts using any applicable definition.
Tied to the first question: How will the nature of computing, and how we perceive it, change due to biology integration?
Well, we can have those clean computers powered by photosynthesis... ok, all kidding aside, it'll change computing for those tasks DNA would excel at: a DNA computer is a type of non-deterministic computer [dna2z.com]. We have to overcome some of the problems imposed by DNA... it's a chemical in an aqueous environment that tends to mutate over time; also, the DNA computers I have seen work in a test tube, and you have to sequence it to get a result. That should hopefully change in time.
>More to the point, how much of the theory we learn today may change?
In biology - a sh*t load most likely; like I said above, we are still trying to understand biological systems and how they interact with each other, including DNA and how it codes for life.
>What will be the biggest issue determining the success of the adoption of biology-integrated computing?
Get it out of the test tube first... place it on a chip, like a microprocessor. Also the energy source... I don't want to share my doritos with my desktop...
>Will it be technology factors or will it be societal factors (e.g., rebellion by the Right Wing), or something else?
Don't like the right wing, eh? Well, as a card-carrying member of the vast right wing conspiracy, you have just as much to worry about from the left... those environuts who think we are tampering with nature (like we haven't been doing that for the last 10,000+ years, e.g. agriculture). Both extremes muzzle science... get used to it.
If we start to integrate computers into ourselves... yeah, I think society will have some issues to face about what it means to be human. (I'll go with David Hume on this gem: "I'm human because my parents were human.")
>What things must hold true to make the idea succeed?
1. Perfect DNA computing
2.
3. Profit -- of course!
Ok, seriously -- there needs to be interest in the scientific community, we need to figure out how DNA works in living beings... how it encodes all its data (and how about that junk DNA?). We need to get it on a chip (not a microarray chip... those are sometimes called DNA chips). And there needs to be a profit motive.
>And perhaps the hottest issue of all: Is there anything inherently wrong with pursuing this avenue? What may be some of the consequences?
Hell no! But if you are interested in DNA computing, the bioinformatic
Bio is catching up (Score:3, Interesting)
DNA decoding is starting to pick up on some of the debugging concepts that have been in the digital world for 50 years. There are ways of iterating over code, so it looks like the single-stepping is going places. It's just hard to pull off on a multithreaded cluster and understand what's going on.
Of course, what they are having a real problem with is the DRM stuff that's making it hard to build replacement brains out of stem cells.
The real question is... (Score:2, Insightful)
Computing has accelerated biological research, but I'm unconvinced that it will fundamentally alter the prevalent paradigm in the biological sciences. OTOH, biology may provide the concepts that will push a change in computer science.
Not being a computer scientist, I can't say this for sure, but I think one of the places to look would be in membrane potentials & how that might be applied
Ethics (Score:3, Insightful)
* What will be the biggest issue determining the success of the adoption of biology-integrated computing? Will it be technology factors or will it be societal factors (e.g., rebellion by the Right Wing), or something else? What things must hold true to make the idea succeed?
First, I know it's only an example you've given (lit., "exempli gratia"), but the "societal" factors as you call it -- more political, really, but let's compromise on socio-political -- are not an exclusively "Right Wing" threat. The modern Left holds many central beliefs contrary to the integration of technology and biology, especially concerning human biology, for instance the primogeniture of society over the individual and (partially by extension) the malleable, ahistorical understanding of the human mind (a notion commonly referred to as "tabula rasa"). Under this view, attempting to "improve" or in any way alter humans as conscious beings by improving or altering us as biological beings will seem either immoral or, more likely, futile. This mostly to point out that limiting factors for the progress in your field don't come exclusively from conservative ideology.
In general there seems to be a growing trend in intellectual/ethicist circles toward acknowledging the massive (though far from exclusive) importance of our evolutionary past, which in simple political terms is more or less centrist or apolitical, though could be interpreted as slightly "right wing" (more libertarian or classical liberal than conservative), which suitably allows you scientists to carry forth your apolitical and almost-amoral research, leaving as the likely culprit for "most likely to impede the progress of biology-integrated computing" common economic factors: what innovations will ultimately create the most value, and therefore what innovations will proximately be most likely to succeed (in getting funded, in getting researched, etc.). And if you take exception to my "almost-amoral" comment (which you shouldn't), I mean it compared to people who spend their lives sweating over the ones and zeroes of right and wrong -- not that you value ethical behaviour any less than they do, only that you likely (likely) pay less attention to the nuances of what makes ethical behaviour ethical; my guess is you probably subscribe to a simplistic (and ages-old and approximately, though probably not absolutely, right) axiom like the biblical (New and Old Testament) reciprocating Golden Rule or the commission-of-harm-avoiding Hippocratic Oath -- good on you.
* And perhaps the hottest issue of all: Is there anything inherently wrong with pursuing this avenue? What may be some of the consequences?
There are numerous criteria for wrongness, and in the case that you mean moral wrongness, there are numerous defensible moral systems. Also, if you mean specifically moral wrongness, most moral systems take into consideration not just the consequences of actions, but the intents behind them as well (brick-in-the-head obvious example: Western legal tradition's distinction between premeditated and non-premeditated murder, or either of those and accidental homicide) -- if you meant to imply the connection I've understood between inherent wrongness and consequences.
I don't see anything one could construe as inherently wrong with the research you propose, if you don't believe in God or make intuitive essentialist ascriptions to the human form or subscribe to the aforementioned primogeniture of society over the individual and all that entails. In other words, assuming of course otherwise ethical behaviour, if you're a modern freedom-loving humanist (or, in the trivial case if you're a nihilist), it seems to me there's no basis for having qualms about the philosophical nature of what you're doing -- but it's healthy of you to be wary of slipping into less-than-savoury situations, and to constantly question yourself to defend against aforementioned slipping and to ameliorate yourself -- no doubt the skeptical scientist in you.
Computer subunits (Score:2)
My opinion (Score:2)
Biology may help build better computers, either by "growing" things like media, or with nanotechnology indistinguishable from biology being used to grow chips.
However, the "ultimate" convergance of a biological computer is not going to happen, except perhaps in an isolated sense where it can be made cheaper to grow a computer. The problem with biological computing is that generally we want to compute, not b
The important question of course: (Score:2)
You do know (Score:3, Funny)
Obligatory Subject Here (Score:4, Insightful)
One thing I want to say before responding to your points: nature is _NOT_ "efficient" like computers are "efficient". Natural systems are enormous, ad-hoc kludges. They work extremely well, and have tons of redundancy and fault-tolerance, but that's mainly due to about 4 billion years of slow, brutal optimisation by the evolutionary process. Natural systems do certain things faster than computer systems because:
1. They've been optimised for a hell of a long time, and they've found ways to engineer and construct extremely complicated structures and processes that are still "small" (compared to modern human-engineered technology).
2. They've been allowed to search through a much larger solution space than what computers have searched through. Computers are inherently limited by the fact that they are tools which can still be reconciled for a large part with human reason - they were constructed using models that humans can understand and reason about, and explain fully from the start. Evolution, on the other hand, is much more of a blind search.
Another thing to note is that natural systems all try to solve one problem: existence and self-perpetuation. All the natural systems we are able to observe today exist because they are structured such that they can fulfill these basic requirements. Now, in the process of solving this single-minded problem, nature has managed to come up with solutions for many other problems - many of which can be borrowed and applied to human problems. But it's erroneous to think of nature as "god's textbook of problem-solving", or anything like that.
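As a toy illustration of what "blind search with selection" means in practice, here's the classic weasel-style sketch in Python (the target string and rates are arbitrary choices of mine; the point is that no individual step understands the goal, yet selection still gets there):

    import random
    import string

    TARGET = "SELF PERPETUATION"          # arbitrary stand-in for "what survives"
    ALPHABET = string.ascii_uppercase + " "

    def score(s):
        return sum(a == b for a, b in zip(s, TARGET))

    def mutate(s, rate=0.05):
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in s)

    current = "".join(random.choice(ALPHABET) for _ in TARGET)
    generation = 0
    while current != TARGET:
        offspring = [mutate(current) for _ in range(100)]
        current = max(offspring + [current], key=score)   # blind variation + selection
        generation += 1
    print(generation, "generations to reach:", current)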
> In the long run, will biology rewrite computing or will
> modern day technology concepts and theory be
> integrated into biology? If both are true, which will have
> the greater effect? I understand long run is ambiguous in
> this question, but I'm interested in all thoughts using any
> applicable definition.
There are two aspects to this - borrowing ideas from biology (i.e. reimplementation), and borrowing biological structures themselves (e.g. using bacteria to make enzymes, viruses as delivery vectors for drugs, growing muscle tissue for robot-locomotion, etc.). Both are happening to a certain extent.
I think it'll be a while yet before we will be able to jump into biological systems and "change the code to do what we want". We do it in really primitive, crude ways right now, but the level of complexity of biological systems, I think, will mean that it'll take time before we are able to fully control them.
>Tied to the first question: How will the nature of
> computing, and how we perceive it, change due to
> biology integration? More to the point, how much of the
> theory we learn today may change?
I don't think biology will change theory that much. CS theory comes from the human reasoning process. I don't think there are that many abstract concepts that we can extract out of biological systems. I think the real impact will be in engineering aspects - mimicking, or reusing wholesale, biological structures to achieve the properties that we want.
> What will be the biggest issue determining the success
> of the adoption of biology-integrated computing? Will it
> be technology factors or will it be societal factors (e.g.,
> rebellion by the Right Wing), or something else? What
> things must hold true to make the idea succeed?
Forget the right wing. They make a lot of noise, but ultimately they are not that powerful, especially in the capitalist west. The religious conservatives are used as a tool to get votes, by pandering to their pet causes, but once people figure out a w
exploiting opportunities (Score:5, Insightful)
Presuming you're not a creationist, there are MILLIONS of generations' worth of Darwinism at work in even a simple worm - weeding out the inefficient in times of stress, etc.
Granted, the process in biology is neither linear nor even relatively efficient, but there are tremendous lessons in autonomous operation, fault-tolerance (HUGE), adaptability, etc. that bio systems can teach or implement in computer situations - and what can bio-systems get from computers? It just seems natural (ha!) that the more we learn from bio-systems, the more we'll apply it to computer paradigms. Until now, it's been too complex for us to really understand.
Thoughts from someone in comp. biology (long) (Score:5, Informative)
Oh my GOD!!!! (Score:2)
what about protein chemistry? (Score:2)
Dealing with Complexity (Score:2, Insightful)
When I was learning the concepts of EJBs and remote processing, I began to see patterns between how cells intercommunicate and the way clients/servers work. I began to see similarities between how cells set up "firewalls" and open firewalls and the way we do it in computing.
As an OO p
How DNA computers will work (Score:2, Informative)
Computers replace petri dishes... (Score:2)
Biology and Computing (Score:2, Insightful)
There are important similarities between the information processing and transfer in living organisms and mathematical computation, which have been recognized for more than 50 years (see Gunther Stent's "Paradoxes of Progress" for some essays on the nature of genes and biological information transfer as the central dogma was emerging). But there are critical differences as well, which are often misunderstood.
The fundamental difference between computing in biology and computing with man-made computers is t
Some thoughts (Score:2)
> or will modern day technology concepts and
> theory be integrated into biology? If both
> are true, which will have the greater effect?
> I understand long run is ambiguous in this
> question, but I'm interested in all thoughts
> using any applicable definition.
These aren't really "either-or" but "both".
Computing will probably evolve to DNA-based biological computation to some extent. There will remain an "inorganic" el
As a guy designing biological circuits.... (Score:3, Interesting)
(Sorta strange how Minnesota is a big center for medical devices / chemical engineering)
I'm in the process of designing systems of genes that interact to perform specific functions, like switches, oscillators, filters, etc. I won't go into a long harangue over how it's done or the detailed specifics, because if you're really interested you can read my paper in 'Computers in Chemical Engineering', which will be published sometime in November/December. (Yes, shameless self-promotion.)
Very briefly, systems of biological reactions occur in such small volumes and at such small concentrations that the traditional mathematics of describing chemical reactions breaks down. One requires probability theory and the use of Markov processes, a type of stochastic process, to accurately describe what's really going on inside cells. One does this with a very handy algorithm developed by a guy named Daniel Gillespie (search the literature if you're interested) and big freakin' computers. (I'm going to gloat: I'm getting access to the 54th fastest computer in the world. Oh, fellow Slashdotters, it brings a tear to my eyes...)
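For the curious, a stripped-down Gillespie (stochastic simulation algorithm) run looks roughly like the Python sketch below, here for a single protein being produced and degraded (the rate constants are invented for illustration; real gene-circuit models have many coupled reactions):

    import math
    import random

    # Two reactions for one species P:  production  0 -> P  at rate k_prod
    #                                   degradation P -> 0  at rate k_deg * P
    k_prod, k_deg = 5.0, 0.1            # made-up rate constants
    P, t, t_end = 0, 0.0, 100.0
    trajectory = [(t, P)]

    while t < t_end:
        a1 = k_prod                     # propensity of production
        a2 = k_deg * P                  # propensity of degradation
        a0 = a1 + a2
        # waiting time to the next reaction is exponential with mean 1/a0
        t += -math.log(1.0 - random.random()) / a0
        # choose which reaction fired, weighted by propensity
        if random.random() * a0 < a1:
            P += 1
        else:
            P -= 1
        trajectory.append((t, P))

    print(trajectory[-1])               # P fluctuates around k_prod / k_deg = 50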
Here's my two bits on the subject of integrating biology and computers...
You have two distinct areas of computational biology (as Slashdotters know it) that will probably go into different directions. One can use computers to design biological systems in order to perform certain functions (medical, industrial, etc). This is entirely analogous to an engineer using a computer to design a factory before building it...and knowing exactly (or almost) how it will all turn out _prior_ to building it. This is also why buildings don't regularly fall down.
Then you have the cyborg fantasy, i.e. putting computers in your body to somehow enhance performance. Well, I would say that is numerous decades away, because we currently lack an understanding of our brains... and the enhancement of our brains' computational speed is the only area in which digital computers can enhance human performance significantly (I discount super strength as novelty rather than enhancement).
But, there is a useful aspect to the 'cyborg' fantasy: Using designed cells to enhance the performance of humans. Cures to many of our current diseases require significant changes to our DNA and/or microscopic structure of our cells. Currently, the approach has been to design (or discover randomly...) molecules that interact with our cells in a way that improves our health.
Now extend that thinking further... what about designing whole cells to interact with our cells in order to improve health? Here are some examples that may come true in the next twenty years:
A cell (of human origin) that lives benignly in one's body until it detects a protein that is only produced (in large quantity) by a cancerous cell. When it detects large numbers of that protein, it may do the following actions:
--Replicate itself quickly (in a controlled fashion, unlike cancerous cells, however)
--Warn the person by producing a visible indicator (ie. make the person urinate blue (har har))
--Recruit the person's immune system to attack the cancerous cell
--Attack the cancerous cell itself (phagocytosis, etc)
--Produce a molecule (a drug) that is known to kill that cancerous cell
Here's another example:
Someone designs a microbe that detects one or more specific chemicals in order to alert humans to their presence... a biosensor.
When the microbe (or its ten+ million neighbors) detects a specific chemical (anthrax, ricin, smallpox, influenza, etc., etc.), it produces a green fluorescent protein (GFP)... and tells all of its neighbors to produce GFP too. Thus one has a very sensitive, very specific biosensor. Place 'em in every airport and seaport in the world and one now has an (almost) instant indicator of the presence of such agents...
So, to answer one o
Existing sources for this topic (Score:5, Informative)
-David Barak
Re:Biology and Computing Convergence = nonsense. (Score:2)
Computing can potentially take place on a biological platform. There's already been some work on this. Very preliminary, but you have to start somewhere.
And DNA is an encoding system. It stores information. RNA copies it.
Re:Biology and Computing Convergence = nonsense. (Score:2)
Re:Answers (Score:2, Interesting)
A1: No. The informational aspects of DNA have been known for 50 years, only slightly longer than computers have existed, and they failed to influence computer development. Nor does the computer science theory of infor