Scientists build DNA-based computer 333

Archangel Michael writes "Israeli scientists have built a DNA computer so tiny that a trillion of them could fit in a test tube and perform a billion operations per second with 99.8 percent accuracy. Yahoo News has the story"
This discussion has been archived. No new comments can be posted.

  • 99.8%? (Score:4, Funny)

    by Anonymous Coward on Wednesday November 21, 2001 @06:52PM (#2598401)
    Are they sure that the calculation just isn't off by .2%?
  • Nice start, but... (Score:4, Insightful)

    by Mendax Veritas ( 100454 ) on Wednesday November 21, 2001 @06:54PM (#2598412) Homepage
    99.8% accuracy is fine for a proof-of-concept demo, but as always, the devil is in the details. This won't be a useful technology until it can do a hell of a lot better than that. I certainly wouldn't trust my PC if it made mistakes on .2% of its calculations. Who knows, it might take several years to develop a really usable version of this, or it might never get into the market at all if, say, other technologies can beat it to market or have better cost/performance ratios.
    • by geekoid ( 135745 ) <dadinportland&yahoo,com> on Wednesday November 21, 2001 @06:59PM (#2598457) Homepage Journal
      what if it ran the same calculation multiple times, then used the resulting "average"?
      it seems to me you could get at least 5 nines out of that.
      so we'll have organic computers, man my frame rate sucks, someone pour some more beer in the CPU holding tank!
      • Then your computer would be five times slower.
    • by tempmpi ( 233132 )
      This isn't a big problem. There are a lot of algorithms with good fault tolerance, or you can just run the calculation again to check that your solution is OK. Plenty of technologies make far more mistakes in their raw state without error correction. Think of DSL or CD-ROMs/DVDs: they make a lot of mistakes reading or transferring your data but correct them at a later stage.
    • > 99.8% accuracy is fine for a proof-of-concept demo

      There are a host of applications where that kind of accuracy would be great. Think about 3D rendering for games. Do you care that an occasional pixel is slightly off color if it means you can render the entire scene in MUCH greater detail? There are also many applications in things like simulation. Lastly, with calculation power to burn you can always run a given calculation multiple times and then use standard statistical techniques to get arbitrary levels of certainty about the accuracy.

      > Who knows, it might take several years
      > to develop a really usable version of this

      Of course it will. I don't think anyone claimed you'd see this replacing your Pentium. However, think big! Things like DNA computers and quantum computers will eventually make our current silicon chips look like toys.

      Steve
    • Absolutely, every couple of months there is a new news article about a ground-breaking new type of computer. But each time, it's basically just "hey look, we managed to get this to do something that kinda looks like basic computer operations". Quantum computers sound really cool, DNA computers sound really cool, but where is a reasonable long-term plan? Where's something to actually get excited about?

      I can build AND, NOT and XOR gates out of cats, mice and string. I can string a thousand of these gates together... but I won't be able to install an OS on it in any practical way.

      I'll be excited when one of these test-tubes can play mp3s, compile my kernel, and send me instant messages telling me what website I can see AVIs of Britney Spears being ravaged by high school football players at. Until then, I just don't care.

      The ability to do FLOPs does not a Turing Machine make.
        • I'll be excited when one of these test-tubes can [...] send me instant messages telling me what website I can see AVIs of Britney Spears being ravaged by high school football players at.

        Jeez, any of the Kazaa clients will get you that.

        I agree though.

        • "it could form the basis of a DNA computer in the future that could potentially operate within human cells and act as a monitoring device to detect potentially disease-causing changes and synthesise drugs to fix them"

        Whoa there! When I go to a doctor today with generic symptoms, I'm advised to wait a month and see if I get better by myself. Let's work on basic diagnosis techniques first before we start blueskying about nanobots turning us into immortal super beings, huh?

    • Well, if they perform calculations N times over and compare the results, the chance of an undetected error drops to roughly 0.2% to the Nth power, at 1/N the speed. That could be a useful technique in upping the accuracy while still getting reasonably fast computation.
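      A quick sanity check of that arithmetic, as a minimal Python sketch (it assumes errors are independent and that two wrong runs never happen to agree on the same wrong answer):

          # Chance that N independent runs, each wrong 0.2% of the time,
          # are ALL wrong at once -- the case a compare-results scheme misses.
          p = 0.002
          for n in (2, 3, 10):
              print(f"{n} runs all wrong together: {p**n:.1e}")

      Two runs already push the undetected-error rate down to 4.0e-06, and three runs to 8.0e-09, matching the figures in the next comment.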
    • ...the errors are not systematic. Do the calculation two times and compare, and your unidentified errors drop to 0.000004 of the whole (provided the comparison procedure is not flawed); do it three times and it drops to 0.000000008, and so on. Once possible errors are identified, redoing them, say, ten more times to make sure is not difficult (as you would only have n*0.002 of them, n being the repetition count). I'm sure one can devise a better system for error correction, but even this crude one would perform satisfactorily.
      • Well here's the thing: errors in DNA are an acceptable thing (in many instances) in a living cell. There is redundancy in the genetic codon -> amino acid translation which makes for acceptable losses in DNA integrity over time. Admittedly, they're just using DNA or RNA bases in these devices, but there is ssDNA binding which doesn't need to be 100% accurate either. In fact, there's a possibility for regular expression matching / diffs: the entirety of the two strings needn't match completely - some 'loops' where the two opposing bases don't match and repel each other are normal. So the difference between two molecules (files) can be measured as a function of how well the fragments mate to each other. The regular expression stuff is easier: just synthesise the string you want to match and chuck it in (a la RAPDs - not a good technology).
    • to knock down that which they didn't design. Some kind of inferiority complex, I think.

      Really though, the fact they can do this at all is quite amazing. Early electronic computers were plagued with similar issues (such as the infamous 'bug', a moth got stuck in a relay). Perhaps a speck of dust in the test tube threw off a few computations...the modern equivalent of that pesky moth.

    • I certainly wouldn't trust my PC if it made mistakes on .2% of its calculations

      Some things demand 100% accuracy. Some things do not.

      1. 0.2% mistakes are already good enough to compete with commercial text recognition systems.

      2. Nobody claims Neural net solutions are 100% today, yet they are already in widespread use.

      3. How accurate is your brain?

      I think 99.8% accuracy is good enough today for some applications.
      • 3. How accurate is your brain?

        This is a very good point. I think the average human can't be more than 90% accurate for most things, yet had God been replaced by the current Slashdot crowd, it appears we would have been sent back for further testing and likely never implemented.

        1. 0.2% mistakes are already good enough to compete with commercial text recognition systems.
        To that I would add the digitization of just about all analog data: images, audio, temperature, viscosity, density, etc. Also, modeling any kind of system where key parts of the model depend on educated guesses of various parameters by human programmers. In other words we could build tremendously powerful computers for things like atmospheric modeling, or finding underground oil deposits - applications for which we currently build multi-million dollar parallel processing arrays just to get 'acceptable' predictions.
  • I'll say how cool that is when they manage to put a trillion in a test tube and perform a billion calculations. This can be seriously cool, and I'll be there cheering when they have something a bit more impressive to show.
  • Ouch! (Score:4, Funny)

    by Zen Mastuh ( 456254 ) on Wednesday November 21, 2001 @06:56PM (#2598426)
    DNA can hold more information in a cubic centimetre than a trillion CDs.

    Man, a whole galaxy could have signed up for free AOL service with the DNA I just jettisoned...

  • Does that mean that you're running the computation a bunch of times each second, and 99.8% of the 'output' molecules give the right answer? So you could never be 100% sure that you got the right answer?
    • No, but you can be 99.8% sure...
    • No, but you could approach 100% accuracy by running the calculation multiple times and choosing the most popular answer. Sorta like math by popular vote. Which is still better than what happened in Florida last year.

      Of course, the calculation that tabulates the responses and calculates which is the most popular will only choose correctly 99.8% of the time... lather, rinse, repeat!
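      Here's what that popular-vote math looks like in a short Python sketch (assuming runs fail independently and, worst case, that all wrong answers happen to agree):

          from math import comb

          # Majority vote over n repeats of a 99.8%-accurate computation:
          # the vote is wrong only if more than half of the runs err.
          p_err = 0.002
          for n in (1, 3, 5):
              p_wrong = sum(comb(n, k) * p_err**k * (1 - p_err)**(n - k)
                            for k in range(n // 2 + 1, n + 1))
              print(f"{n} run(s): majority wrong with p <= {p_wrong:.1e}")

      Three runs already cut the residual error to about 1.2e-05, so the lather-rinse-repeat recursion converges after a round or two.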
  • From the article (Score:3, Flamebait)

    by jonfromspace ( 179394 ) <jonwilkins@@@gmail...com> on Wednesday November 21, 2001 @06:57PM (#2598435)
    "We have built a nanoscale computer made of biomolecules that is so small you cannot run them one at a time. When a trillion computers run together they are capable of performing a billion operations,"


    I am no scientist... but a trillion of these can perform a billion operations? Is this correct? Can someone explain WHY it takes 1000 computers per operation?

    • My (uninformed) assumption is that they mean a billion operations per second. After all, a 'computer' can do an infinite number of operations, given long enough. So a trillion performing one billion ops per second implies about twenty minutes to do an operation... reasonable, based on what DNA chemistry I've done.
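      That back-of-envelope figure checks out, assuming the billion ops per second is the aggregate rate for the whole test tube:

          # 1e12 devices jointly delivering 1e9 ops/s means each device
          # completes one operation every 1000 seconds:
          print(1e12 / 1e9 / 60, "minutes")   # ~16.7, i.e. "about twenty"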
    • by salsbury ( 101341 ) <salsburyNO@SPAMsculptors.com> on Wednesday November 21, 2001 @07:04PM (#2598497) Homepage
      Probably because each one does a tiny bit of a computation. How many transistors are there in a modern chip? Uh-huh. Now you get the idea.

      When you're dealing at the atomic scale, just flipping a lever or doing something mechanical takes the place of all those little electrons flowing through logic gates.

      Given the level of our technology, I suspect that these little DNA "computers" are a lot more like a transistor than they are like a Pentium IV.

      To get your head around things at this scale, go to http://www.foresight.org/ [foresight.org]. They've got several excellent nanotech books there that you can download electronically for no charge. Well worth it.

      Pat
    • Well, I'd assume that the term 'computer' is a bit of an overstatement, and that these individual "biomolecules" are more like individual transistors.

      1000 might be a bit much, but I'd like to see you pull off a MOV or CMP with only one transistor, or even a single logic gate...
    • by mdubinko ( 459807 )
      >can someone explain WHY it takes 1000 computers per operation?

      Maybe each operation is duplicated 1000 times, and the answer that comes out 998 times is chosen?
    • The article is pretty vague about numbers. There was a better article about DNA computing in the New Scientist a couple of years back.

      A gram of material can contain 10^20-odd molecules. We are not really talking billions or trillions, but real monster numbers. Unfortunately the monster parallelism comes with severe I/O limits, and a low clock rate.

      Suppose you wanted to crack an RSA cipher. You could use one type of molecule to represent prime numbers, and a second molecule to take one of the first type molecules and try it on the cipher key. If you start off with a few cc's of prime numbers, you will probably have all of the 40-bit primes many times over, so many molecules will make the right connection.

      Unfortunately, the molecules that make the right connection will be vastly outnumbered by the ones that don't, and the ones that went wrong, and the impurities, and everything else. To rescue the signal from the noise, you need another chemical stage. This should allow only the successful molecules to copy themselves. So you mix number solution 1 with RSA key solution 2, and stir it for a few minutes; then you add breeder solution 3, and wait for the most frequently encountered correct result to start crystallizing out.

      This is a wonderfully parallel process for searching for a single solution to a simple problem. RSA hackers, and Goooogle might be able to use it, but you can't use it to do your 3-D renders. Awww.....

      If we had to crack something like the Enigma codes today, then Bletchley Park would be developing DNA, instead of using relays and valves. The Bletchley Park Colossus was not a computer in today's sense - it was dedicated to solving a single problem - but the same people that developed it also worked on the earlier computers.

      Other people have suggested making molecules with the electronic orbital equivalent of the electrical components we have in present circuits. But that was not what that article was about.
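      The "all of the 40-bit primes many times over" claim above is easy to bound with the prime number theorem (a rough sketch, counting one molecule per candidate prime):

          from math import log

          # pi(x) ~ x / ln(x): roughly how many primes fit in 40 bits,
          # and how many copies each if a few grams give ~1e20 molecules?
          n_primes = 2**40 / log(2**40)    # ~4e10 primes below 2^40
          molecules = 1e20
          print(f"~{n_primes:.0e} primes, ~{molecules / n_primes:.0e} copies each")

      So every 40-bit prime really would be present billions of times over.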

  • I was so sure that I wanted to be an EE, but now I have to choose between that and genetic engineering?

    DAMN IT!

  • Oh God NO! (Score:2, Funny)

    by WyldOne ( 29955 )
    Now I'll have to buy antibiotics for my computer when it gets a virus! I wonder if it will be covered by an HMO?

  • Where do you plug everything in? Or are we going to have to use microscopic keyboards and mice?
  • by Anonymous Coward on Wednesday November 21, 2001 @06:58PM (#2598448)
    but the kids only have a 60% accuracy. My wife blames me...

    :(
  • So you do a trillion calcs per sec, and out of those, 2 billion are wrong?

    Let me know when you have a use for 2 billion wrong answers. I have loads of them already without even having to calculate them!

  • Gene Therapy (Score:1, Interesting)

    by dasheiff ( 261577 )
    Maybe we could have intelligent robots going around fixing rogue cells. This is already a procedure for many diseases, but now the DNA injected could be 'smart' DNA and know exactly what to change and what not to.
    • Maybe we could have intelligent robots going around fixing rogue cells. This is already a procedure for many diseases, but now the DNA injected could be 'smart' DNA and know exactly what to change and what not to.

      Doctor: I'm sorry about the third arm growing out of the middle of your chest, Mr. Smith. It seems that the anti-cancer robot programming had an off by one error, causing every cell in your body to be mutated in various unknown ways.

      Yipes!

  • We think we just calculated how to safely detonate this nuclear device, but we're not entirely sure we got it right. Darn these DNA computers! Darn them to heck!
  • Think about the viruses people will make for these once they are common... *shudder*
  • oh wait, I guess that's what I am.
  • Does anyone else have a problem with using the fundamental building block of life to power a computer? How will they know that the source code to WindowsGM isn't the same as, say, HIV?

    I know it will probably all be in vitro, but what's going to protect me from getting infected with a stray snippet of 3D rotation code?

    Eek! Gives a whole new meaning to "virus".
  • Wow, does anyone else remember those from the Master of Orion II research tree? They took forever to research, but they give your ships' beam weapons a +125% chance of hitting their targets! Man, now I have to crank MOO 2 up again for another go!

    - kengineer
  • by dankjones ( 192476 ) on Wednesday November 21, 2001 @07:02PM (#2598479) Homepage
    I was just thinking last night it would be great if we could invent synthetic mitochondria that could read our DNA and perform checksum algorithms.


    And then alert a repair mechanism when errors are found. It would probably need to survey other cells to compare results.
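    A toy version of that idea in Python (purely hypothetical - real cells use repair enzymes rather than checksums, and the sequence here is made up):

        import hashlib

        def dna_checksum(seq: str) -> str:
            """Fingerprint a sequence so later copies can be verified."""
            return hashlib.sha256(seq.encode()).hexdigest()

        reference = "ACGTTGCAAGGCT"       # made-up master sequence
        mutated   = "ACGTTGCATGGCT"       # same sequence, one base flipped

        if dna_checksum(mutated) != dna_checksum(reference):
            print("mutation detected - flag this cell for repair")

    As the comment says, a checksum only detects the change; locating which base flipped (so a repair mechanism knows what to fix) would take a comparison against copies surveyed from other cells.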

  • Now we can assimilate people.
  • until they catch up to my brain's computing power. ;-)
  • Had to say it, in bad taste of course

    Brings up the next question ... with a computer that tiny are you going to be required to use a magnifying glass in order to see the monitor ... and if you can use regular computer components ... will you have to have some kind of super small ps2 ports and what not? ...

    Can you power these with bacteria? ...

    hehehehee a square foot of these as a beowulf cluster ... and Does it run linux? :-)


  • Very interesting that they have gotten to the point where they can cut portions of DNA and test them to identify which functions they can perform, enough to make a rudimentary "computer".

    Again, interesting - but one must wonder if this work is something inherently creative that should be protected by intellectual property laws, or if it is merely observing and splicing naturally occurring processes.

    It may be a premature concern though - but ultimately, what difference is there other than scope in using DNA-oriented systems to create protein computers, and today's circuit-based fabrication technology? How long will the prior art of nature stand before companies will own DNA sequences?

    Ryan Fenton
  • Computers keep getting smaller and smaller. In 1980 our IBM Series 1 4779 ($50,000 at the time) was the size of a refrigerator, and a lot damn heavier. 21 years later our production servers are in mid-sized towers. In 1980, the thought of someone walking out of the building with our Series 1 was just a laugh. Today, it's still questionable if someone could sneak a mid-sized case past security (Uh, yeah, I'm pregnant, and they think he'll have a square head).

    I've heard about server cubes already that are even smaller. Add onto that rack mount servers. Things are just getting smaller, which means they are easier to get out the door.

    What happens when my server farm is the size of a test tube? Unclip the 20-pin cable that gives it power, connects it to the network, and runs the peripherals, and shove it in your pocket?

    Still somewhat difficult with great security. But no security is 100.0000000% perfect (Unplugged, in a cement block, under 200 ft of sand at the bottom of the Pacific?). The only thing I could think of was to put one of those magnetic strips on it that the music stores (that I don't go to anymore) use? Metal detectors at the doors? DNA detectors?

    Anyways, any of you have any ideas for physical security when our servers start getting small enough to throw in a cigarette pack (a few years off)?


    • Well, if it's a reproducible system, then presumably, you wouldn't buy just ONE such server, but the RIGHTS to make and use a certain number of them. If someone stole your server box, you'd have to get another (presumably relatively cheap) replacement, dump in the fluid, connect to the most recent backup, and go.

      :^)

      Ryan Fenton
  • Newly developed gas additive turns every car into "Herbie, The Love Bug" (Yes, I'm old. Get over it!) Also, new secret ingredient in JOLT(tm) really can make you smarter! (follow with: Me burping the theory of relativity.)
  • Now that means we'll soon have new life forms that can be banned as circumvention devices under the DMCA...

    -- Shamus

    Bleah!
  • I don't know what to make out of this.

    "Since we don't know how to effectively modify these machines or create new ones just yet, the trick is to find
    naturally existing machines that, when combined, can be steered to actually compute," he added.


    and

    Israeli scientists have built a DNA computer so tiny that a trillion of them could fit in a test tube and perform a billion operations per second with 99.8 percent accuracy.

    In other words: we got lucky and hope to find something in nature that will do the research for us. Seriously, they've got a long way to go. I currently don't believe DNA computers are the future. Chemistry is much slower than physics. I would rather have put my money - and effort - into making quantum-dot or optical computers.

    That's the future ...
  • Since our mode of thinking in the U.S.A. is that any technology that comes along should be implemented no matter the consequences, I predict it won't be long before we are all required to have biocomputers implanted. Basis:

    • Atomic Energy: this thing can destroy the whole planet when used in anger. Hey, we're angry with these Japanese...
    • Drug Testing: We can test for drug residue that remains long after usage and create a permanent underclass despite lack of correlation between job performance and drug test results. Let's do it!
    • Electronic Voting: (In the works...) Any computer system is hackable, but let's trust that this system will be unhackable.
    • Genetic Testing: This person has a gene that indicates a possibility of future illness. Despite our ability to buffer medical costs through group insurance, let's deny employment to this sucker. He can go hang out with the druggies...
    • Genetic Engineering: Instead of rotating crops, using companion planting guides, using natural predators, and recycling bio-waste, let's manipulate the genes of the plant. Ignore side effects in humans or other species.

    After all, if you have nothing to hide you have no reason to fear this technology, right?

  • BGOD (Score:4, Funny)

    by nick_davison ( 217681 ) on Wednesday November 21, 2001 @07:09PM (#2598521)
    "99.8 percent accuracy"

    "Yikes, I've got the blue gunk of death!"
  • What they describe is a computer in that it can take input and process it to produce output, but since both input and output are in chemical form, how useful can this practically be? I'm not sure I understand how useful it is to have a trillion computers that, when infused with the right chemical mixture, all compute exactly the same data and arrive (99% of the time) at the same result.

    Honestly, how would you turn this into a practical computer? On the desktop? A supercomputer?
  • Unfortunately, by analyzing all the DNA contained in the test-tube, the only answer the earth DNA computer ever gave was 41.496, or 98.8% of 42.

    Palestinian scientists, not to be outdone, began an RNA computer which would give the question to the enigmatic answer of 42.

    Unfortunately, the RNA computer was considered to be a circumvention of a DNA copyrighted device, and the DNAMCA (DNA Millennium Copyright Act) was invoked to assassinate the bioterrorists and destroy their technology, to prevent unauthorized cracking of the DNA code.
  • 99.8% uptime? Now, even DNA is more reliable than Windows.
  • From the article:

    "Since we don't know how to effectively modify these machines or create new ones just yet, the trick is to find naturally existing machines that, when combined, can be steered to actually compute,"

    DNA can be used in its natural state to represent data. But once they figure out how to code DNA at will, then that would seem to be a breakthrough analogous to the early punchcard computers.

    After that, the DNA transistor, right?

  • by interstellar_donkey ( 200782 ) <pathighgateNO@SPAMhotmail.com> on Wednesday November 21, 2001 @07:21PM (#2598587) Homepage Journal
    A billion calculations per second...

    99.8% accurate.


    Which means it'll make 2 million mistakes every second.


    I think my bank and government use these.

  • More Details (Score:3, Informative)

    by Great_Geek ( 237841 ) on Wednesday November 21, 2001 @07:25PM (#2598610)
    The Yahoo article is fairly content-free (and takes a lot of space doing it). Here is the link to the Weizmann Institute abstract: http://www.weizmann.ac.il/math/users/lbn/public_html/new_pages/Abstract.html
    Note that the 99.8% is what the abstract calls "Transition Fidelity", and it is unclear exactly what that means. I take it to mean that, from input to output, the answer as read is correct 99.8% of the time.

    It is interesting that they claim to be implementing a Turing machine. Previous uses of DNA have been mostly for the Travelling Salesman Problem, which has a (more or less) natural mapping to DNA.
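    For reference, the Weizmann device is usually described as a two-state, two-symbol finite automaton rather than a full Turing machine. A minimal software model of that machine class (the transition table here is an illustrative example, not the one from the paper):

        # Two-state, two-symbol finite automaton that accepts strings
        # containing an even number of 'b' symbols -- the kind of
        # program the DNA device runs over an input strand.
        transitions = {
            ("S0", "a"): "S0", ("S0", "b"): "S1",
            ("S1", "a"): "S1", ("S1", "b"): "S0",
        }

        def run(tape: str) -> bool:
            state = "S0"
            for symbol in tape:
                state = transitions[(state, symbol)]
            return state == "S0"          # S0 is the accepting state

        print(run("abba"))   # True: two b's
        print(run("ab"))     # False: one b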
  • A link (Score:2, Interesting)

    by hyyx ( 447405 )
    Here is a link to a Wired article that talks about moletronics, but also specifically mentions applications of tiny computers. How about we equip planes with 10,000 microscopic black boxes instead of relying on just 1?

    http://www.wired.com/wired/archive/8.07/moletronics.html [wired.com]
  • I'm curious as to how 99.8% stands up to the average electronic setup.

    I guess the next thing is to figure out DNA error correction... think of the medical benefits of that one

  • by hooded1 ( 89250 ) on Wednesday November 21, 2001 @07:37PM (#2598659) Homepage
    Many of you have been complaining that a .2% error rate is pretty bad, but there is a pretty damn easy way to fix this: just compute all the data twice, and if you find that two bits don't match, calculate that bit again. Sure it halves the efficiency, but considering how small they already are, and, I assume, cheap, it doesn't matter.
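    A rough cost model for that compute-twice-and-retry scheme (a sketch assuming independent errors; both runs erring identically is the worst case for undetected mistakes):

        p = 0.002                        # per-run error rate
        p_mismatch = 1 - (1 - p)**2      # pair disagrees, forcing a retry
        expected_runs = 2 / (1 - p_mismatch)
        p_undetected = p**2              # both runs wrong the same way
        print(f"expected runs per result: {expected_runs:.3f}")
        print(f"undetected errors: <= {p_undetected:.0e}")

    So it really does cost about half speed (roughly 2.008 runs per trusted answer) while dropping the undetected error rate from 2e-3 to at most 4e-6.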
  • by Dr. Zowie ( 109983 ) <slashdotNO@SPAMdeforest.org> on Wednesday November 21, 2001 @07:39PM (#2598665)
    Just what we need: a computer that's capable of making 2,000,000 mistakes per second, mixed in with 998,000,000 right answers.

    How do you tell which ones are which?

    • or averaging. If 2+2=4 ten million times, but 2+2=5 only twenty thousand times, the system could compare... this kind of thing isn't my forte, but I imagine those with more practical computer architecture experience could tell you.
  • by istartedi ( 132515 ) on Wednesday November 21, 2001 @07:44PM (#2598686) Journal

    That's Nothing. The other night the star quarterback and the head cheerleader created a practical DNA computer in the back of his Chevy pickup.

  • Hmm DNA based computers hey...

    I can see it now:

    A couple of geeks at a network game session comparing their hardware. And then one of them yells out "You reckon that's good! Check out this puppy!"

    And then his PC is ACTUALLY a puppy but with like a USB port and stuff poking out all over it.

    I don't know why, but that would be awesome!

    :)
  • by Ratcrow ( 181400 ) on Wednesday November 21, 2001 @07:50PM (#2598711) Homepage
    It's my understanding that all they are doing is allowing molecules to combine into a tremendous number of configurations, then filtering out the ones that don't have the characteristics they'd expect from a solution to a particular problem. Then they just verify the shape of the structure of the remaining molecules. It's only slightly more sophisticated than having a trillion monkeys typing on a trillion keyboards (except in this case, they know when a monkey is close to the answer they want).

    It might be possible to solve NP-complete problems in this fashion (i.e. is there a Hamiltonian circuit containing N vertices in this molecule's structure), but the amount of time and effort needed to set up the system and filter out the results does not seem worthwhile. Further, this requires that they already know what kind of structure they expect as an answer (in order to filter it out from the rest), so it will only work on problems where they already have a good guess about the answer. Not something you can expect to see as a general problem-solver.

    In other words, I don't expect to see Apache running on this anytime, ever. Might be interesting for conjecture, but my money's on quantum computing for this kind of problem solving (at least qubits have a chance of being interfaced with existing computer hardware).
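    That generate-then-filter strategy is essentially Adleman's 1994 DNA experiment. A sequential caricature of it in Python (each candidate "molecule" is a path; the filter keeps only the ones with the right shape):

        from itertools import permutations

        # Tiny Hamiltonian-path search, test-tube style: form every
        # candidate strand at once, then filter out the misfits.
        edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}

        def is_valid(path):
            # A "surviving molecule": every consecutive pair is an edge.
            return all(pair in edges for pair in zip(path, path[1:]))

        survivors = [p for p in permutations(range(4)) if is_valid(p)]
        print(survivors)   # [(0, 1, 2, 3)]

    The catch the comment identifies shows up immediately: the candidate pool grows factorially, which DNA absorbs only up to the size of your test tube.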
  • Click here. [yahoo.com]

    That's it, mod me up, you can do it.
  • Too many people are saying this "computer" will make 2,000,000 mistakes per second. Rather than thinking of it as a computer, why not think of it as an artificial brain? Your brain certainly makes mistakes. Why should an artificial one be any better?

  • by Man of E ( 531031 ) <i.have@no.email.com> on Wednesday November 21, 2001 @11:47PM (#2599269)
    Israeli scientists have built a DNA computer so tiny that a trillion of them could fit in a test tube

    Wow, just imagine a trillion Israeli scientists in a test tube. It's a snug fit, but in such close proximity, they still perform a billion operations per second!
    I think we should build another DNA computer and put a whole international consortium of scientists into it! Just imagine the results.

  • "Since we don't know how to effectively modify these machines or create new ones just yet, the trick is to find naturally existing machines that, when combined, can be steered to actually compute."

    This sounds more like learning to control chemical reactions than building computers! They used an existing "computer"; they didn't build it.
  • But then again, Matt Drudge is a professional investigative reporter. Slashdot gets most of its stuff from its users as the investigative reporters serve it up ... so fair is fair.

    However, I posted my comments on the issue hours ago [neotope.com], and I would like to place them here for the sake of, um, conversation in a more communal setting than a personal weblog:

    The beginning of the end of life as we know it [yahoo.com] is approaching. "Israeli scientists have built a DNA computer so tiny that a trillion of them could fit in a test tube and perform a billion operations per second with 99.8% accuracy." 99.8% accuracy equates to 499 accurate out of 500 total, or 1 error per 500 chances. With one billion operations per second, that's two million errors per second. So not only are we thrilled for this great new science, but we are thrilled at something that can potentially - at best - make only two million (2,000,000!) errors per second! In a test tube!

    Their presupposition is that DNA computers have the potential to be much faster and to store much more data: "DNA can hold more information in a cubic centimeter than a trillion CDs...giving it massive memory capability that scientists are only just beginning to tap into." Professor Ehud Shapiro adds,

    The living cell contains incredible molecular machines that manipulate information-encoding molecules such as DNA and RNA in ways that are fundamentally very similar to computation...Since we don't know how to effectively modify these machines or create new ones just yet, the trick is to find naturally existing machines that, when combined, can be steered to actually compute.

    What do I think? I think that such technology in the wrong hands will lead to the manipulation of human DNA and potentially all new forms of crime, terrorism, etc. Of course, in the right hands, this developing technology has enormous potential. My comment about "two million errors per second" was more in jest than anything; no technology is perfect upon its initial realization.

    Remember the movie Johnny Mnemonic [imdb.com] in which Keanu Reeves is a data courier using his brain as a storage device? The Terminator [imdb.com] also comes to mind, having a computer chip for a heart and futuristic storage devices for a brain. I like the idea of upgradable memory that never fails me, but what computer device is absolutely perfect? My verdict: I don't like it. Despite the obvious advantages, there are too many wildcards and unknowns at this point.

  • from a CS perspective, this does NOT solve NP.

    why? because you switch from an exponential-time brute-force method to an exponential-CPU-count brute-force method.

    and practically, there's a limit to the number of molecules you can use.

    so the issue is not a CS one: it means you have a much higher n at which the problem starts being impractical.

    e.g. you will probably need a cipher the size of a DNA molecule for your future PGP (no, wankers of the world, your own is not good enough, since 99% of it is like anyone else's :) )
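    The molecule-count limit is easy to quantify (assuming one candidate per molecule and about a mole of material):

        from math import log2, floor

        # Brute force in a test tube: an n-bit search needs 2^n candidate
        # molecules. With ~6e23 of them (about a mole), the ceiling is:
        print(floor(log2(6e23)))   # 78 -- bits of key, however parallel

    Which is why a DNA-molecule-sized cipher, as the comment jokes, is where this approach runs out of chemistry.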
  • Although the error rate seems rather high (0.2%) for a computer, there are all sorts of things that could be done to combat this. Someone mentioned something about CD-ROM read errors, and I'd like to expand on it. On a data CD-ROM, a good chunk of each sector is used for error correcting code. Thus, CD-ROMs make lots of errors, but they're fixed before they get to the computer. Also, many uses of computers can handle the occasional error. Visualization programs could benefit greatly from the increased speed, and any uncaught errors would simply be seen as the occasional visual defect. If the error rate is brought down enough (as it would be with good error checking) a human observer wouldn't even notice the rare glitch. Similarly, scientific simulations, which already take into account the somewhat random nature of physics, could deal with simulation errors the same way they deal with instrumentation errors: through repeated trials and finding trends.
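    To make "fixed before they get to the computer" concrete, here is a classic Hamming(7,4) code in miniature (a textbook sketch; real CDs actually use the more powerful Reed-Solomon family):

        def encode(d1, d2, d3, d4):
            # Three parity bits protect four data bits.
            p1 = d1 ^ d2 ^ d4
            p2 = d1 ^ d3 ^ d4
            p3 = d2 ^ d3 ^ d4
            return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7

        def decode(c):
            # Each check covers the positions whose index has that bit set.
            s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
            s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
            s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
            pos = s1 + 2 * s2 + 4 * s3            # 1-based error position
            if pos:
                c[pos - 1] ^= 1                   # flip the corrupted bit
            return [c[2], c[4], c[5], c[6]]

        sent = encode(1, 0, 1, 1)
        sent[4] ^= 1                              # one bit flips in transit
        assert decode(sent) == [1, 0, 1, 1]       # ...and is corrected

    At a 0.2% raw bit-error rate, a code like this leaves only the rare multi-error block uncorrected, which is exactly the CD-ROM trick.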
  • It's possible that the rate of accuracy could greatly be increased if different scientists/programmers/what have you were to undertake the task. Given that both the 'hardware' and 'software' of this project were figuratively 'programmed' (are we going to need to invent new terms for this type of computer?), I suspect that the error rate could be decreased by more development and/or testing.

"God is a comedian playing to an audience too afraid to laugh." - Voltaire

Working...