Bulk Technology Might Produce Molecular Computers 75

PerlDiver writes "Researchers at UCLA and Hewlett-Packard have announced the creation of molecular logic gates utilizing rotaxane." Consider this my little touch of nanotechnology for today.
  • Sounds like hype, with no details on how these crystals differ from silicon crystals or how their electrical behavior differs from CMOS. Sounds to me like they've just developed a new semiconductor based on a different crystal structure. Immune to computer viruses, crashes, and other glitches? That sounds more like a quality of the software. You can program in hardware, via PALs and such, but one still needs to program sensible logic, such as disallowing macros in documents, not granting everyone root permission to change the system, etc.

    Hype, hype, hype, no juicy details.
  • I'm a bit disappointed that people's vision of the future includes printers.
    --
  • I think I get what they're driving at. Concentrate on the "...permanently, doing away with the need to erase files" part. They seem to be talking about append-only storage. Using such storage you can presumably design a safer system, one where a virus can't insert itself into a preexisting binary because it's read-only, where hackers and absent-minded sysadmins can't erase or alter the system (think logs, specifically), where a crash can't corrupt the filesystem. Any vast storage device makes this practical, regardless of what technology is used to implement the vast storage.

    The tricky part would be to safely manage overrides. You'd want to alter files by writing a new one that overrides the old one, but then you've got some of your old problems again.

    Has this kind of thing been studied at all? It looks interesting. (A rough sketch of the idea follows below.)
    --
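
    To make the override problem concrete, here's a minimal Python sketch of the idea (the class and names are invented for illustration, nothing from the article): writes only ever append, reads resolve to the newest record, and the full history survives for auditing.

    ```python
    # Toy append-only store: writes never mutate or erase earlier data.
    # "Overrides" are just newer records; reads resolve to the latest one.
    class AppendOnlyStore:
        def __init__(self):
            self._log = []  # the immutable history: (key, value) records

        def write(self, key, value):
            # Appending is the only mutation allowed; older records survive,
            # so a crash or a virus can't corrupt what was already written.
            self._log.append((key, value))

        def read(self, key):
            # The newest record for a key overrides older ones.
            for k, v in reversed(self._log):
                if k == key:
                    return v
            raise KeyError(key)

        def history(self, key):
            # Full audit trail -- think tamper-evident logs.
            return [v for k, v in self._log if k == key]

    store = AppendOnlyStore()
    store.write("motd", "hello")
    store.write("motd", "hacked?")  # overrides, but erases nothing
    assert store.read("motd") == "hacked?"
    assert store.history("motd") == ["hello", "hacked?"]  # evidence remains
    ```

    Note that the override machinery is exactly where the old problems sneak back in: whoever can append a new record effectively "changes" the file, even though the history stays intact.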

  • The short answer is, "there is no limit". By this I mean, the real question should be "how many calculations per second could be done", or "how much information could be processed in x amount of time". Of course, the speed of light limits the speed at which different parts of a system can communicate with each other (ignoring quantum nonlocal communication methods, which thus far are in the realm of sci-fi). As you hinted, though, parallel systems get around these constraints. Quantum computers (for example) get their power from the "massively parallel" computing methods they use -- IOW, they perform many thousands or millions of calculations simultaneously, calculations which today must be done using far less efficient methods.

    Of course, we may not be able to approach all forms of information processing in this way.

    There are many factors that limit a computer's speed. Saying that the speed of light is one of them is one way to look at it. I prefer to think of the problem as one of gate/transistor density. Of course, if one could go faster than light, then density wouldn't matter much (though switching speed certainly would). The real fundamental limits we're coming to are more along the lines of what makes MOSFET transistors work. The smaller the transistor, the greater the leakage current if threshold voltage is scaled down along with it. But keeping threshold voltage high results in reduced switching speed, unless you keep power supply voltage high -- a definite problem when dealing with such small devices. (A back-of-the-envelope calculation follows this comment.)

    I think "Moore's Law" will eventually be broken -- but the other way around. I think we've got a way to go before the limits of current technology are exhausted, and look at what we've got in the pipe. Silicon-Germanium, HEMT and HBT transistor designs, copper interconnect, silicon-on-insulator and multi-value computing will help extend the limits of today's designs. Advanced photonics is pretty much here, now, and waiting to be exploited on a large scale. Quantum, molecular and DNA computing are advancing by leaps and bounds, and hints of other things, such as "reconfigurable computing" also make for an exciting future.

    Things have followed Moore's Law primarily for economic reasons. Once the above become economically feasible, we'll see not just linear advancements, but revolutions in computing power.

    Kythe
    (Remove "x"'s from
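
    The threshold-voltage trade-off above, as a back-of-the-envelope calculation. It uses the textbook rule that subthreshold leakage grows about 10x for every S volts that Vth drops; S (the subthreshold swing) is assumed here to be 80 mV/decade, a typical room-temperature figure, and all numbers are illustrative rather than from the article.

    ```python
    # Back-of-the-envelope subthreshold leakage scaling.
    S = 0.080  # subthreshold swing, V/decade (assumed typical value)

    def leakage_ratio(vth_old, vth_new):
        """How much the off-state leakage grows when Vth is lowered."""
        return 10 ** ((vth_old - vth_new) / S)

    # Lower Vth from 0.5 V to 0.3 V to preserve switching speed:
    print(leakage_ratio(0.5, 0.3))  # ~316x more off-state leakage
    ```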

  • This is absolutely correct -- and is a main driving force behind much of the research behind direct human-computer interfaces.

    Kythe
    (Remove "x"'s from
  • Exciting? No!

    Actually, quantum encryption beats quantum computing. You can't tap the data stream without collapsing the wave function. (A toy simulation of why follows below.)

    Kythe
    (Remove "x"'s from
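
    A toy classical simulation of why the tap shows up. This sketches a generic intercept-resend attack on a BB84-style key exchange (invented for illustration, not the UCLA/HP work): an eavesdropper who measures in a random basis corrupts about 25% of the bits Alice and Bob later compare.

    ```python
    import random

    def measure(bit, prep_basis, meas_basis):
        # Same basis: faithful result. Wrong basis: coin flip.
        return bit if prep_basis == meas_basis else random.randint(0, 1)

    def error_rate(n=100_000, eve=True):
        errors = sifted = 0
        for _ in range(n):
            alice_bit, alice_basis = random.randint(0, 1), random.choice("+x")
            bit, basis = alice_bit, alice_basis
            if eve:
                eve_basis = random.choice("+x")
                bit = measure(bit, basis, eve_basis)  # Eve measures...
                basis = eve_basis                     # ...and resends in her basis
            bob_basis = random.choice("+x")
            bob_bit = measure(bit, basis, bob_basis)
            if bob_basis == alice_basis:              # keep matching-basis rounds
                sifted += 1
                errors += bob_bit != alice_bit
        return errors / sifted

    print(error_rate(eve=False))  # ~0.00: clean channel
    print(error_rate(eve=True))   # ~0.25: the tap is visible
    ```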

  • Uh... This is Yahoo, remember. :-/

    Does anyone else get the feeling the person who wrote this article was totally CLUELESS with regard to computer technology? No viruses? No crashes? Sort of like saying that, if you make the computer complex enough, it'll never have any bugs.

  • This is really directed towards the quantum physics geeks out there...

    What is the theoretical maximum speed for a computer? I mean, we can't go faster than the speed of light - and that's quickly being approached. Where a single electron can mean off or on... the speed of a computer will be limited solely by the speed of light in the near future. Which means massive parallelism will be the *only* way to go faster. Moore's law will be broken eventually... I wonder when.



    --
  • One limit is round-trip signal time, which for a clocked architecture (rather than asynchronous) imposes a maximum speed-of-light limit on clock rates in the GHz range for reasonably sized chips (or presumably collections of molecular processors). Asynchronous logic isn't limited this way, but has its own problems.

    Massive parallelism doesn't beat Moore's law, because Moore's law is an exponential increase, and parallelism gives you only a linear increase with # of processors at best. Once we run into the miniaturization wall (and we will, eventually) we would need to build exponentially bigger and more power-hungry computers to keep up with Moore's law.

    Quantum computers are one solution, since they operate in an exponentially larger mathematical space than their physical resources (n quantum bits = 2^n classical bits), but no one has yet built a practical quantum computer, and this may turn out to be hard. Great for cryptanalysis, though... :) (Rough numbers below.)
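
    Rough numbers for both points, assuming a 1.5 cm die and free-space light speed (on-chip signals are slower than c, so the real bound is tighter still):

    ```python
    # Speed-of-light bound on a synchronous clock: one clock period
    # must cover a round trip across the die.
    c = 3.0e8    # m/s
    die = 0.015  # die edge, meters (assumed 1.5 cm)

    print(c / (2 * die) / 1e9, "GHz")  # ~10 GHz ceiling for this die

    # Why parallelism can't chase an exponential: n qubits span 2^n
    # classical states, while n processors buy at best n-fold speedup.
    n = 50
    print(2**n)  # 1125899906842624 states for 50 qubits
    print(n)     # best-case speedup from 50 processors
    ```
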
  • As far as I can tell, crash/virus immunity is pure nonsense -- in a formal sense, a computer that can't crash would need to solve the halting problem (or else how would it know that the computer had crashed rather than simply taking a long time to execute an algorithm?), which is provably impossible (see the sketch after this comment). Likewise, a virus is simply another type of program. In order to make a machine completely immune to viruses you have to make it immune to programming, and then you don't have a computer any more.

    I'm guessing that they meant that if you have millions of processors, crashing/hanging a few doesn't make a difference. I don't buy this, because having all the processors in the world doesn't make a difference if your code won't run reliably on one. Put another way, sucky software still sucks no matter how many computers you run it on. Likewise, I could easily imagine a virus spreading through the entire molecular computer and rendering all of those millions of processors useless.

    The small size and low-power consumption of molecular computers would be great, but there is a long way to go from building a few molecular logic gates in a laboratory and constructing a real machine capable of doing interesting work. My guess is that etched chips are here to stay for the next ten years or so, but if someone hands me a working molecular computer tomorrow I'll start work on the Linux port right away. :)
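
    For the formally inclined, the classic diagonalization sketch behind the point above, in Python: assume someone hands you a perfect halts() oracle (a hypothetical function, of course), and you can build a program that defeats it.

    ```python
    # Suppose halts(f) returns True iff calling f() eventually terminates.
    def paradox(halts):
        def troublemaker():
            if halts(troublemaker):  # oracle says "it halts"...
                while True:          # ...so loop forever instead
                    pass
            # oracle says "it hangs", so halt immediately
        return troublemaker

    # Whatever halts() answers about troublemaker, the answer is wrong --
    # so no total, correct halts() exists, on molecules or on silicon,
    # and neither does a perfect crash/hang detector.
    ```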

  • I've seen it too.

    Advanced Tea Substitute. Can't beat Brownian motion for powering an Infinite Improbability Generator!

    --
    QDMerge [rmci.net] -- data + templates = documents.
  • When we get nanotech computers ready for mass production, the fastest supercomputers of today will look ridiculously underpowered. I hope everyone with an interest in nanotech has read The Diamond Age by Neal Stephenson... an entertaining view into the possibilities of the technology.
  • We can't have computers that small because how can we work with them? Where do you attach a keyboard?

    Not all computers are personal computers. Not all computers need direct I/O devices hooked up to 'em. You're not going to need a monitor and keyboard hooked up to every node of.. say... a beowulf cluster.

    The diminutive size opens up possibilities for *other* uses for computers. Smaller robots, tiny regulatory devices in people's bodies to perform any number of functions to keep a body running better, to name a few.

    Even considering computers that *would* be for "personal use", think of how much slimmer your laptop would be if the *only* bulk was for the keyboard and screen. Think of faster, more versatile PDAs. Heck. Think of a laptop containing an entire *network*. Or cluster. Or what have you.

    There's certainly no cause for saying the possibilities of such development are limited by some need for big bulky things. If you want bulk, the smaller components mean you can cram more of 'em in there.

  • by WillWare ( 11935 ) on Friday July 16, 1999 @04:49AM (#1799126) Homepage Journal
    read Eric Drexler's "Engines of Creation" (full text online)
    Engines of Creation is here [foresight.org]. Another good book, a somewhat breezier read, is Unbounding the Future [foresight.org].

    build a nanocomputer ... and you're no less prone to bugs and viruses
    That's certainly true. A computer involves many layers of abstraction, with logic gates near the bottom and operating systems and applications near the top. The article appears to be describing an innovation at the gate level. Desirable to be sure, but it is unlikely to change the computer at an architectural level.

    I might be wrong about that. I went to a talk on reversible computing [std.com], which you'd think would have relevance only at the lowest levels of abstraction. It ends up having ramifications all the way up, if you want to implement reversibility completely. (We can probably get almost all the benefit of reversibility with incomplete implementations.) A tiny reversible-gate example follows.
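
    For a taste of what "reversible" means at the gate level, here's a Toffoli (controlled-controlled-NOT) gate in Python -- universal for classical logic, yet its own inverse. A sketch only; real reversible hardware is another matter entirely.

    ```python
    # Toffoli gate: no input information is ever destroyed, which is
    # the connection to Landauer's limit on the heat cost of erasure.
    def toffoli(a, b, c):
        """Flip target c iff both controls a and b are 1."""
        return a, b, c ^ (a & b)

    # The gate is its own inverse on all 8 inputs:
    for x in (0, 1):
        for y in (0, 1):
            for z in (0, 1):
                assert toffoli(*toffoli(x, y, z)) == (x, y, z)

    # AND computed reversibly, using a 0 ancilla as the target:
    a, b = 1, 1
    print(toffoli(a, b, 0)[2])  # 1 == a AND b
    ```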

  • The vacant look in my PHB's eyes...

    That bead of drool slowly rolling down his chin in staff meetings.

    The PHB has been slashdotted!!
  • Umnh.. I'm remembering properly. I'm just not sure I'm believing. If I understand correctly the "structure of the universe" is sufficiently deterministic that if you change the state of that quantum, then WHENEVER the other is read, it will be in the appropriately correlated state. This is true even if it happens "before" you make the change. (Of course, since in such a circumstance it would be outside of your light-cone, before had to be in quotes.)
  • Thing is, it's "faster" than instantaneous, but also, although the change is correlated, it has not been proven that information can actually be transmitted. It's WEIRD!!!
    Bell's Theorem is one of the stranger offspring of the Einstein-Podolsky-Rosen attempt to invalidate quantum mechanics. (I *think* that Wheeler's Many-Worlds hypothesis is another, but this one is stranger anyway!)

    What does it mean to say that two things happen at the same time, when they are spatially separated? Think about this while you contemplate the instantaneous correlation "transmission".
  • For the real story, read Eric Drexler's "Engines of Creation" (full text online at www.foresight.org somewhere) and for a genuine technical treatise try "Nanosystems" by the same author. Diamond Age is a sci-fi novel, and while it might awaken you to the possibilities, you're far better off with scientific fact.

    Btw, build a nanocomputer with the same architecture as a conventional one and you're no less prone to bugs and viruses - but the HP team has also designed the Teramac, a new architecture designed to "route around" failures.
  • The San Francisco Chronicle [sfgate.com] has a much better article [sfgate.com]. More technical details are toward the end.

    Interesting: in the print edition, this was the lead article, page one, above the fold, top right. Also, there was a decent graphic (which I can't find online) accompanying the article.

  • ... one of the researchers has a homepage [ucla.edu] with some seriously detailed background information on the technology involved. Drill down to the research index. An interview with one of the HP scientists is here [hp.com]. Anybody else notice all these people are from Rice?
  • Sorry. Just wanted to get that straight.
    #include "disclaim.h"
    "All the best people in life seem to like LINUX." - Steve Wozniak
  • First off, the title to this piece is essentially a tautology - bulk tech will have to be what leads us to nanotech, since bulk tech is all we had to start out with (unless you include natural bionanotech such as DNA - but there again, we discovered and until recently manipulated all such molecular-level structures with bulk tech).

    Second, the practical limits to computing power (i.e. bang for the buck), as opposed to speed (pure MIPS/GIPS (?) or what have you), are what most people are really concerned about. There are many factors: pure clock speed (GHz, THz (?)), instructions per clock cycle (e.g. Intel's EPIC -- sort of MPP on a single chip, AFAIK), instruction length/information density (32, 64, 128... bits cram more data into each instruction). So what if you have a multi-THz processor, if it only runs 4-bit instructions through 1 register! (Of course, this could accurately describe DNA....) Theoretical limits are fine, and can be measured by the application of quantum physics to information theory. But again, the lower limits such as network speed, memory/storage speed and bus/crossbar speed will hamper our current architectures for the near future. Sorry if it seems like I'm pointing out the obvious.... (A toy comparison follows this comment.)

    #include "disclaim.h"
    "All the best people in life seem to like LINUX." - Steve Wozniak
  • Have you ever read Eon by Greg Bear? It deals with that theme in detail. I highly recommend you pick it up somewhere; it's a good sci-fi novel.
  • well, think of some nano-probes (that swim in your bloodstream) that use their onboard computer to figure out if you are sick and then fix it. or if you are hurt, they would fix it. or if you are getting too old, fixed...


    nmarshall
    #include "standard_disclaimer.h"
    R.U. SIRIUS: THE ONLY POSSIBLE RESPONSE
  • ``We can potentially get the computational power of 100 workstations on the size of a grain of sand.''

    This remind anyone of computing 50 years ago?:
    "Well, possibly by the end of the millenium, computers will be so small, that you could fit them in to a *single room*, with room left over for a chair, and possibly even a teletype!"

    And what the hell is this?:
    "They will need far less power than current computers and may be able to hold vast amounts of data permanently, doing away with the need to erase files, and perhaps also be immune to computer viruses, crashes and other glitches."

    Where did that come from? Doesn't it just yank your chain when a journalist throws crap like this in? Why must people prattle on about stuff they don't know?!

    Sigh...
  • Several million "processors", huh? So, what kind of bus does this thing run on? I guess it must be NUMA. And the bus lines are subatomic in size, right? I mean, if each processor is a "particle on a molecule"... (what??)
    "Because no data is deleted, no heat is created"???? Where did you get that? Someone here (and it's not me) doesn't know how a traditional CPU works.
    Now I'm not saying that such a thing doesn't exist, some NMR computing device, but I'll bet you anything that it's a little different than you have explained, and that you can't use it for much more than predicting simple particle behaviour.

    P.S. Computer in a flask, HA! I have one right here on my desk, I call it coffee.
  • I think you're referring to Bell's Theorem, about two particles created together that speed apart but retain an instantaneous link between them. This has been proved, BTW, and it is instantaneous, not the speed of light.

    However, you've got it a bit wrong, and it's _really_ hard to explain, but here goes:

    The particles are created in such a way that we know they have opposite spins. Everything else about them may be the same, but the spins are opposite. Then they start moving apart.
    If I later measure one of the particles for spin, I "instantaneously" know the spin of the other particle, which could be light-years away by now. However, this is nothing special - just mere logic. The special part is that I can affect the particle's spin by how I measure it. If I measure it in a certain way, I get a certain "type" of answer, and the wave-form collapses and all that jazz. The really interesting thing is that the wave-form for that light-years-away particle also collapses at just that instant, and its spin becomes "fixed" to exactly opposite of the spin I measured on my local particle.

    Now I've confused myself.... so let me summarize what I definitely know:
    1. The transmission of "something" between these two particles is instantaneous - it's been proven in the lab.

    2. The best scientific minds have never figured a way to use this for purposes of communication, so although the ansible is a nice idea, it is a fictional plot device, not a theoretical possibility (at least, not yet).

    Maybe a real scientist can come by and say something on this? (A toy simulation of the statistics follows.)
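
    A toy simulation of the statistics (my own sketch, using the textbook singlet prediction P(same outcome) = sin^2((a-b)/2) for detector angles a and b). For fixed settings it reproduces both points above: outcomes are perfectly anti-correlated when the angles match, yet Alice's local statistics are 50/50 no matter what Bob does -- which is exactly why no message gets through. Being a local simulation, it can't capture the genuinely nonlocal Bell-inequality part; that needs the real experiment.

    ```python
    import math, random

    def singlet_pair(a, b):
        alice = random.choice([+1, -1])      # Alice alone sees a coin flip
        p_same = math.sin((a - b) / 2) ** 2  # quantum correlation
        bob = alice if random.random() < p_same else -alice
        return alice, bob

    def alice_up_fraction(a, b, n=100_000):
        return sum(singlet_pair(a, b)[0] == +1 for _ in range(n)) / n

    # Bob swings his detector; Alice's local statistics don't budge,
    # which is why the correlation carries no message:
    print(alice_up_fraction(0.0, 0.0))          # ~0.5
    print(alice_up_fraction(0.0, math.pi / 2))  # ~0.5
    ```
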
  • They probably weren't thinking when they wrote that. Viruses exist in the data, so unless a good virus detection program were to run on it that could detect unknown viruses (not possible), it's still susceptible. Crashes... ever try renaming a .txt or .bmp to .exe in Win 3.1 and then running it? It would crash with an out-of-memory error. Maybe this will actually make it run (new programming theory: Bitmap PL). And yes, these things would be damn light, and powerful!
  • Agreed. I think that with computing power on this level, you may see something in the way of the Star Trek computers, with the visual/vocal/key input, and a central core type computer. Large businesses could have that, and maybe a laptop would have a silicon wafer with a grain or 2 implanted on it. Maybe a carbon wafer...
  • by TheIneffable ( 56926 ) on Friday July 16, 1999 @04:10AM (#1799142)
    Implanting nanotech computers in people is cool; that is not disputed. We do, however, need to look at the possible downsides, as these buggers might prove to have some rather dark side effects, especially because with computers that size, molecular manipulation is a viable peripheral option.
    E.g.:
    1. "Your trial period for WinZip is now over, and you have selected not to uninstall. Thank you for using WinZip, and please enjoy our complementary copy of eHerpes 5.0"
    2. "System resource conflict with HP SCSI mini-CD drive and Generic Liver."
    3. "Speak to me! You're alive, I know it! God, why did I install NT on my brother?"
    4. "Man, it's hell when you're in a job interview and you get some porn site's spam."
  • Who's to say they won't develop a display that can beam screen images directly onto your eye? That removes the monitor. There has already been talk for a while about a virtual keyboard based on the position of your hands and fingers. Thus no keyboard, just a pair of gloves. I can see this technology getting to the point where you can use your computer anywhere, carrying nothing but a pair of gloves. That would be cool.
  • I wonder if the experiments that they've done proving that light isn't always at a constant speed can figure into this somehow..? =)

    Shad
  • I may be misremembering, but I believe even this interaction occurs at the speed of light. The two particles (atoms, or electrons in the version of this thought experiment with which I am familiar) interact via virtual gauge bosons, which are force-carrying particles. And these particles travel at, you guessed it, the speed of light. If the particles interacted instantaneously, then I wonder if in fact they aren't really "separate" particles in the first place.

    Everything we know about physics tells us it is IMPOSSIBLE for ANY sort of communication to proceed faster than the speed of light. Even gravity propagates at this fixed rate.

    The idea is to build faster chips not by increasing the speed of signal transmission, but by decreasing the length of the signal path. Which is the entire point of this nanotech stuff anyway.

    As for whether there is a theoretical maximum limit on computational speed, I really doubt that. You can always go twice as fast by using two circuits in parallel, provided the job can be parallelized. In that case, the rate of computation is limited only by the amount of material and energy in the universe available to build the computation modules. (Amdahl's law, sketched below, is the standard caveat on that "provided".)
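
    Amdahl's law: if a fraction p of the work parallelizes and (1 - p) is inherently serial, n processors give a speedup of 1 / ((1 - p) + p/n). The formula is standard; the example numbers are invented.

    ```python
    def speedup(n, p):
        # p = parallelizable fraction, n = number of processors
        return 1.0 / ((1.0 - p) + p / n)

    print(speedup(2, 0.95))          # ~1.9x: two circuits, nearly 2x
    print(speedup(1_000_000, 0.95))  # ~20x: a million circuits, only 20x
    ```
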
  • Planck's constant describes the relationship between the energy of a photon and its frequency. From it (together with the gravitational constant and the speed of light), a finite unit of time is derived, called the Planck time: the smallest meaningful discrete unit of time, the scale at which quantum fluctuations happen. It is on the order of 5e-44 seconds. Now this is rather fast. However, this is only the speed limit for a single gate, assuming that it uses a single atom and changes a single quantum state. We run into more considerations with multiple gates and parallel processing, and into the bottlenecks of the system around the processor. In the final analysis, it is not how fast the processor can run, but how fast the surrounding equipment can run the processor. (Numbers below.)
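
    For the record, the Planck time comes from hbar together with G and c, not from Planck's constant alone; textbook constants, standard formula:

    ```python
    # t_P = sqrt(hbar * G / c^5)
    import math

    hbar = 1.054571817e-34  # J*s
    G    = 6.67430e-11      # m^3 kg^-1 s^-2
    c    = 2.99792458e8     # m/s

    t_planck = math.sqrt(hbar * G / c**5)
    print(t_planck)      # ~5.39e-44 s
    print(1 / t_planck)  # ~1.9e43 "state changes" per second for one gate
    ```
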
  • If there's absolutely no way to delete files, then all a virus would have to do is create files until it filled your memory...
  • They will need far less power than current computers and may be able to hold vast amounts of data permanently, doing away with the need to erase files, and perhaps also be immune to computer viruses, crashes and other glitches.

    Does anyone else understand how this works? How are they immune to viruses and crashes? What operating system do they run? Can you not install any programs?

    Anyway, this will be a boon for mobile and wearable computer users!
  • Although I have not read the book, I can see the possibilities. The only caveat: when you have a computer the size of a grain of sand, you don't want to use it on the beach! We can't have computers that small, because how can we work with them? Where do you attach a keyboard? We need big bulky machines we can lug around and use. The largest part is always the display and input/output devices - the printer, the monitor, the mouse, the keyboard. We need to look at new forms of input and output - and don't even mention speech. Who wants to only interact with their computer by talking??!!
  • Wow, a computer implanted in my eye! cool! Maybe I can use my fingernails for storage, and then delete things by getting out the clippers.

    Well, there certainly would be privacy inherent in this system!

    With all of this power, I'm going to abandon my body and upload my mind to a computer and live forever in the ether. Who's with me?
  • It's called an ansible, and it's described in the science fiction of Orson Scott Card (Ender's Game). Pretty exciting, even if only theoretical!
  • Don't even get into quanta! Quantum mechanics, while incredibly interesting, is also incredibly scary. Imagine being able to break any crypto instantaneously. No crypto is unbreakable, because quantum computers can be built to an arbitrary number of dimensions, giving them arbitrary power, limited only by how large you want to go. Artificial Intelligence will rule the earth, humans will be used as fuel, and the world will come to an end. :-) Exciting, no?
  • What is this, Bistro Math?
  • Two words: shut. up. Evil flamer -- you trying to start a religious war? Go to college, you idiot. US$0.02
  • Hmm...in a flask? He he. I'm missing the thing about the heat, I'm worried about the 16 bits, i don't like the memory part, and I wonder what intoxicants can do to it, but, other than that... :-)
  • There have been some thought experiments where the speed of light and quantum mechanics have come into direct opposition. One example: take two particles in the same quantum state save for spin; if the spin of one of them is known and then changed, the spin of the other is changed instantly. Theoretically this is true no matter how far apart the two particles are, which would allow instantaneous transmission of information.
    bjg
  • I think what they are attempting to say is that it will have a processor like an FPGA (field-programmable gate array), in which logic gates can be reconnected on the fly and could be massively redundant, allowing for a lot of fault tolerance (a toy sketch follows this comment). And all the memory/disk/whatever will be like WORM (write once, read many), where there would be no way to delete files, but because of the size and efficiency this would be unimportant. If a virus infected such a system, no data could be lost, and therefore viruses would at least have limited effect.
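
    A toy majority-vote sketch of the "route around failures" flavor of fault tolerance (invented numbers, not HP's actual Teramac design): individually flaky gates, reliable ensemble.

    ```python
    import random

    def flaky_gate(a, b, fail_prob=0.2):
        """A NAND gate that sometimes returns garbage."""
        correct = 1 - (a & b)
        return random.randint(0, 1) if random.random() < fail_prob else correct

    def redundant_nand(a, b, copies=31):
        """Run many unreliable copies and take the majority vote."""
        votes = sum(flaky_gate(a, b) for _ in range(copies))
        return 1 if votes > copies // 2 else 0

    # Each gate is wrong ~10% of the time on average, but the ensemble
    # almost never is:
    trials = 10_000
    wrong = sum(redundant_nand(1, 1) != 0 for _ in range(trials))
    print(wrong / trials)  # ~0.0
    ```
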
  • Do you really think that the so-called "android" is the stuff of nightmares? If we could supplement human capabilities with internal as opposed to external machines (we already supplement human capabilities with external tools and machines), would this not just be giving people a set of on-board tools with which to operate? The possibilities are endless with this sort of tech. Think self-diagnosis and healing: we could have an end to disease. Think information: we could even have an end to school. I don't think that we're anywhere near understanding how the mind works, so there is little chance of a machine replacing it. But there would be some pretty cool supplements to the human condition if we had an onboard IT environment to monitor and assist us. Not only that: by simply regulating the environment in which these applications run, we will be able to steer the course of these developments and avoid the really scary stuff. There's nothing really scary about your desktop environment - nanotech will reduce the need for us to be at work, which can only be a good thing.
  • That's what it's called, by the way. And I haven't been able to find a reliable source that confirms whether it occurs at the speed of light, or instantly. Different places say different things :/
  • I say this because most of the pathetic education around here (excluding Japan and Europe) doesn't even cover what the hell Nanotechnology is in the first place. This alone renders college completely useless. I feel sorry for all of you in college who are wasting your dear sweet time.
