Science

UCLA Chemists Progress Toward Molecular Computers

Concepvelo writes "It is very refreshing to see Professor Stoddart (my organic chem professor last quarter) and Pat Collier making progress toward molecular computers. Stoddart's team has created molecules that can be switched hundreds of times, where before they could only be switched once. They are saying that the creation of molecular RAM is one step closer because all of this can be done at room temperature. The article is here."
  • article talks about working at room temperature -- it sounds like if the temperature changed, the computer would stop working, or might even lose data.

    And this doesn't happen to existing computers? We're putting so much energy into such a little space that the heat is phenomenal. 50 watts of heat over a tiny 50mm x 50mm area? That's incredible. If the case temperature rises above about 90°F or so, most systems get /very/ unstable. Most modern computers are (or should be!) kept below room temperature, at around 68-72°F.

    At the rate power densities are climbing, radical new solutions will need to be devised to keep up with the heat output of newer chips.

    Personally, I think a molecular computer would be easier to manage than our own. If it were possible to assemble a system that could function at -20C, I would try to get on board their project to help with the HVAC equipment... it would be more energy efficient in the long run.

  • " ... a realistic memory cell needs to switch billions, if not trillions of times during its expected life."

    Yes, but going from 1 to hundreds is already a factor of 10^2. We're talking about technology that advances exponentially.

    I don't think it will be long before these are useful. A digital camera, for example, could make use of memory built with the existing technology: think disposable 300-picture packs of RAM. Once a process is introduced and prices come down, it may not even be necessary to improve the endurance further, though I'm sure it will be.
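
    Just to put rough numbers on the gap (order-of-magnitude guesses, not figures from the article; Python used purely as a calculator):

      import math

      demonstrated = 100      # "hundreds of times", per the article (order of magnitude)
      ram_target   = 10**9    # "billions ... of times" for a realistic RAM cell
      camera_pack  = 300      # assumed: roughly one write per photo in a 300-picture pack

      gap = ram_target / demonstrated
      print(f"gap to RAM-grade endurance: about 10^{int(math.log10(gap))}")        # 10^7
      print("write-rarely camera pack within reach already:", camera_pack < 10 * demonstrated)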

  • please elaborate
  • by smoondog ( 85133 ) on Saturday August 19, 2000 @05:03PM (#842641)
    These are awesome developments for both molecular engineering and high tech in general. Remember, though, that many new developments never see commercialization, because they just can't be made faster, cheaper, better. There are so many dead technologies that were promising once. Memory technologies such as 3D protein gels (read and written with lasers) and holograms just haven't seen their full potential, and possibly never will.
    -- Moondog
  • Do you know that the principles are not changing? I would not be so sure; the digital on-off system itself may be improved here.
  • You measure it in terms of storage space now, right? Why the hell would anyone measure RAM in moles? You don't buy your processor or your computer by the ounce or the gram, do you?
  • Well, actually, hard disks and similar technologies should be worked on if that's the criterion, because they are even slower than RAM (by 100x or more).
  • Well, there are humans and there are trolls...
  • The site is bad. Check out a mirror over at this site [johncglass.com]
  • I think they are talking about HP's 'learning' technology, which allows them to take chips with 'dead' portions and have the logic discover those dead parts and work around them. I think it was called the Teramac or something like that. I saw an article a few years ago about HP putting a bunch of their broken/out-of-spec PA-RISC processors into this machine, and then a piece of software mapped out the regions that had problems.

    This would fundamentally help molecular computing, since molecular-level devices are going to be sensitive to things such as cosmic radiation pushing molecules into undesirable states. Software that can route instructions around such problems will be a cornerstone of this technology's development. I also believe the purpose of pairing these two technologies was to achieve some sort of 'shake-n-bake' processor development: put in a bunch of switches and carbon nanotube 'wires', shake it up, and then use this piece of software to discover a way for processing to take place. This would allow them to develop the technology without requiring cheap nano-manipulation. Other attempts at this are being made with self-assembling/organic systems.

    Even if this technology were perfected and cost effective now, I don't think it's in the interest of *any* of the powers that be to release such a disruptive technology immediately. This would completely gut the chip industry... So who knows.
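
    Something like this bad-cell bookkeeping, sketched in Python (the defect rate and the self-test are made up for illustration; the real Teramac used hardware crossbars and far cleverer routing): test every cell once, record the dead ones, and only ever hand out known-good addresses.

      import random

      random.seed(0)

      TOTAL_CELLS = 1000
      DEFECT_RATE = 0.03   # assumed: a few percent of cells come out bad

      def cell_works(addr: int) -> bool:
          """Stand-in for a real write/read-back self-test of one cell."""
          return random.random() > DEFECT_RATE

      # Map out the defects once, then route around them forever after.
      good_cells = [addr for addr in range(TOTAL_CELLS) if cell_works(addr)]
      remap = {logical: physical for logical, physical in enumerate(good_cells)}

      print(f"usable cells: {len(good_cells)} of {TOTAL_CELLS}")
      print("logical address 5 lives at physical cell", remap[5])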
  • you're a tard.
  • Sorry, that joke pretty much targets Windows, pretty much all the time.
  • As those of us who work with computers know, memory is only part of a computer.

    Actually, if the memory is fast enough, you could use individual memory bits as logic gates quite easily, which means that the only parts of a computer that WOULDN'T be memory would be the connections inside the chip and the connections to the external world.
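
    That's basically what FPGA lookup tables do already. A toy illustration in Python (nothing molecular about it): store a gate's truth table in a tiny memory and "computing" becomes a read.

      # A 4-entry memory indexed by two input bits *is* a two-input gate:
      # the stored truth table does the computing.
      AND_TABLE = [0, 0, 0, 1]   # index = (a << 1) | b
      XOR_TABLE = [0, 1, 1, 0]

      def lut_gate(table, a: int, b: int) -> int:
          return table[(a << 1) | b]

      for a in (0, 1):
          for b in (0, 1):
              print(a, b, "AND:", lut_gate(AND_TABLE, a, b), "XOR:", lut_gate(XOR_TABLE, a, b))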

  • Ahh, silicone transistor technology...someone notify pamela anderson: she is soon to be the fastest and most powerful computer known to man.
  • You're just lucky, not better at doing anything... I bought a P3-600E recently. When I got it set up I noticed the temperature was in the 80s (Celsius) from the on-die sensor. This was without tweaking, with the regular fan running properly, in an open case. This was *way* too hot; the CPU warning sensor from the motherboard went off every now and then, which didn't make me feel it was terribly safe. So I went out and got a good fan, an Alpha PEP66, and then the temperature rarely went over 38C, which sounds high but is fairly normal considering the on-die sensors read a bit high.

    So I overclocked... My CPU wasn't stable with the retail Intel fan/heatsink even at its rated speed. But with a good cooler, not only was it stable at its rated speed, it was able to run at 800... I actually got it a bit faster but dropped it back to keep everything in spec, considering I've got some cheap PCI cards. :( So your stuff survives because it's all older and doesn't draw as much power into as small an area, not because your reluctance to overclock actually helps it.
  • You may be right about EE (though, there are areas in Electrical Engineering they will move to) but CS will be in less demand? Who is going to write software for these wonderful gadgets? Mechanical engineers?? LOL!

    rLowe

  • I'm with you, bro. Now prepare to be assimilated.
  • I would assume that you replace your oil regularly to prevent engine failure.
    Change my oil? No thanks.

    --
  • Pulling my head out of my butt, let me correct myself: Currently CPUs are on the order of ten times as fast as the MEMORY.
  • Yes, they are an order of magnitude or two slower, but they aren't a critical, first-order resource. Disks are "secondary storage," remember. (I know it sounds weird to think of them like that, but it's true.)
  • At the risk of responding to a big-time troll, here I go...
    Why don't we concentrate on actually making the most of our current software/hardware before we decide to make bigger and better stuff first
    By your logic researchers should never have tried to find any tech better than vacuum tubes, in which case you'd never be able to post on Slashdot. You say vacuum tube computers were for the super rich... not really true: they were very complex and not capable of being mass-manufactured at the time (each computer of the time being the size of a small house) and consumed hundreds of kilowatts of electricity, so they were only for the institutions that felt they held enough promise of usefulness in the future. They were initially purely for research (research into computing science, mind you, not research into other things utilizing computers). You should always have visionaries looking into the future and dreaming up extremely long-term future tech whilst others are working on gradual improvements to current tech. Without such a balance society would never have looked into gasoline (internal combustion) engines; maybe we would have just constantly tweaked and refined the horse-and-buggy setup! And we definitely never would have invented airplanes or dirigibles -- why not perfect the automobile before worrying about such nonsense as contraptions that can travel through the air?
    In theory what we have now is good enough. You don't see rocket powered cars or personal helicopters in mass production and in wide use do you?
    Ignoring the fact that the two technologies you mention are even less practical than our current automobile-based existence, I personally do need more powerful hardware if I'm ever going to see a robotic butler/maid in my house while I'm off travelling to some far-away cosmos! If you are happy eating your raw meat and grains and bathing in cold water, fine, but I'm still going to use the Plasma-Stove(tm).
    And how do you know that the interface of the future isn't going to be one that comes as a result of new hardware tech that some engineer dreams up? I can envision a helmet-like device that reads neural activity in my brain (patent pending in too many countries to list - IOW piss off I thought it up first) and types stuff on the page so I don't have to sit here for 5 mins and respond to your post, correcting all my damned typos! But that interface is dependent on some hardware whose technology isn't available yet. And maybe we need smaller, more powerful processors to embed into the helmet to decode all of the neural activity in my head (believe me it's a hotbed of synaptic storms)... you getting the picture?

    ----
  • As those of us who work with computers know, memory is only part of a computer. Yes, an important part, but only a part. As the article points out, getting these molecular switches to work is only part of memory, for that matter. Before we will be seeing NanoRAM (NRAM)(TM) on store shelves, they have to figure out how to wire lots of these little beasties together. But it would be nice to have terabytes of storage in a quarter-sized computer.

    Another problem I see with this early version of memory is that the individual cells can be switched "hundreds of times". Better than once, yes, but a realistic memory cell needs to switch billions, if not trillions of times during its expected life. We have a way to go here.


    Gonzo
  • by Chemical Serenity ( 1324 ) on Saturday August 19, 2000 @03:07PM (#842660) Homepage Journal
    Heh... with the number of chipsets and sockets et al. that motherboards in circulation today have, we're almost at that point already. ;)

    --
    rickf@transpect.SPAM-B-GONE.net (remove the SPAM-B-GONE bit)
  • by Dr. Zed ( 222961 ) on Saturday August 19, 2000 @03:18PM (#842661)

    The author of the article wrote:

    ....now they have done so hundreds of times.

    This doesn't necessarily mean that they stop working after that point. It's reasonable to assume that a molecular device that can perform a repetitive function can continue to repeat that function.

    If I were to say that I've rebooted my computer hundreds of times, you wouldn't assume that I can no longer boot it.

    Also, it helps to pay closer heed to the quoted source than to the author's text. The author quoted:

    "....they may be repeatedly switched on and off over reasonably long periods of time in a solid-state device under normal laboratory conditions. For the first time, we are able to turn the molecular switches on and off repeatedly."

    Molecules don't break down from wear in quite the same manner as larger-scale components. Assuming that such switches are properly housed, their lifespan would be affected by things like changing environmental conditions (EM, temperature, etc.) but not by repetitive use.

    The part I objected to is the following:

    The UCLA groups, in collaboration with Hewlett-Packard researchers, are working on making molecular computers that may "learn and improve the more they are used," Heath said.

    This just doesn't ring true. First, they are far from actually developing a "molecular computer". How could they be working on one that does anything, let alone a learning one?

    This sounds like hype... call it "text-candy" for lack of a better term. The principles for building a computer are not really changing here (switches, logic gates, etc.), so developing a "learning molecular computer" has two steps:

    • Develop a molecular computer
    • Develop learning software/hardware

    They are currently on step one....

  • by sporty ( 27564 ) on Saturday August 19, 2000 @03:21PM (#842662) Homepage
    Wouldn't molecular computers be a lot less stable than our conventional computers?

    Ok, you begged the response:

    Only if you were running windows/linux/freebsd/dos/os2.... </joke>

    ---

  • by Kiro ( 220724 )
    Imagine if the molecular RAM developed Alzheimer's!

    --
    Kiro
  • by Docrates ( 148350 ) on Saturday August 19, 2000 @03:36PM (#842664) Homepage
    One thing I noticed about the article is that the terms "nanotechnology" and "nanocomputers" weren't mentioned once.

    The closest they got was "nano-sized computers" and an institute called NanoSystems.

    I don't think this is a coincidence. After a lot of media coverage of stories like Bill Joy's paper [wired.com] (which I read and liked very much, but didn't totally agree with), the term nanotechnology has been made to have a negative ring to it.

    Then you have all the nonsense far less intelligent people have said, plus some very cool renditions like Deus Ex, and what you have is instant holocaust. Soon nanotechnology will sound as bad as nuclear reactors and genetic engineering to a lot of people (they all sound fine to me).

    Anyhow, congrats to the research team.
  • by freddie ( 2935 ) on Saturday August 19, 2000 @04:20PM (#842665)
    This kind of research sounds like a waste of time -- given how fast conventional processors are accelerating (Moore's Law and all), do we even need molecular computing?

    Well, of course we do. As things get smaller and smaller, we're going to get down to individual atoms and molecules anyway.

    Even if current silicon transistor technology can be scaled much further down, each device will still take up a lot more molecules than this (possibly millions), simply because it's a technology that was not designed to work with individual molecules.
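
    A rough count of that, assuming ballpark dimensions for a circa-2000 transistor and the standard atomic density of crystalline silicon:

      # Back-of-the-envelope: atoms in one conventional transistor's active region.
      # The dimensions are assumed round figures for a ~180 nm process.
      SI_ATOMS_PER_CM3 = 5.0e22      # crystalline silicon

      length_cm = 180e-7             # 180 nm
      width_cm  = 180e-7
      depth_cm  = 50e-7              # assumed active depth, 50 nm

      atoms = length_cm * width_cm * depth_cm * SI_ATOMS_PER_CM3
      print(f"roughly {atoms:.1e} silicon atoms per transistor")   # ~10^8
      # versus a single switch molecule: on the order of 10^2 atoms.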

  • Why don't we concentrate on actually making the most of our current software/hardware before we decide to make bigger and better stuff first.
    Or standardize on some type of interface for the future and then work on getting stuff to fit into those interfaces?
    Oh, and in theory things like mass distribution are supposed to drive prices down and the like. But I don't see a computer becoming as ubiquitous as, say, a bicycle unless it's crippled or bought used.
    Why do you need hardware to constantly increase in power anyway? In theory what we have now is good enough. You don't see rocket powered cars or personal helicopters in mass production and in wide use do you?
  • It seems that most of the things we "need" are really just stupid, brainless companies taking our hard-earned dollars away from us using glitzy visions of tomorrow that we don't really need or actually use.
    You know that most of the actual things people use their computers for could be fulfilled with QNX and a 486, right?
    Get rid of the bloat and the rest will follow. People should ask themselves critically: "Do I really need to be a pawn in a rich man's game?"
  • I don't mess around with the jumper settings on my motherboard, and I keep it at the manufacturer's recommended temp for my processor.
    Also, I think most modern hardware is really trash compared to the old stuff I have (seriously). Take my computers, all made from components in the 386/486 class range: not one hardware failure in 3-5 years, and they were bought used! Now I hear about people ruining their motherboards by messing around with them, having constant hard disk failures, memory parity errors, dying RAM, etc.
    My computer has been exposed to temperatures within my house that I was sure were above 110°F, and my machine didn't crash or fail in any way (running Debian and MS-DOS). Not a problem in sight. I don't have to dip my computer in liquid nitrogen or anything. In fact, I have never actually seen a machine crash because of overheating.
    Cooling is the least of my worries as far as my computer goes.
  • This kind of research sounds like a waste of time -- given how fast conventional processors are accelerating (Moore's Law and all), do we even need molecular computing?

    This is probably a troll but what the heck, right?

    Moore's Law has a physical limit; eventually (if we want to keep it going) we've got to move the essential elements of computers to the smallest levels possible. We'll need molecular computing when current silicon-based techniques hit their limit.

    By your logic we should never have moved away from vacuum tubes. ;)
  • Bill Joy is a technoloon; not exactly a new thing for the internet to have either, I might add.
    The chances that anyone will be able to do anything like that are at least 9.4562334x10^135 to one, and that's generous.
    Hell, we haven't even landed a man on Mars, created a permanent moon base, developed superintelligent cyborgs, or done virtually any of the things that sci-fi pundits have claimed should have been done.
    Let Picard and the boys worry about the evil nanites for now, and when we get there (maybe around the 45th century) we'll take a new look, OK?
  • When I first saw this article I thought it was about quantum computers. Instead it's about regular computers that work on a molecular level. The article further said computers, not memory, which added to the confusion.

    Was anyone else confused by this?
  • We get to spend trillions of dollars just to go back to analog again. Gee, thanks.
  • I dunno, but maybe they're abandoning the classic von Neumann architecture here and heading towards something more like a neural net. Such an architecture might be more efficient for the interaction between molecules.
  • The reference to working at room temperature refers to the fact that this stuff was previously [rice.edu] only demonstrated at very cold temperatures (60 K, or about -350F). This is a significant achievement towards getting it stable at even higher temperatures. Not unlike not so long ago, when computers had to be placed in air-conditioned rooms.

    given how fast conventional processors are accelerating (Moore's Law and all), do we even need molecular computing?

    Moore's law says nothing about the actual technology used to achieve the gains. This might very well be what we need in order to keep up with Moore's law once the current theoretical limits of conventional technology are neared.
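
    A hedged back-of-the-envelope on where that limit sits (starting feature size, halving period, and atomic endpoint are all rough assumptions):

      import math

      feature_nm = 180.0         # assumed state of the art, circa 2000
      atom_nm    = 0.5           # order of silicon's lattice spacing
      years_per_halving = 3.0    # density doubling every ~18 months means linear
                                 # dimensions halve roughly every 3 years

      halvings = math.log2(feature_nm / atom_nm)
      print(f"~{halvings:.1f} halvings, i.e. ~{halvings * years_per_halving:.0f} years "
            "until features approach single-atom scale")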

  • by Anonymous Coward

    This is NOT a joke! Click here: http://britneyspears.ac/lasers.htm [britneyspears.ac]

  • Wouldn't molecular computers be a lot less stable than our conventional computers? The article talks about working at room temperature -- it sounds like if the temperature changed, the computer would stop working, or might even lose data. Not only is this a pain to work with it, it would be easy to sabotage someone else's computer -- just heat it up or cool it, and, presto, it stops working!

    How is this any different from conventional technology? If you increase the temperature of a CPU sufficiently, it will begin to malfunction or not function at all (this is often encountered by overclockers). Molecular computers might be more vulnerable to this, but I don't think it's that big of a deal. Just get a cooling/heating system which always adjusts the equipment to the correct temperature, or simply turns everything off if it is too hot or too cold.
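
    Something like this watchdog, sketched in Python (the safe range is an assumption and the sensor read is a canned stand-in, not a real hardware interface):

      import itertools
      import time

      SAFE_RANGE = (10.0, 45.0)   # assumed operating window, degrees Celsius

      _fake_temps = itertools.chain([25.0, 30.0, 41.0, 52.0], itertools.repeat(52.0))

      def read_temperature() -> float:
          """Hypothetical sensor read; here a canned ramp so the demo terminates."""
          return next(_fake_temps)

      def watchdog(poll_seconds: float = 0.1) -> None:
          while True:
              t = read_temperature()
              if not (SAFE_RANGE[0] <= t <= SAFE_RANGE[1]):
                  print(f"{t:.1f} C out of range; powering down to protect the memory")
                  break
              print(f"{t:.1f} C ok")
              time.sleep(poll_seconds)

      watchdog()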

  • See, that's the main difference. Realistically, all the easy inventions, the kind Jimmy could easily pull off at home, have already been taken and done away with since about the 1970s.
    Now getting that new invention takes trillions of dollars and a research lab full of people who are all extremely overeducated.
    And by the way it's called "inventor" instead of "researcher" because inventing can be done by the average man and "research" takes a hell of a lot more.
    So actually employing logic we can see that this is in fact a waste of time.
    1. The longer something takes, the more likely it is that failure will occur.
    a. Exhaustive checking usually means that the object of study isn't something that can easily be analyzed, and is therefore less likely to be found.
    2. Doing a process multiple times proves this even faster.
    a. Taking 1 and 1a together, we see that this is therefore even less likely to be found.
    3. Computing power is perfectly sufficient now to take care of current noncommercial needs, or those of people who have been, shall we say, "convinced" (or more properly coerced) to buy crap.
    4. Better, more efficient software/hardware will result from continued development of current-gen stuff.
    5. In conclusion this is a waste of time.
  • Yeah, that's a good point. My bad. I just kinda think that lower-level languages will fundamentally change in the way they deal with and store information. In addition, the limited memory of single nanosites (to use a Stephenson word) means that distributed computing would reach a new level of importance, and tying-together systems like Jini will need to be reworked.

    In addition, I/O will be a bitch.
  • by iElucidate ( 67873 ) on Saturday August 19, 2000 @08:24PM (#842679) Homepage
    My computer is currently not powerful enough to do real-time voice and video processing, determine from my tone and facial movements what I want it to do, provide intelligent assistance, or do even rudimentary AI tasks. All home computers do is flashy graphics -- I want the power to HELP me with something IMPORTANT. I want the ability to have thousands of nano-sized devices implanted in my skin to store critical information. I want my own Personal Area Network (PAN) that is completely secure and stores a record of all my experiences, emotions, health readings, and physical stimuli. I want a complete, real-time-searchable, intelligent database system that monitors what I do and prompts me with intelligent and relevant information to help me with the task at hand.

    Do we have the processing power to do this? Yes. Could it fit in my pocket? No. That is why this is important: every step towards nano-scale technology is a leap forward towards a future of prevalent, pervasive technology that is much more flexible than what is currently available.

    THAT is why this interests me.
  • It sounds like it'll be a long time before this kind of technology is available mainstream. But just imagine if they can synchronize the timing and release quantum computers with molecular RAM. Simply wow.
  • I messed with your Mother's building blocks last night.

    Sorry, couldn't resist.

  • by Cyclone66 ( 217347 ) on Saturday August 19, 2000 @02:32PM (#842682) Homepage Journal
    Great, now biological and technological warfare can converge :) Nerve gas won't just be for humans! *lol*
  • Shit. Then we'll need to know stoichiometry in order to match DIMM pairs on the motherboard!
  • by iElucidate ( 67873 ) on Saturday August 19, 2000 @02:58PM (#842684) Homepage
    I was sure that Eric et al. of the Foresight Institute [foresight.org] had already designed and built molecular switches. In addition, the recent experimentation on buckyballs [everything2.com] may herald the way toward better switching and gears.

    I have this theory that with nano, EE/CS will be in less demand, and mechanical engineers will be forced to reexamine rod logic if they want the good jobs. However, we're really moving at a snail's pace here, and haven't had any real developments in a while. Nano-saxophones [mindwire.org] for Pres. Clinton! ;-)

  • Another problem I see with this early version of memory is that the individual cells can be switched "hundreds of times". Better than once, yes, but a realistic memory cell needs to switch billions, if not trillions of times during its expected life. We have a way to go here.

    True: that's also the reason we don't use flash memory as RAM. It becomes unreliable after that many switches, but it would give the advantage of not needing to be refreshed (which would take away the need to boot your computer).

    But maybe that's old thinking: what if the individual cells become so abundant that you just discard them and use others after a hundred switches? These are molecules; their production is not comparable to present-day RAM transistors. A new "memory paradigm" might be necessary: other ways of organizing and working with memory. I don't think the RAM/HD split would still be relevant with this technology. (A minimal sketch of that discard-and-remap bookkeeping follows after the link below.)

    Anyway, another good article on molecular electronics (not only about memory though) from Wired:

    http://www.wired.com/wired/archive/8.07/moletronics_pr.html
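
    And the promised sketch of the discard-and-remap bookkeeping (the endurance figure and the class are made up for illustration): count writes per cell, and once a cell hits its limit, quietly swap in a spare.

      class RetiringMemory:
          """Toy model: each cell tolerates only a few hundred writes, so worn-out
          cells are retired and replaced from a large pool of spares."""

          def __init__(self, size: int, spares: int, endurance: int = 300):
              self.endurance = endurance
              self.cells = [{"value": 0, "writes": 0} for _ in range(size + spares)]
              self.map = list(range(size))             # logical -> physical
              self.free = list(range(size, size + spares))

          def write(self, addr: int, value: int) -> None:
              cell = self.cells[self.map[addr]]
              if cell["writes"] >= self.endurance:     # worn out: remap to a spare
                  self.map[addr] = self.free.pop()
                  cell = self.cells[self.map[addr]]
              cell["value"] = value
              cell["writes"] += 1

          def read(self, addr: int) -> int:
              return self.cells[self.map[addr]]["value"]

      mem = RetiringMemory(size=4, spares=1000)
      for i in range(10_000):                          # far beyond one cell's endurance
          mem.write(0, i)
      print(mem.read(0), "- still correct after 10,000 writes to one logical address")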

  • by Chemical Serenity ( 1324 ) on Saturday August 19, 2000 @02:38PM (#842686) Homepage Journal
    I wonder how long it'll be 'til we start buying memory by the Mol instead of the Meg.

    (Yes, I'll take a couple of those 6.0225x10^23 SIMMs, please... quick conversion below.)

    --
    rickf@transpect.SPAM-B-GONE.net (remove the SPAM-B-GONE bit)
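
    For what it's worth, the quick conversion promised above, assuming one bit stored per switch molecule (an assumption, not a figure from the article):

      AVOGADRO = 6.022e23            # molecules per mole

      bits = AVOGADRO                # one bit per molecule (assumed)
      terabytes = bits / 8 / 1e12
      print(f"one mole of one-bit switches is roughly {terabytes:.1e} TB")   # ~7.5e10 TB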

  • by devphil ( 51341 ) on Saturday August 19, 2000 @02:59PM (#842687) Homepage

    I agree with you, I just want to expand on something:

    ...memory is only part of a computer. Yes, an important part, but only a part.

    And it's the part that can benefit the most from this kind of thing. Not for size, but for speed. The speed of memory access is often the controlling factor in how fast a program runs. It doesn't matter if you just bought a 6000 terahertz CPU; if your memory is slow, your processor only waits more.

    Currently CPUs are on the order of ten times as fast as the processor, and the gap is increasing. We need faster memory more than we need faster anything else. (Well, maybe faster pizza delivery.)
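
    To put assumed round numbers on that (a 1 GHz clock and ~60 ns main-memory latency, illustrative rather than measured):

      cpu_hz = 1_000_000_000            # assumed 1 GHz processor
      memory_latency_s = 60e-9          # assumed ~60 ns to service a main-memory access

      stall_cycles = cpu_hz * memory_latency_s
      print(f"one trip to main memory costs roughly {stall_cycles:.0f} CPU cycles of waiting")
      print(f"double the clock to 2 GHz and it becomes {2 * stall_cycles:.0f} cycles")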

  • by vertical-limit ( 207715 ) on Saturday August 19, 2000 @03:01PM (#842688)
    Wouldn't molecular computers be a lot less stable than our conventional computers? The article talks about working at room temperature -- it sounds like if the temperature changed, the computer would stop working, or might even lose data. Not only is this a pain to work with it, it would be easy to sabotage someone else's computer -- just heat it up or cool it, and, presto, it stops working!

    There just seem to be a whole lot of risks involved with molecular computing. Wouldn't it be easy for the molecules to be jarred out of position? If a part of the computer breaks, would all the molecules inside be lost? This kind of research sounds like a waste of time -- given how fast conventional processors are accelerating (Moore's Law and all), do we even need molecular computing?

"One lawyer can steal more than a hundred men with guns." -- The Godfather

Working...