UCLA Chemists Progress Toward Molecular Computers
Concepvelo writes "It is very refreshing to see Professor Stoddart (my organic chem professor last quarter) and Pat Collier making progress toward molecular computers. Stoddart's team has created molecules that can be switched hundreds of times, where before they could only be switched once. They are saying that the creation of molecular RAM is one step closer because all of this can be done at room temperature. The article is here."
Re:Is this really going to work? (Score:2)
And this doesn't happen to existing computers? We're putting so much energy into such a little space that the heat is phenomenal. 50 watts of heat over a tiny 50mm x 50mm die? That's incredible. If the case temperature rises above about 90°F or so, most systems get /very/ unstable. Most modern computers are (or should be!) kept below room temperature - at around 68-72°F.
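As a rough sketch of the numbers involved (using the 50 W and die-size figures above, which are assumptions from this comment, not measurements):

```python
# Back-of-envelope power density for the figures above (assumed, not measured).
power_watts = 50.0     # claimed heat output
die_side_mm = 50.0     # the 50mm x 50mm surface mentioned above

area_cm2 = (die_side_mm / 10.0) ** 2   # 25 cm^2
density = power_watts / area_cm2       # watts per square centimeter

print(f"{density:.1f} W/cm^2")  # -> 2.0 W/cm^2 dumped into a tiny space
```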
At the rate we are accelerating, radical new solutions will need to be devised to keep up with the heat output of newer chips.
Personally, I think a molecular computer would be easier to manage than our own. If it were possible to assemble a system that could function at -20°C, I would try to get on board their project to help with the HVAC equipment... it would be more energy efficient in the long run.
Re:Its a start, but only a start (Score:2)
Yes, but one switch compared to hundreds is a factor of 10^2 - two orders of magnitude. We're talking about technology that advances exponentially.
I don't think it will be long before these are useful. A digital camera, for example, would be able to make use of memory chips built with the existing technology. 300-picture packs of RAM -- once a process is introduced and prices come down, it may not even be necessary to improve the technology - though I'm sure it will be.
meaning? (Score:1)
Don't get your hopes up (Score:3)
-- Moondog
Re:They don't really state "hundreds" is the limit (Score:1)
That's unnecessary (Score:1)
Disk access? (Score:1)
Re:Humans are the most advanced species progressing (Score:1)
Mirror (Score:2)
Re:They don't really state "hundreds" is the limit (Score:2)
prick. (Score:1)
Re:Is this really going to work? (Score:1)
Re:Its a start, but only a start (Score:2)
Actually, if the memory is fast enough, you could use individual memory bits as logic gates quite easily - which means that the only thing in a computer that WOULDN'T be memory would be the connections inside the chip and the connections to the external world.
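A minimal sketch of the idea in Python (purely illustrative): a gate is just a tiny truth table stored in memory and indexed by its inputs, which is the same trick FPGA lookup tables use.

```python
# A logic gate implemented as a memory lookup: store the truth table,
# index it by the gate's inputs. FPGA lookup tables (LUTs) work this way.

AND_TABLE = [0, 0, 0, 1]   # outputs for inputs (a, b) = 00, 01, 10, 11
XOR_TABLE = [0, 1, 1, 0]

def lut_gate(table, a, b):
    """Evaluate a 2-input gate by reading a single memory cell."""
    return table[(a << 1) | b]

assert lut_gate(AND_TABLE, 1, 1) == 1
assert lut_gate(XOR_TABLE, 1, 0) == 1
```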
The really real best computer (Score:1)
Re:Not a problem for me (Score:2)
Re:Hasn't this been done (Score:1)
rLowe
Esperutus (Score:1)
Re:How can that be a good thing? (Score:1)
Change my oil? No thanks.
--
Obvious typo (Score:2)
Re:Disk access? (Score:2)
Re:Vacuum tube computers were for the super rich (Score:1)
Why don't we concentrate on actually making the most of our current software/hardware before we decide to make bigger and better stuff?
By your logic researchers should never have tried to find any tech better than vacuum tubes, in which case you'd never be able to post on Slashdot. You say vacuum tube computers were for the super rich... not really true: they were very complex and not capable of being mass-manufactured at the time (each computer being the size of a small house) and consumed hundreds of kilowatts of electricity, so they were only for the institutions that felt they held enough promise of future usefulness. They were initially purely for research (research into computing science, mind you, not research into other things utilizing computers). You should always have visionaries looking into the future and dreaming up extremely long-term future tech whilst others work on gradual improvements to current tech. Without such a balance, society would never have looked into gasoline (internal combustion) engines; maybe we would have just constantly tweaked and refined the horse-and-buggy setup! And we definitely never would have invented airplanes or dirigibles - why not perfect the automobile before worrying about such nonsense as contraptions that can travel through the air?
In theory what we have now is good enough. You don't see rocket-powered cars or personal helicopters in mass production and in wide use, do you?
Ignoring the fact that the two technologies you mention are even less practical than our current automobile-based existence, I personally do need more powerful hardware if I'm ever going to see a robotic butler/maid in my house while I'm off travelling to some far-away cosmos! If you are happy eating your raw meat and grains and bathing in cold water, fine, but I'm still going to use the Plasma-Stove(tm).
And how do you know that the interface of the future isn't going to be one that comes as a result of new hardware tech that some engineer dreams up? I can envision a helmet-like device that reads neural activity in my brain (patent pending in too many countries to list - IOW piss off I thought it up first) and types stuff on the page so I don't have to sit here for 5 mins and respond to your post, correcting all my damned typos! But that interface is dependent on some hardware whose technology isn't available yet. And maybe we need smaller, more powerful processors to embed into the helmet to decode all of the neural activity in my head (believe me it's a hotbed of synaptic storms)... you getting the picture?
----
Its a start, but only a start (Score:2)
As those of us who work with computers know, memory is only part of a computer. Yes, an important part, but only a part. As the article points out, getting these molecular switches to work is only part of memory, for that matter. Before we will be seeing NanoRAM (NRAM)™ on store shelves, they have to figure out how to wire lots of these little beasties together. But it would be nice to have terabytes of memory in a quarter-sized computer.
Another problem I see with this early version of the memory is that the individual cells can only be switched "hundreds of times". Better than once, yes, but a realistic memory cell needs to switch billions, if not trillions, of times during its expected life. We have a way to go here.
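To put "hundreds" in perspective, here's a rough sketch of how many writes a busy RAM cell might see over its life (the write rate and lifetime are made-up assumptions, purely for illustration):

```python
# How many switches might a working RAM cell see? Assumed, illustrative numbers.
writes_per_second = 1_000              # a modestly busy cell (assumption)
seconds_per_year = 60 * 60 * 24 * 365
lifetime_years = 5

total_writes = writes_per_second * seconds_per_year * lifetime_years
print(f"{total_writes:.2e} writes")    # ~1.6e11 -- hundreds of billions
```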
Gonzo
Re:hmmm... (Score:3)
--
rickf@transpect.SPAM-B-GONE.net (remove the SPAM-B-GONE bit)
They don't really state "hundreds" is the limit. (Score:3)
The author of the article wrote:
This doesn't necessarily mean that they stop working after that point. It's reasonable to assume that a molecular device that can perform a repetitive function can continue to repeat that function.
If I were to say that I've rebooted my computer hundreds of times, you wouldn't assume that I can no longer boot it.
Also, it helps to pay closer heed to the quoted source than to the author's text. The author quoted:
Molecules don't break down from wear in quite the same manner as larger-scale components. Assuming that such switches are properly housed, their lifespan would be affected by things like changing environmental conditions (EM, temp, etc.) but not by repetitive use.
The part I objected to is the following:
This just doesn't ring true. First, they are far from actually developing a "molecular computer". How could they be working on one that does anything, let alone a learning one?
This sounds like hype... call it "text-candy" for lack of a better term. The principles for building a computer are not really changing here (switches, logic gates, etc.), so developing a "learning molecular computer" has two steps:
They are currently on step one....
Re:Is this really going to work? (Score:3)
Ok, you begged the response:
Only if you were running windows/linux/freebsd/dos/os2.... </joke>
---
Hh... (Score:1)
--
Kiro
It's happening already (Score:3)
The closest they got was "nano-sized computers" and an institute called Nanosystems.
I don't think this is a coincidence. After a lot of media coverage of stories like Bill Joy's paper [wired.com] (which I read and liked very much, but didn't totally agree with), the term nanotechnology has been given a negative ring.
Then you have all the nonsense far less intelligent people have said, plus some very cool renditions like Deus Ex, and what you get is instant holocaust. Soon nanotechnology will sound as bad as nuclear reactors and genetic engineering do to a lot of people (they all sound fine to me).
Anyhow, congrats to the research team.
Re:Is this really going to work? (Score:3)
Well of course we do. As things get smaller and smaller, it's going to come down to individual atoms and molecules anyway.
Even if current silicon transistor technology can be scaled much further down, it will still take far more molecules per device than this (possibly millions), simply because it's a technology that was not designed to work with individual molecules.
Vacuum tube computers were for the super rich (Score:1)
Or standardize on some type of interface for the future and then work on getting stuff to fit into those interfaces?
Oh, and in theory, things like mass distribution are supposed to drive prices down and the like. I don't see a computer becoming as ubiquitous as, say, a bicycle unless it's crippled or secondhand.
Why do you need hardware to constantly increase in power anyway? In theory what we have now is good enough. You don't see rocket-powered cars or personal helicopters in mass production and in wide use, do you?
The need? (Score:1)
You know that most of the things people actually use their computers for could be handled by QNX and a 486, right?
Get rid of the bloat and the rest will follow. People should ask themselves critically: "Do I really need to be a pawn in a rich man's game?"
Not a problem for me (Score:1)
Also, I think most modern hardware is really trash compared to the old stuff I have (seriously). Take my computers, all built from components in the 386/486 class range: not one hardware failure in 3-5 years, and they were bought used! Now I hear about people ruining their motherboards by messing around with them, having constant hard disk failures, memory parity errors, dying RAM, etc.
My computer has been exposed to temperatures in my house that I was sure were above 110°F, and it didn't crash or fail in any way (running Debian and MS-DOS). Not a problem in sight. I don't have to dip my computer in liquid nitrogen or anything. In fact, I have never actually seen a machine crash from overheating.
Cooling is the least of my worries as far as my computer is concerned.
Re:Is this really going to work? (Score:1)
This is probably a troll but what the heck, right?
Moore's Law has a physical limit; eventually (if we want to keep it going) we've got to move the essential elements of computers down to the smallest possible scale. We'll need molecular computing when current silicon-based techniques hit their limit.
By your logic we should never have moved away from vacuum tubes.
Doubtful (Score:1)
The chances that anyone will be able to do anything like that are at least 9.4562334x10^135 to one, and that's generous.
Hell, we haven't even landed a man on Mars, created a permanent moon base, developed superintelligent cyborgs, or done virtually any of the things that sci-fi pundits claimed we should have done by now.
Let Picard and the boys worry about the evil nanites for now, and when we get there (maybe around the 45th century) we'll take a fresh look, OK?
Confusing... (Score:1)
Was anyone else confused by this?
Oh goodie! (Score:1)
Re:They don't really state "hundreds" is the limit (Score:1)
Re:Is this really going to work? (Score:1)
The reference to working at room temperature refers to the fact that this stuff was previously [rice.edu] only demonstrated at very cold temperatures (60 K, about -350°F). This is a significant step toward getting it stable at even higher temperatures. Not unlike the days, not so long ago, when computers had to be placed in air-conditioned rooms.
given how fast conventional processors are accelerating (Moore's Law and all), do we even need molecular computing?
Moore's Law says nothing about the actual technology used to achieve the gains. This might very well be what we need in order to keep up with Moore's Law once the theoretical limits of conventional technology are reached.
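For a sense of when those limits arrive, here's a crude sketch (all figures are rough assumptions: a ~130 nm process for today's state of the art and ~0.2 nm as the order of silicon's atomic spacing):

```python
import math

# When does straight-line Moore's-law shrinking reach single atoms?
# All figures below are rough assumptions, purely for illustration.
feature_nm = 130.0        # roughly state of the art today (assumption)
atom_nm = 0.2             # order of the silicon lattice spacing
years_per_doubling = 2.0  # density doubles -> linear size shrinks by sqrt(2)

linear_shrink = feature_nm / atom_nm
doublings = 2 * math.log2(linear_shrink)   # density doublings needed
print(f"~{doublings * years_per_doubling:.0f} years of shrinking left")
```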
Britney Spears Guide to Semiconductors (Score:1)
This is NOT a joke! Click here: http://britneyspears.ac/lasers.htm [britneyspears.ac]
Re:Is this really going to work? (Score:1)
How is this any different from conventional technology? If you increase the temperature of a CPU sufficiently, it will begin to malfunction or stop functioning entirely (as overclockers often discover). Molecular computers might be more vulnerable to this, but I don't think it's that big of a deal. Just get a cooling/heating system which always adjusts the equipment to the correct temps, or simply turns everything off if it is too hot or too cold.
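A minimal sketch of that control logic (the thresholds are illustrative assumptions, not real molecular-memory specs): keep the device inside a temperature window, and cut power outside hard limits.

```python
# Keep the device inside a temperature window; cut power outside hard limits.
# All thresholds here are illustrative assumptions.

COOL_ON, HEAT_ON = 40.0, 10.0           # deg C: start cooling / heating here
SHUTDOWN_HI, SHUTDOWN_LO = 60.0, -5.0   # deg C: hard power-off limits

def control(temp_c):
    if temp_c > SHUTDOWN_HI or temp_c < SHUTDOWN_LO:
        return "power off"
    if temp_c > COOL_ON:
        return "cooling on"
    if temp_c < HEAT_ON:
        return "heater on"
    return "idle"

for t in (-10, 5, 25, 45, 65):
    print(t, "->", control(t))
```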
Orville and Wilbur Wright didn't need PhDs (Score:1)
Now getting a new invention takes trillions of dollars and a research lab full of extremely overeducated people.
And by the way, it's called an "inventor" rather than a "researcher" because inventing can be done by the average man, while "research" takes a hell of a lot more.
So, actually employing logic, we can see that this is in fact a waste of time:
1. The longer something takes, the more likely it is that failure will occur.
a. Exhaustive checking usually means that the object of study isn't something that can easily be analyzed, and it is therefore less likely to be found.
2. Multiple groups working on a problem prove this even faster.
a. Taking 1 and 1a together, we see that this is even less likely to be found.
3. Computing power is perfectly sufficient now to take care of current noncommercial needs, or those of people who have been, shall we say, "convinced" (or more properly, coerced) into buying crap.
4. Better, more efficient software/hardware will result from continued development of current-generation stuff.
5. In conclusion, this is a waste of time.
Re:Hasn't this been done (Score:1)
In addition, I/O will be a bitch.
Re:Orville and Wilbur Wright didn't need PhDs (Score:3)
Do we have the processing power to do this? Yes. Could it fit in my pocket? No. That is why this is important - every step toward nano-scale technology is a leap toward a future of prevalent, pervasive technology that is much more flexible than what is currently available.
THAT is why this interests me.
Long time (Score:1)
Re:Dangerous (Score:1)
Sorry, couldn't resist.
War! (Score:3)
Re:hmmm... (Score:1)
Hasn't this been done (Score:4)
I have this theory that with nano, EE/CS will be in less demand, and mechanical engineers will be forced to reexamine rod logic if they want the good jobs. However, we're really moving at a snail's pace here, and haven't had any real developments in a while. Nano-saxophones [mindwire.org] for Pres. Clinton! ;-)
Re:Its a start, but only a start (Score:2)
Another problem I see with this early version of the memory is that the individual cells can only be switched "hundreds of times". Better than once, yes, but a realistic memory cell needs to switch billions, if not trillions, of times during its expected life. We have a way to go here.
True: that's also the reason we don't use flash memory as RAM; it becomes unreliable after that many switches, but it would have the advantage of not needing to be refreshed (which would take away the need to boot your computer).
But maybe that's old thinking: what if the individual cells become so abundant that you just discard them and use others after a hundred switches? These are molecules; their production is not comparable to that of present-day RAM transistors. A new "memory paradigm" might be necessary: other ways of organizing and working with memory. I don't think the RAM/HD configuration would still be relevant with this technology.
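A minimal sketch of what that "discard and remap" paradigm could look like (the class, the 100-switch budget, and the sizes are all hypothetical, just to make the idea concrete):

```python
# Hypothetical "disposable cell" memory: track writes per cell and remap a
# logical address to a fresh spare once a cell exhausts its switch budget.

SWITCH_BUDGET = 100   # assumed endurance per molecular cell

class DisposableRAM:
    def __init__(self, logical_size, spare_pool):
        self.mapping = list(range(logical_size))    # logical -> physical cell
        self.writes = [0] * (logical_size + spare_pool)
        self.data = [0] * (logical_size + spare_pool)
        self.next_spare = logical_size

    def write(self, addr, value):
        phys = self.mapping[addr]
        if self.writes[phys] >= SWITCH_BUDGET:      # cell worn out: retire it
            phys = self.next_spare
            self.next_spare += 1
            self.mapping[addr] = phys
        self.data[phys] = value
        self.writes[phys] += 1

    def read(self, addr):
        return self.data[self.mapping[addr]]

ram = DisposableRAM(logical_size=4, spare_pool=16)
for i in range(250):          # burns through two cells' budgets at address 0
    ram.write(0, i)
assert ram.read(0) == 249
```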
Anyway, another good article on molecular electronics (not only about memory though) from Wired:
http://www.wired.com/wired/archive/8.07/moletronics_pr.html
hmmm... (Score:4)
(Yes, I'll take a couple of those 6.0225x10^23 SIMMs, please...)
--
rickf@transpect.SPAM-B-GONE.net (remove the SPAM-B-GONE bit)
But it's one of the parts that needs the most work (Score:5)
I agree with you, I just want to expand on something:
And it's the part that can benefit the most from this kind of thing. Not for size, but for speed. The speed of memory access is often the controlling factor in how fast a program runs. It doesn't matter if you just bought a 6000 terahertz CPU; if your memory is slow, your processor just spends more time waiting.
Currently CPUs are on the order of ten times as fast as memory, and the gap is increasing. We need faster memory more than we need faster anything else. (Well, maybe faster pizza delivery.)
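A rough sketch of why the gap hurts, using the standard average-memory-access-time formula (the latencies and miss rate below are assumptions for illustration):

```python
# Average memory access time (AMAT) with a cache: the classic
# hit_time + miss_rate * miss_penalty estimate. Assumed, illustrative numbers.

hit_time_cycles = 1.0        # cache hit (assumption)
miss_penalty_cycles = 100.0  # trip out to main memory (assumption)
miss_rate = 0.05             # 5% of accesses miss (assumption)

amat = hit_time_cycles + miss_rate * miss_penalty_cycles
print(f"AMAT = {amat:.1f} cycles per access")  # 6.0 -- the CPU mostly waits
```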
Is this really going to work? (Score:3)
There just seem to be a whole lot of risks involved with molecular computing. Wouldn't it be easy for the molecules to be jarred out of position? If a part of the computer breaks, would all the molecules inside be lost? This kind of research sounds like a waste of time -- given how fast conventional processors are accelerating (Moore's Law and all), do we even need molecular computing?