Science Technology

How Small Can Computers Get? Computing in a Molecule

ScienceDaily on what the future might bring for atomic-scale computing: "Joachim, the head of the CEMES Nanoscience and Picotechnology Group (GNS), is currently coordinating a team of researchers from 15 academic and industrial research institutes in Europe whose groundbreaking work on developing a molecular replacement for transistors has brought the vision of atomic-scale computing a step closer to reality. Their efforts, a continuation of work that began in the 1990s, are today being funded by the European Union in the Pico-Inside project. ... The team has managed to design a simple logic gate with 30 atoms that performs the same task as 14 transistors, while also exploring the architecture, technology, and chemistry needed to achieve computing inside a single molecule and to interconnect molecules."
This discussion has been archived. No new comments can be posted.

  • by wjh31 ( 1372867 ) on Tuesday December 30, 2008 @05:50AM (#26266579) Homepage
    Rather than the whole computer, I see no reason why consumer computers ever need to get any smaller than a phone if you want them portable, or small enough to be fitted to the back of a screen for desktops.
    • by Mozk ( 844858 ) on Tuesday December 30, 2008 @05:53AM (#26266591)

      Smaller transistors means more efficient transistors. It's not just about size.

      • by Hylk0r ( 1295086 ) on Tuesday December 30, 2008 @07:31AM (#26266947)
        Not only that, but it also means you can have millions of (parallel) processors on a tiny chip, which results in more performance.
        • You seem to think computers operate using a combination of "processors" and "magic". You are mistaken.

          • You can't really quote "magic" if he didn't say it.
          • Re: (Score:3, Funny)

            by IceCreamGuy ( 904648 )

            You seem to think computers operate using a combination of "processors" and "magic". You are mistaken.

            You just rocked my world view.

          • Re: (Score:1, Funny)

            by Anonymous Coward
            So they just use magic then?
          • You seem to think computers operate using a combination of "processors" and "magic". You are mistaken.

            NOT LISTENING TO YOU!!! The magic smoke is REAL!!! It's real magic!!! And love, the secret ingredient is love, damnit!!!

            • And love, the secret ingredient is love, damnit!!!

              Funny, my computers work just fine. <Kicks cat&$@#*% NO CARRIER

          • by genner ( 694963 )

            You seem to think computers operate using a combination of "processors" and "magic". You are mistaken.

            No I think a video card goes in there somewhere.

      • by renoX ( 11677 )

        >Smaller transistors means more efficient transistors.

        Not always.
        Those molecule-sized transistors are much more efficient, yes, but this rule no longer holds for the transistors in our current CPUs.

        That's why CPU frequency no longer increases: it used to be that each generation of transistors was smaller and more efficient, so you could make the CPU run faster, but once that stopped, the increase in CPU frequency stopped as well.

        • That's why CPU frequency no longer increases

          Efficiency != speed. While speed has remained stagnant over the past several years, power consumption and heat dissipation have gone down or remained the same while computations/second have still steadily gone up, which means higher efficiency as less energy is used to do the same amount of work. If you're still measuring the quality of a processor by its speed, then you have a lot of catching up to do since 2003 when the GHz race ceased to be relevant.
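          To make that distinction concrete, here is a tiny sketch with made-up numbers (both chips are hypothetical, not real parts); efficiency here means work done per joule, independent of clock speed:

```python
# Hypothetical chips at the same clock speed, very different efficiency.
chips = {
    # name: (operations per second, watts drawn) -- invented figures
    "2003-era 3 GHz desktop CPU": (6e9, 90.0),
    "modern 3 GHz laptop CPU":    (5e10, 15.0),
}

for name, (ops_per_sec, watts) in chips.items():
    # efficiency = useful work per unit of energy (operations per joule)
    print(f"{name}: {ops_per_sec / watts:.2e} ops per joule")
```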

        • Re: (Score:1, Informative)

          by Anonymous Coward

          The problem is synchronization of the electronics. When you reach a certain speed, the speed of electron flow comes into play. Likewise, the shape of the ramp that denotes the high/low states becomes an issue. When you 'scope what you believe to be a square wave, you'll see that the edge is not the sharp 90-degree edge one tends to think these waveforms have.

      • Smaller transistors means more efficient transistors. It's not just about size.

        It also means noisy circuits.

      • Smaller transistors means more efficient transistors. It's not just about size.

        That's not what SHE said.
    • Claytronics, for example, needs further miniaturization. Tiny dust specks that communicate with other specks to Barbapapa-build any product you'll ever need.
    • In the 1960s (Score:5, Interesting)

      by Kupfernigk ( 1190345 ) on Tuesday December 30, 2008 @06:10AM (#26266639)
      There was, as I recall, a TV programme in the UK called "Tomorrow's World" in which the presenter once prophetically ridiculed the idea of handheld computers. After all, what could you possibly use them for?

      Combine this kind of idea with recent research on PNA (a more robust molecule than DNA that shares many of its properties) and the long-term prospects could be very interesting - self-assembling memory, for instance.

      • But what would one do with a handheld that's smaller than today's? Seriously, a lot of those devices have already hit the point where they're difficult for some of us to use due to the tiny buttons. I'm not sure that shrinking the size much more is going to make any sense for most people. Sure, you can change the interface, possibly to voice activation or a direct neural interface, but tiny electronics are easy to lose.

        I'm sure there's reasons why shrinking the electronics within the shell is desirable, but smal

        • You need to turn in your geek card.

          A computer small enough to process audio, inserted inside the ear. A reprogrammable Bluetooth headset that sits between your skull and skin. Medical diagnostic equipment that you swallow or that is injected into the bloodstream. Lightweight glasses with MP3 playback (yes, they exist now, but only in bulky frames).

          Of all of those, a subdermal one, reprogrammable by light flashes (like the old MSFT watch), would be awesome.

        • Re:In the 1960s (Score:5, Insightful)

          by Elladan ( 17598 ) on Tuesday December 30, 2008 @01:03PM (#26269521)

          Broaden your vision. This is about making smaller components.

          What can you do with smaller components? Well, right away, you can put more stuff in the case. Your iphonanopalmtop thing can have a foldout screen and keyboard, or a bigger battery, or it can simply be lighter. I don't know about you, but I find an iPhone a bit hefty.

          Now, if you look beyond next week, smaller components let you do entirely new things. You think technology is sufficient now to put a computer in a palmtop? Whatever, dude.

          I want a computer in my eyeglasses. Optically corrected screens overlaying my vision. High resolution. And I want them to weigh the same as a normal pair of glasses. Don't forget to throw in a video camera for good measure.

          Can we build something like that now? Of course not. That sort of thing today is either a huge bulky piece of headgear, or it's moderately bulky and has a terrible display. We need better components: much smaller, much lower power, faster.

          Don't ever say we've reached the limits of useful computer technology. Until you're plugged in directly via your visual cortex and have a robot butler who brings you waffles in the morning, we haven't even reached the limits of uses we can already imagine.

          • I want a computer in my eyeglasses
             
            Also streaming pr0n

          • I'll just settle for getting the flying cars they've been promising for the last 50 years. Or just a direct brain interface to a standard PC/Mac and a few applications that can take advantage of it.

            Imagination's nice and all, but it frequently allows the marketing droids to lead us around by the nose.

          • I don't know about you, but I find an iPhone a bit hefty.

            I think you need to start workin' out, mate.

    • by locster ( 1140121 ) on Tuesday December 30, 2008 @06:45AM (#26266807)

      OK, but what if you want to put them inside nanobots designed to target and kill cancer cells, or a zillion other applications made possible by smaller and less power-hungry computation? Smaller also means more powerful computers at the 'classic' scale, for which we know there is demand right now by way of the very existence of supercomputers.

      • Smaller also means more powerful computers at the 'classic' scale

        That's a major understatement. Assuming we're working with carbon:

        (6*10^23 atoms / mole) * (1 mole / 12g) * (14 transistors / 30 atoms) * (1 core2duo / 1.5*10^8 transistors) = 155 trillion Core 2 Duo processors per gram.

        Even if there's a billion-to-one overhead for interconnects and so on, that's still 150,000 processors per gram. Yeah, that counts as more powerful.
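        For anyone who wants to redo that arithmetic, here is the same back-of-the-envelope calculation as a quick sketch (the 14-gates-per-30-atoms ratio is from the article; carbon at 12 g/mol and the ~1.5e8 transistors per Core 2 Duo are the parent's assumptions):

```python
# Back-of-the-envelope check of the processors-per-gram figure above.
AVOGADRO = 6.022e23               # atoms per mole
MOLAR_MASS_CARBON = 12.0          # grams per mole (parent's assumption: carbon)
TRANSISTORS_PER_ATOM = 14 / 30    # from the article's 30-atom, 14-transistor gate
TRANSISTORS_PER_CORE2DUO = 1.5e8  # parent's rough figure

atoms_per_gram = AVOGADRO / MOLAR_MASS_CARBON
transistors_per_gram = atoms_per_gram * TRANSISTORS_PER_ATOM
processors_per_gram = transistors_per_gram / TRANSISTORS_PER_CORE2DUO

print(f"{processors_per_gram:.3g} Core 2 Duo equivalents per gram")       # ~1.56e14
print(f"{processors_per_gram / 1e9:.3g} with a billion-to-one overhead")  # ~1.56e5
```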

    • There is a whole bunch of reasons why we would want a smaller transistor replacement. It's not just about "consumer computers".
    • Re: (Score:1, Redundant)

      by daem0n1x ( 748565 )
      And 640K of memory is enough for anyone...
    • Rather than the whole computer, I see no reason why consumer computers ever need to get any smaller than a phone if you want them portable, or small enough to be fitted to the back of a screen for desktops

      Why on earth would you want to interact with a computer via a screen? I want reality overlay, and some sort of neural interface (coupled with a gesture interface.) Ideally the computer that performed these functions would be no larger than a single chip to make implantation easier. Nasal cavity, mastoid, something (the latter is a little dangerous, in an impact it could become a serious health issue. But anyway.)

    • Re: (Score:3, Insightful)

      by jellomizer ( 103300 )

      Remember back in 1984 when the first Mac was released, touted as being the size of a stack of paper, considered small and light enough to move anywhere. Then laptops (real laptops, not the luggables) were released; today they are considered huge and bulky, but at the time they were small enough to carry with your books.
      As computing shrinks, our idea of size goes down too. The stack-of-paper Mac was considered really small because, at the time, the easiest job for moving anything with computing was th

    • Have you thought of the potential saved cubicle space!? Slap a "pico" desktop box into a head mounted display, give the wage-slave a keyboard, and convert half your current cubicle space into those honeycomb sleep spaces (have 'em get in feet first so you can periodically come by and knock them on the head to make sure they're not dozing off)!! The other half of the cubicle space can be used by management to treat themselves for saving so much on power usage and reducing inefficient office banter. I'm think
    • by Genda ( 560240 )
      Rather than the whole computer, I see no reason why consumer computers ever need to get any smaller than a phone if you want them portable, or small enough to be fitted to the back of a screen for desktops

      Thomas J. Watson, the founder of IBM, is claimed to have said that he thought there would never be a need for more than five computers in the United States. We laugh at this today, because it was impossible for Tom to see what ubiquitous computing might make possible. Just as Tom was blind to the future, so you are blind. Ima

  • by Mozk ( 844858 )

    How about some actual data? This article is extremely watered-down ("1/100 of a nanometre (that is one hundred millionth of a millimetre!)") and essentially has nothing besides speculation about what these transistors can be used for. They don't even say what element the atoms are, for fuck's sake. It's pretty amazing that they made the equivalent of 14 transistors with 30 atoms, but the article makes it sound like they just pushed some atoms together under a microscope.

    • I didn't actually RTFA (yet), but they probably did just push atoms together under a microscope (although doing so would require specialized equipment which is more than simply a "microscope"). The damn thing will probably fly apart once it gets above more than a few tens of degrees absolute. It's great if they can put a P4 chip on the head of a pin and not need a huge heatsink for it - but it's useless if you need to carry around a couple of thousand pounds worth of cooling equipment to use it.

      (Having read

    • Yes! Would it kill the online science magazines to post a link to the actual research web site or a technical report? Geez, I'm getting tired of having to search through multiple websites starting with the researcher and/or department name to find the actual research because the Google search is swamped by the press release that's being flogged. All of this could be avoided if the damn press release would contain a link to the research page and the science mags would include this link.

  • by Finallyjoined!!! ( 1158431 ) on Tuesday December 30, 2008 @05:52AM (#26266589)
    Is the wrong question, I think. The size of the "computer" is really dictated by the interface. It would be great to have a computer the size of a halfpenny, but how would you access it?
    • by hdima ( 259063 )

      Not a problem. You just need to interconnect such computers to create a data center inside your phone.

    • Re: (Score:3, Insightful)

      That's only if a human needs to interface with it directly. If the tiny computer had networking capabilities, you could access it through that. How about pre-programmed computers that collect data from their surroundings? They could be injected into a person's bloodstream for health monitoring, spread around the world's oceans, or even dispersed in the atmosphere. And that's just one direction that you could go with this. Don't limit your thinking to the computer that you're sitting in front of.
    • That will only be a problem until the invention of a telepathic interface.

      How small can a computer get if it is implanted into your brain?

    • Brain interface. Put the half-penny computer inside your skull and you just got yourself a co-processor :D
    • It would be great to have a computer the size of a halfpenny

      Uh, you mean, like the CPU in the computer that you're using right now?

      • A CPU doesn't make a computer; a CPU is a computer in need of a power supply, memory, a clock source and a BIOS. Fit all that on a penny and call me back.

        • Re: (Score:3, Informative)

          by IceCreamGuy ( 904648 )
          So how would transistors and gates the size of atoms help, in any way, to fit a computer onto a penny? Are we actually talking about the article here, or about some imaginary full computer that is the size of a penny and has nothing to do with the article? We already have full computers with memory, BIOS, clock, and RAM that are the size of a penny (the power supply is a long way off, though). They may not run Windows, but you can go buy a microcontroller that has all of those basic functions f
    • You can always do what we do all the time: make it the size of the current processor and say that it is a million times faster.

    • by dov_0 ( 1438253 )

      but how would you access it?

      Just what I was thinking. Obviously other technologies would have to be developed alongside this processor/memory/whatever form it eventually takes.

      The day is coming, though, when we can take a piece of 'paper' from our pocket and the paper is the computer. The possibilities with regard to pervasive computing are interesting.

      The true question isn't about when they can make a full usable computer using this technology. The real show-stopper is when they can make it affordable and market it in a world

    • There are these things called radio waves. They're used to transmit and receive data without the need for wires. You should try 'em; I've heard they're all the rage! I've even heard they can make the receivers/transmitters really small. [everythingusb.com] Personally, I'm pretty sure this is just a fad.
      STOP
      I'll stick with my telegraph, thank you very much.
      STOP
      • You'd still need a USB port; last I looked, even a micro-B receptacle wouldn't fit on my ha'penny... :-)
        • One of the craziest related fads I've heard of is "integrating" [electronicdesign.com] these "transceivers". As if anyone would want something as bleeding edge as "built-in" bluetooth capability... psshhhaw!
          STOP
          I think those crazy radio-wave-transceiver early-adopters will find themselves only the object of more ridicule once this little fad burns out.
          STOP
    • You wouldn't have a computer the size of a halfpenny. You'd have a computer the size of a cell-phone, with the processing power of a 32-node Beowulf cluster.

      If you have one of those, you have exactly zero use for the cloud: so long as you have a monitor, keyboard, and mouse, your computer can be exactly where you are.

      Obviously, you want some backup, but that is easily solved by having two or three of these things, one of which sits on your desk, and you sync to it when you're at home, so if you lose your port

    • It's the size of its two controls, a clip, and a phone jack. Fortunately a battery, flash memory, and a basic computer fit inside the same form factor.
    • You could hook up a reasonably sized touch screen to it? The whole unit could be very light.
  • by Anonymous Coward

    Really, the question is how small becomes impractical. I remember the calculator cold wars. It hit the limit when everyone realized how silly a pen-sized one was when you couldn't read the display or use it without a tiny stylus. Eventually the cost of reducing the size will be astronomical, so even if you can, what's the point? We can make antimatter fuel; it's just so insanely expensive that without a major technical leap you aren't going to be powering a car, much less a starship, with it. There may be uses justifyi

    • Re: (Score:2, Insightful)

      Nanocomputers are very practical. Consider only the applications in biotechnology - computers that tiny would allow for everything from intelligent nanobots doing cellular-level maintenance to a nanobot conglomerate that could actually replace failing cells - even complicated ones like neurons. And if you can create a neuron, you can create a nervous system. And if you can create a brain... I'm oversimplifying it, but you get the idea. We're not strictly talking about biotechnology anymore, are we?
      • by dov_0 ( 1438253 )

        Create cells? We've got to understand them first! We'll all be having nano-ipods implanted in our skulls before we even come close to being able to engineer one human cell.

        • replace failing cells

          Create cells?

          I took it to mean the dead cell is discarded and the nanobot moves in to take its place. Rereading his post, it seems he did mean creating cells, but that might not be necessary. We just need a nanobot that can serve as a proper replacement.

    • We can make antimatter fuel; it's just so insanely expensive that without a major technical leap you aren't going to be powering a car, much less a starship, with it

      We can? Link to article?
    • by Blublu ( 647618 )
      "There may be uses justifying continuing to reduce computer size but already they are about as fast as people need for most apps so the biggest benefits would be power useage and cramming more computers into places they don't belong." There's so much wrong with that sentence, I don't even know where to begin.
  • halfway there? (Score:3, Insightful)

    by wisebabo ( 638845 ) on Tuesday December 30, 2008 @05:59AM (#26266611) Journal

    The real key to all of these, and to all non-trivial efforts at nanotechnology, is for these devices to be self-assembling. By non-trivial I mean other than "simple" things like nanotubes or quantum dots. Those simple compounds can now be produced in industrial quantities through basically chemical/physical means.

    While it is very very impressive that they can do this, in order for this to become practical, they will have to make millions, no billions, no trillions, no quadrillions... of these things at once or they have to be able to duplicate/reproduce themselves. The (self) "assembler" is, of course, the holy grail of nano-tech.

    Hope I see it before I die and that it doesn't cause my (and all of our) deaths! :P

    • Assuming such a self-assembler were created, I would expect it to require a very controlled environment for reproduction - extremely high temperatures, or extremely cold ones, or possibly a cycle moving from 400 Kelvin, followed by rapid cooling to 150 K, then a return to room temperature, at which point the process could be repeated, given enough materials to produce more.

      Such materials would probably just be raw carbon, hydrogen, and maybe a few other obvious elements, but I wouldn't be surprised if you ne

  • An uncle of mine works as a deputy mayor for the local government here, and was quite pleased to find out that everybody who worked in the council, including him, would be given a free new laptop. Naturally, he imagined that it would be smaller and faster - alas, now his job is far harder as he has to lug around a big heavy slow piece of shite.

    Meanwhile, atomic-scale computing is being created, and at this stage it's hard to say whether this is a step or a giant leap in the right direction.
  • Soo... (Score:4, Interesting)

    by Subverted ( 1436551 ) on Tuesday December 30, 2008 @06:22AM (#26266707) Homepage
    30 atoms doing the work of 14 transistors... Does this mean that the number of transistors (logic gates) that can fit on a chip is now more than exponentially larger? Of course, depending on how easy this would be to adapt to commercial production (and get them talking to each other), might it be the plateau that Moore's law predicts?
    • > is now more than exponentially larger?

      It's impossible to tell if it's scaling linearly or exponentially or whatever from just one data point; however, unless the atoms are working in a totally different computing paradigm (like quantum computing), it's unlikely to be more than just a linear factor of improvement.

    • might it be the plateau that Moore's law predicts?

      Yes:

      In terms of size [of transistor] you can see that we're approaching the size of atoms which is a fundamental barrier, but it'll be two or three generations before we get that far -- but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit. By then they'll be able to make bigger chips and have transistor budgets in the billions.

      -Gordon Moore, 2005 http://en.wikipedia.org/wiki/Moore%27s_law#Ultimate_limits_of_the_law [wikipedia.org]

  • I prefer to think of nanotransistors -> far more powerful non-nanoscale computers, at least in the mid-term. Molecule-sized memory, for example, could be a big hit in all areas if an efficient, cheap way of producing it is reached (genetically engineered cows that give milk full of memory? bacteria?).

    But as for complete computers, well, I still don't know whether all the components could be stuck together in a single molecule, or whether they would retain all their functionality that way.
  • Damn (Score:2, Funny)

    by zoomshorts ( 137587 )

    Sometimes I misplace my laptop. How will I find my tiny computer in the future? Will I wash it by mistake? Can it take the dry cycle?

    Grrrrr.

  • Wow (Score:5, Funny)

    by Drakkenmensch ( 1255800 ) on Tuesday December 30, 2008 @07:52AM (#26267055)
    And you thought laptop screws were hard to find when you drop them on the living room carpet...
  • But what will you do for I/O, then?

  • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Tuesday December 30, 2008 @08:22AM (#26267219) Homepage
    A bit of radiation whizzing by would not just 'flip a bit' and make the computer/program crash (or even worse, produce an erroneous result) but could dislodge a few atoms and physically damage the computer.

    So are we going to have to shield tiny computers with an inch of lead?

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Doesn't normal RAM get hit by cosmic rays and radiation? AFAIK it also suffers from bits being flipped incorrectly. Even flash memory copes with individual cells dying without much problem.

      I am sure there are ways to offer redundancy and failover between molecules; e.g. you could create self-assembling groups which all do the same calculation, a controller could then decide which ones are right based on probability, and dead molecules could be marked in the FAT... err, I mean the MAT (sketched below).

      Most of the technology we
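      A minimal sketch of that majority-vote idea (hypothetical controller logic, not from the article; it just keeps whatever most of the redundant groups report):

```python
from collections import Counter

def majority_vote(results):
    """Pick the value most of the redundant molecule-groups agree on.

    Each entry in `results` is the output of one redundant group; the most
    common answer wins, and outliers (e.g. a radiation-flipped bit) are
    discarded.
    """
    value, _count = Counter(results).most_common(1)[0]
    return value

# Three redundant groups compute the same sum; one gets hit by a stray particle.
print(majority_vote([4, 4, 5]))  # -> 4
```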

    • Re: (Score:3, Funny)

      by loafula ( 1080631 )
      No, we won't need to shield them with lead. The 6 foot thick heat sinks should suffice.
    • Re: (Score:1, Insightful)

      by Anonymous Coward

      A bit of radiation whizzing by would not just 'flip a bit' and make the computer/program crash (or even worse, produce an erroneous result) but could dislodge a few atoms and physically damage the computer.

      So are we going to have to shield tiny computers with an inch of lead?

      Instead they will be like modern computers with fault detection, error correction and automatic rerouting.

      • I wonder how much complexity would be added by the fault detection and repair logic to offset the advantage of this kind of die shrink. I take it atom chips would quickly deteriorate in speed as more and more cores fail?

        I have a bunch of devices back from the 80's that are barely warm to the touch and will work forever. I can't say the same about even a conservatively designed modern computer, let alone all the hand-held devices that are engineered to die from exhausted polymer batteries after a year.

    • Re: (Score:1, Insightful)

      by Anonymous Coward
      Just like biological systems, you'd use redundancy and self repair.
  • If the atoms get hit by some radiation, the molecule should either break or (if hit whilst calculating) return a wrong value. So basically you'll have to cover your computer with 8 cm of lead, which isn't exactly within EU health standards.
  • ... that are facing computing. CPU speed is far outstripping storage and memory bandwidth. More efficient transistors = nice, but LESS robust to defects = bad. I have to wonder how fragile these atom transistors will be. I'm wondering if we're approaching a point where having too few atoms leads to a much higher failure rate.

    I can't be the only one thinking about how expensive this is going to be.

    • Reliability decreases with pow(number of atoms per transistor, 2/3); but when you apply some kind of error correction, it increases with exp(number of transistors). If you maintain the chip size (that is, the number of atoms is constant) and the functionality, reliability will just increase with smaller transistors.

      You get reduced reliability only if you want those smaller transistors to do more than the big ones, and even then, only when you want them to do a LOT more.
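      A toy model of that argument, with made-up constants; it only illustrates the two scaling claims above (per-transistor reliability ~ atoms^(2/3), and redundancy winning it back roughly exponentially), not any real device:

```python
from math import comb

def transistor_reliability(atoms_per_transistor, k=1e-4):
    """Probability one transistor works, assumed to scale as atoms^(2/3).
    k is a made-up defect constant."""
    return max(0.0, 1.0 - k / atoms_per_transistor ** (2 / 3))

def voted_reliability(p, copies=3):
    """Reliability of a majority vote over `copies` identical transistors."""
    need = copies // 2 + 1
    return sum(comb(copies, i) * p**i * (1 - p)**(copies - i)
               for i in range(need, copies + 1))

p_big   = transistor_reliability(1e6)  # conventional-scale transistor
p_small = transistor_reliability(30)   # the article's 30-atom gate
print(p_big, p_small)                  # the tiny one is less reliable on its own...
print(voted_reliability(p_small, 5))   # ...but a little redundancy wins it back
```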

  • 30 atoms (Score:1, Funny)

    by Anonymous Coward

    ... ought to be enough for anybody...

  • Imagine a beowulf cluster of these! And a beowulf cluster of those clusters! Why, I could fit them on a matchstick head, call it W. Bush's brain!

  • Content-free article (Score:4, Informative)

    by autophile ( 640621 ) on Tuesday December 30, 2008 @10:50AM (#26268271)

    I much prefer to read Eric Drexler's PhD thesis, Molecular Machinery and Manufacturing with Applications to Computing [mit.edu]. Chapter 11 (nanomechanical computational systems) is particularly interesting.

  • Smart dust was supposed to be computers the size of glitter (a square millimeter). Each would have a CPU, power access, and communications. These would be used for surveillance and environmental monitoring. I recall labs simulating these with "domino-size" computers built from off-the-shelf parts.
  • An optimist stays up to see the New Year in. A pessimist waits to make sure the old one leaves.
