Researchers Create 4nm Transistor With Seven Atoms

EmagGeek writes "University researchers have created a transistor by replacing just seven atoms of silicon with phosphorus. The seven-atom transistor has promising implications for the future of quantum cryptography, nuclear and weather modeling, and other applications. 'The significance of this achievement is that we are not just moving atoms around or looking at them through a microscope,' says Professor Michelle Simmons, a co-author of a paper on the subject being published in Nature Nanotechnology. The paper is entitled 'Spectroscopy of Few-Electron Single-Crystal Silicon Quantum Dots'."
  • by ( 1563557 ) on Wednesday May 26, 2010 @11:23AM (#32349076)

    It'll take a really wicked manufacturing process to ever make, too. 7 atoms? What if you get only 6? What if you get 8? What if one is slightly off position?

    Building a car with 4 wheels? What if you only get 3? What if you get 5? What if one is slightly off position?

    An automated process doesn't care about size. What they did can be replicated; thus, it can be automated, unless there's a creative step involved that requires a human mind, which I strongly doubt.

    If the automation is too slow, it can be multiplied. If multiplying is still not enough, the process itself of creating and assembling multiple automatons can be multiplied.

    Price vs. usefulness of the final product may well be a problem, but size isn't. Size was a problem until it was solved, which is precisely the point of the news.

    In macroscopic terms the world is simple; the finer the resolution, the more complex the world gets. In nanoscopic terms the world is complicated.

    Our current technology allows us to automate macroscopic processes with high precision. Nanotechnology, however, is a leading-edge technology, and as such the precision simply isn't there yet to make a fair comparison with automated macroscopic processes.

    Think of a doctor performing surgery: a large benign tumor in a section of fat could be easily removed, while a minuscule brain tumor would probably be one of the most difficult.

  • by Bakkster ( 1529253 ) on Wednesday May 26, 2010 @11:50AM (#32349400)

    "Off" is almost never zero current. There's usually just a tiny amount of 'leakage' current, although some quantum designs (such as this one seems to be) can have exactly no current while off.

    Basically, while all our computers and data are binary, they operate in an analog environment. We just treat any value greater than (for example) analog 0.8 as a digital 1, and anything less than analog 0.2 as a digital 0. The problem has been that as we shrink the gate size and thickness and reduce supply voltage in order to get faster, we also increase this leakage current.

    One of the things keeping us from getting smaller faster is that without handling this well, we could have the issue where the 'off' current was more than 50% of the 'on' current, sometimes significantly more. It's still technically a transistor, but it's not practical if you're trying to distinguish between 8uA for 'on' and 7uA for 'off'. What GP is asking is whether this is a practical transistor (the output currents are different enough that it could be used to toggle the gate of another equivalent transistor), or just a theoretical 'acts like a transistor, but has no use'.
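The thresholding described above can be sketched in a few lines; all the numbers here are illustrative, not measured device values, and the 0.2/0.8 bands are the example thresholds from the comment, not any real logic-family spec:

```python
# Illustrative sketch: map an analog drain current to a digital level,
# treating anything between the two thresholds as unreadable.
def read_bit(i_drain_uA, i_on_uA=10.0, lo_frac=0.2, hi_frac=0.8):
    """Return 1, 0, or None (indeterminate) for a measured current."""
    if i_drain_uA >= hi_frac * i_on_uA:
        return 1
    if i_drain_uA <= lo_frac * i_on_uA:
        return 0
    return None  # inside the noise margin: unreadable

# A healthy device: 10 uA on, 0.01 uA leakage -> a clean 1 and 0.
print(read_bit(10.0), read_bit(0.01))   # 1 0
# The pathological case from the comment: the 8 uA 'on' only just
# clears the threshold, and the 7 uA 'off' lands in the dead band.
print(read_bit(8.0), read_bit(7.0))     # 1 None
```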

    Read this [] for a bit more info.

  • by Anonymous Coward on Wednesday May 26, 2010 @11:58AM (#32349494)

    The problem is that even at current sizes, we experience a large amount of process variation, which is basically deviation of the actual device sizes from the ones you specified due to it being so damn hard to make something that small:

    Process variation is becoming one of the biggest problems as chips shrink, because the variation in transistor sizes means that every circuit has to be designed with some amount of safety buffer, which increases as the amount of process variation increases. This can be remedied by improvements in fabrication techniques and by alterations to circuit design, but it seems highly likely that it will remain a problem for a long time, especially at smaller process nodes.
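A hypothetical Monte Carlo sketch of the safety-buffer point; the node sizes, tolerance, and sigma below are invented for illustration, not real fab numbers. The same absolute variation that is harmless at a large node wrecks yield at a small one:

```python
# Draw device sizes from a normal distribution and count how many
# fall outside a +/-10% tolerance as the node shrinks.
import random

def fraction_out_of_spec(nominal_nm, sigma_nm, tol=0.10, n=100_000, seed=42):
    rng = random.Random(seed)
    bad = sum(
        1 for _ in range(n)
        if abs(rng.gauss(nominal_nm, sigma_nm) - nominal_nm) > tol * nominal_nm
    )
    return bad / n

# At a hypothetical 32 nm node with 1 nm sigma, almost everything is in spec.
print(fraction_out_of_spec(32, 1.0))
# At 4 nm with the same absolute 1 nm sigma, most devices are out of spec.
print(fraction_out_of_spec(4, 1.0))
```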

  • by tool462 ( 677306 ) on Wednesday May 26, 2010 @12:04PM (#32349578)

    You've got the theory basically correct, but in the real world the "off" current is just less current, not zero current. To get a good signal-to-noise ratio, you want Ion / Ioff to be as big as possible. In older processes (or thick oxide devices) you can get really good ratios. You could have an Ion of 10mA and an Ioff of 10nA, for example, for a ratio of 1e6. For newer process nodes on thin oxide devices, that ratio may get as low as 1e3 or worse. In that range, the device still works well for digital circuitry, but speed comes at the expense of very high leakage power. As that ratio gets even lower, you end up with a device that isn't suitable for digital circuitry--you can't tell the difference between an on and an off device reliably.
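The arithmetic spelled out, using exactly the example currents from the comment above:

```python
# Ion/Ioff ratio for an older thick-oxide process vs a leakier
# modern node (values taken from the comment, not measured data).
def on_off_ratio(i_on, i_off):
    return i_on / i_off

old = on_off_ratio(10e-3, 10e-9)   # 10 mA on, 10 nA off
new = on_off_ratio(10e-3, 10e-6)   # same on-current, 1000x the leakage
print(f"old process: {old:.0e}, newer node: {new:.0e}")
# old process: 1e+06, newer node: 1e+03
```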

  • by Anonymous Coward on Wednesday May 26, 2010 @12:11PM (#32349668)

    This isn't insightful. The analogy drawn is invalid on many levels.

    The argument isn't that a quantum computer isn't necessary. It's pointing out the fact that computing is often I/O limited - how fast a computer can move data around to be processed. He's saying that advances need to come in other areas before things like this are significant.

    And, as someone with a background in these things - you don't make a good transistor with 7 phosphorus atoms alone; there has to be more to it. The fact that they created a transistor on the order of several atoms isn't exciting either; IBM worked out how to move individual atoms around a long time ago. With the right equipment, this has always been theoretically possible, but practically it was a non-starter.

  • by Bakkster ( 1529253 ) on Wednesday May 26, 2010 @12:20PM (#32349796)

    Doping really isn't relevant here, since we're not talking about CMOS or FET transistors. While it's still a transistor operationally, the structure is completely different, so there is no p- or n-type material per se.

    What this is, is a quantum dot [] which acts as a single-electron transistor []. It's as different from a CMOS transistor as CMOS is from a vacuum tube. So asking for the doping ratio of a quantum-dot transistor is like asking for the grid spacing of a CMOS device, or the oxide thickness of a JFET: it doesn't exist.

  • by bain itic ( 1593851 ) on Wednesday May 26, 2010 @12:52PM (#32350220)

    Even in a crystalline structure? Forgive me, IANAMS.

    Yes, even in a crystalline structure. Diffusion in solids at the macroscopic scale seems slow compared to, say, cream in your coffee, but at the atomic scale it matters. They did this at the surface, which makes it even worse. I can't imagine this lasting any useful amount of time without some SEVERE cooling measures. I'm not sure even liquid nitrogen could save it. IAAMS.
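A back-of-envelope sketch of the cooling argument, assuming a generic Arrhenius law D = D0 * exp(-Ea / kT); the prefactor and activation energy below are placeholders, not measured values for phosphorus in silicon:

```python
# Diffusion in solids is exponentially sensitive to temperature, so
# cooling from room temperature to liquid nitrogen slows it enormously.
import math

K_B = 8.617e-5   # Boltzmann constant, eV/K
D0 = 1e-4        # hypothetical Arrhenius prefactor, m^2/s
EA = 3.0         # hypothetical activation energy, eV

def diffusivity(T_kelvin):
    return D0 * math.exp(-EA / (K_B * T_kelvin))

# Ratio of diffusivity at 300 K vs 77 K (liquid nitrogen):
print(diffusivity(300) / diffusivity(77))
```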
