Reinventing The Transistor For Molecular Computing 102

unnique writes "MIT's Technology Review has an article on HP's research into finding a new way to make transistors smaller, and further stretching Moore's law." The article has some nice illustrations of the nano-componentry they're working on, too.
This discussion has been archived. No new comments can be posted.

  • by JeanBaptiste ( 537955 ) on Wednesday September 03, 2003 @12:31AM (#6856420)
    and further stretching Moore's law


    it's really more of an OBSERVATION than a LAW. a THEOREM at best. While it has held true through my short lifetime so far, it certainly does not qualify as a LAW.
    • by chgros ( 690878 ) <.charles-henri.g ... . .at. .m4x.org.> on Wednesday September 03, 2003 @12:46AM (#6856487) Homepage
      it's really more of an OBSERVATION than a LAW. a THEOREM at best.
      A theorem is better than a law! It can't be wrong! What could be better than a theorem?
      "Moore's law" is a postulate perhaps, not a theorem (since it hasn't been proven)
    • by some guy I know ( 229718 ) on Wednesday September 03, 2003 @12:53AM (#6856522) Homepage
      it certainly does not qualify as a LAW.
      Yes, it is a law.
      Why do you think that Intel, IBM, et al are working so hard to continue to shrink their electronics?
      It's because of Moore's law.
      If they break Moore's law, they are facing some serious jail time.
    • by teamhasnoi ( 554944 ) * <teamhasnoi@CURIE ... minus physicist> on Wednesday September 03, 2003 @01:36AM (#6856666) Journal
      I'm afraid I MUST disagree.

      Gordon Moore made his famous observation in 1965, just four years after the first planar integrated circuit was discovered. This law was finally proven in 1989 with the release of the venerable 486(TM) DX processor from Intel.

      Due to incredible market forces and other mysterious occurrences that remain unexplained to this time, chip speed doubled every two years. This remained true even through the infamous Intel factory shutdown in 1991.

      The plant was closed for a period of seventeen months due to widespread worker illness. The engineers at Intel had been under tremendous pressure to design a new chip that would double the speed of the impressive 486 DX. Sadly, the engineers were stumped. Adding to this incredible pressure was the unexplainable illness that spread about the facility like wildfire. This illness would render an otherwise healthy person unconscious for a period of seventeen months. The afflicted person would then rise as if nothing had happened.

      Intel engineers were some of the last to be affected by this mysterious illness, and when it struck, there remained little choice but to shutter the plant.

      Seventeen months passed, and the lights of the Intel factory remained dim. Offerings by Cyrix and AMD began to overtake Intel's flagship 486 processor.

      Suddenly, the engineers began to regain consciousness one by one. Strangely, they all had a similar vision while under the illness's grasp. They began to call each other on the telephone, comparing notes on what they had 'seen'.

      Cautiously, they began to draw plans - plans that would save the great Intel from ruin.

      Work went quickly, as each engineer 'knew' what the others were thinking. Soon, the plant was reopened, and fabrication of the new design began. The engineers collectively decided that the chip would be called the "Pentium". Asked a short time before his untimely death, an engineer said, "It just HAD to be named that. I don't know why. But we all agreed."

      Sadly, the chip that propelled a limping Intel into the forefront of CPU technology was the last that any of the 'Pentium' designers saw to fruition.

      Tragedy struck the engineers as they were on their way to the company picnic. The bus that they were riding in plummeted off an embankment into a river, drowning all of them.

      Gordon Moore's famous 1965 observation was voted into law in 1994, one year after the release of the new chip. The punishment for violators is death by mysterious circumstance. No one has yet broken Moore's Law, and woe be unto those that do.

      Thanks,
      Jonathan Frakes

      P.S. In your ear, Mr. Smarty-pants.

      • I'm afraid I MUST disagree.

        No point fearing the inevitable, my friend.
      • by innosent ( 618233 ) <jmdority&gmail,com> on Wednesday September 03, 2003 @03:18AM (#6856909)
        Cute story, but there's one major problem (besides the fact that it's simply untrue)....

        Ok, I'm only going to say this one time, so don't forget it: Moore's law applies to the size of the gates, not the speed!
        For some reason, people seem to think that it applies to speed, but it is simply an observation on gate density. Gate speed has never followed Moore's observation for more than a very short period of time. The reason today's chips are so much faster is that (a) gate speed has increased due to more efficient designs and better materials, (b) gate density has increased roughly according to Moore's "law", and (c) die size has increased due to better manufacturing processes, since the better yields allow larger dies to be cost-effective.
        Moore's law is a great trend, but in reality it has nothing to do with speed increases, except that decreasing the size of a gate decreases propagation delays. The improvements in speed that have been made are more due to the number of transistors on a die, which have shot up due to (b) and (c), while each gate is faster due to (a), and only slightly (b). We have faster gates, on a bigger die, at a higher density.
        • Actually, Moore's "Law" does not refer to gate size. If anyone reads the actual history, Moore was referring to the number of components per die, since there has always been a trade-off between the complexity of the die and yield. While this may indeed lead to higher densities, density is, strictly speaking, not what Moore was talking about. That said, Moore's Law is neither a law, nor very important.
      • I guess they should have called it the "Thunder Road"...although, from last year's commercials, it looks like at least they got to meet the aliens (these are all references to the movie "Explorers" for those who didn't note the similarities...)
    • it's really more of an OBSERVATION than a LAW. a THEOREM at best. While it has held true through my short lifetime so far, it certainly does not qualify as a LAW.

      Always an interesting cultural weirdness, this hierarchy of "law" beats "theory" beats... I don't know.

      That's completely unknown in the rest of the world. Most of these words are just synonyms for each other, there's no official definition of what a "law" is. Sometimes part of a theory is named "Foo's Law", "Bar's Theory", or whatever, but those

    • Moore's Law seems as good as Hooke's Law to me.

      Hooke's Law for a spring: Force on a spring is proportional to the distance stretched from equilibrium. Until it's stretched so far that the law doesn't work any more...

      • Until its stretched so far that the law doesn't work any more...

        That disclaimer, however, was not included in the original statement of Hooke's Law:

        Robert Hooke (1635-1703). The equivalent of this force law was originally announced by Hooke in 1676 in the form of a Latin cryptogram: CEIIINOSSSTTUV. Hooke later provided a translation: ut tensio sic vis [the stretch is proportional to the force].

        -- Marion & Thornton, "Classical Dynamics of
        Particles and Systems"

        I really feel like I'v



    • Mod me down if you want. If I cared that much I would have posted anonymously. But this needs to be read.

      Do I really have to read the stupid "Moore's Law isn't a law" post every damn time someone mentions Moore's law? Then asshole moderators mod it up to +5, even though it isn't interesting, and definitely not informative. Ironically it is these same assholes who will probably mod this post down, even though it is more interesting than your shit-ass "Moore's law isn't a law" post.

  • Idea (Score:5, Funny)

    by Tablizer ( 95088 ) on Wednesday September 03, 2003 @12:32AM (#6856424) Journal
    I got it! Put the stuff inside a small glass vacuum bubble and make it hot so that electrons jump from one plate to another when............nevermind
  • I hate to say it (Score:5, Insightful)

    by mao che minh ( 611166 ) * on Wednesday September 03, 2003 @12:32AM (#6856431) Journal
    But "big deal". Many such aspiring endeavors have been undertaken at the expense of a large corporation's purse, only to fail miserably. I applaud their attempt to better technology and wish them the best, but I'll reserve judgement on the ultimate worthiness of thier crusade until they actually do something.
    • Does anyone know how or if the 'crossbar' technology that they speak of relates to the 'crossbar' phone-switching technology of the pre-ESS era?
    • I have to disagree (Score:5, Insightful)

      by abhisarda ( 638576 ) on Wednesday September 03, 2003 @01:46AM (#6856699) Journal
      To some extent you might be right, but your statement is too generalized.
      Where do you think chip innovation is coming from? Intel, AMD, IBM... Are these small firms? No.
      Universities and small firms can only do so much research because as the sizes of transistors and chips decrease, fabrication and research costs increase exponentially.
      And if you read the article, it says that $12.5 million was provided by the government and matching funds by HP.

      Do you think HP is breaking the bank by providing that kind of money?
      This endeavor is not Itanium sized in terms of a cash sink.
      You've got to start somewhere. If you think the microprocessor industry got where it is without its share of research and failures, that's not true.
  • Results? (Score:1, Interesting)

    by Ro'que ( 153060 )
    It seems like we're constantly hearing this same type of story over and over again but never hearing about any substantial results... be it diamonds, gel, or nano-technology. What does Gun-Young's research mean to me, the almighty consumer? Nothing but a few more years of speculation before anything actually happens.
    • Re:Results? (Score:5, Insightful)

      by Jerf ( 17166 ) on Wednesday September 03, 2003 @01:06AM (#6856569) Journal
      That's a good question and the answer is "technology media coverage sucks".

      Far-out technology ten or twenty years from plausible implementation makes a much better story than technology that's appearing on the shelf today, which is drowned out by the marketing message and, if you're lucky, some semi-meaningful buzzwords.

      However, the electronic industry is actually quite good about converting technology into actual products. It just isn't talked about as much because it's so "ho-hum". Let me remind you that 2,400,000,000,000 [compusa.com] bits that fit in the palm of your hand is something so amazing that you really can't even understand it in any real way.

      Look into the technologies in current use for hard drive manufacturing, processor manufacturing, and the other such hardware you use day to day (including non-computer stuff). You'll find enough stuff to make a 1970's sci-fi author wet their pants. It just doesn't make good copy.
    • "What does Gun-Young's research mean to me, the almighty consumer?"

      It tastes great; and it's less filling.

      KFG
  • by Anonymous Coward
    It's Isaac Asimov's molecular valves! The next step after transistors!
  • Hmmm (Score:2, Funny)

    by annisette ( 682090 )
    Why not start out making the smallest then find ways to make them bigger?
  • by Wargames ( 91725 ) on Wednesday September 03, 2003 @01:11AM (#6856587) Journal
    The page ref'ed this bit on 200-gigabit nanotube memory cubes.

    I am not so sure I want my chips to be living organisms. On the other hand, I am certain that the choice between a faster organic computer and a slower inorganic computer would be a no-brainer. I'm just rooting for the inorganics right now. Though then there is ice-nine goo and all that to be concerned about, which is not much better than a computer virus destroying all life forms.

    A 'puter [not including DNA synths, which incidentally should be cautiously defended since they are potential hacking targets for 3li4e geno-hackers] passing a virus directly to a human (or some other animal) becomes a probability when the computer has a DNA factory as part of its makeup.


    Amplification seems like a reasonable quick solution to hard problems of routing traveling salesmen, but make sure you don't get any of it on you.

  • It's not the trying to make things smaller, but trying to get it to cost less than the $3 billion quoted in the article for current processes.
    • Er, there is no current process for making the stuff the article is about, other than some poor guy on his knees begging a CVD reactor to work right. The $3B mentioned is the cost of building a fab for a modern semiconductor process which, while certainly a lot, is not a real problem, since there are quite a few companies prepared to make such investments given the juicy returns (IBM, NEC, LSI, TSMC, ...).

  • by Fourier ( 60719 ) on Wednesday September 03, 2003 @01:16AM (#6856611) Journal
    "Componentry?" Er, what? I'm going to label this one a bullshit buzzword. It does not [reference.com] seem to appear [m-w.com] in the dictionary, and the obligatory GoogleFight would seem to confirm [googlefight.com] that "components" is the accepted term.

    Timothy, perhaps you are confused by standard English usage patterns. You see,

    toilet [reference.com] -> toiletry [reference.com] and
    bigot [reference.com] -> bigotry [reference.com],
    but
    apple [reference.com] -> apples [reference.com] and
    component [reference.com] -> components [reference.com].
    • by panaceaa ( 205396 ) on Wednesday September 03, 2003 @01:33AM (#6856662) Homepage Journal
      Putting "ry" on the end of the word doesn't make it a plural, even in your two cases. Instead, it describes a part of it's root word. Bigots have bigotry. Toiletries are part of a toilet (the room, "I'm going to the toilet.").

      Likewise, componentry is used in the fabrication of components. It becomes a part of finished components. That's why it's found on 30,000 Google pages.
      • by Fourier ( 60719 )
        Putting "ry" on the end of the word doesn't make it a plural, even in your two cases.

        Well yes, that was my point after all.

        Likewise, componentry is used in the fabrication of components.

        OK, that sounds plausible at least. Now, are you able to back up your claim by providing some links where "componentry" is used in this sense, rather than in the "I think it's a more marketable word than components" sense? My random sampling of Google hits seems to favor the latter.
      • Duh. (Score:2, Funny)

        by C10H14N2 ( 640033 )
        -ery or -ry
        suff.

        1. A place for: bakery.
        2. A collection or class: finery.
        3. A state or condition: slavery.
        4. Act; practice: bribery.
        5. Characteristics or qualities of: snobbery.

        It would then be proper to say this thread is the height of stupidery.
    • Another example:

      Pedant -> pedantry

    • apple -> apples

      That should be:

      apple -> appletry
      Though the actual process is:
      apple tree -> apple
  • But... (Score:4, Funny)

    by Mister Transistor ( 259842 ) on Wednesday September 03, 2003 @01:39AM (#6856679) Journal
    I DONT WANT TO BE ANY SMALLER!!!!

    I'm very happy the way I am now, thank you...

  • ... are some tiny little beach blankets and some tiny little Annette Funicellos.
  • by rmc6198 ( 701782 ) on Wednesday September 03, 2003 @01:42AM (#6856690)
    The basic computing element will of course keep getting smaller and faster, until it reaches certain physical limits which cannot be exceeded. At this point, a new paradigm will be invented to provide the way beyond the limits.

    How small can something be? It can be down to the molecular level. How fast can something go? Up to the speed of light. So eventually the fastest "transistor" will be composed of individual molecules, with changing states caused and communicated by light (photons).

    Electricity was stated in the article as "the way" that information will be input to and extracted from these tiny transistors, but I think this paradigm will change! Once you get to a certain speed and smallness, electricity loses its ability to transmit information. This happens due to the sluggish time-response properties of the medium (capacitance and inductance and other jazz) and the interference and delay of the electrical wave of electrons flowing.

    Once a wavelength (directly related to frequency) becomes a certain fraction of the distance it has to travel, the electrical path becomes a "transmission line" instead of a "lumped element." Basically you are trying to send waves of electricity (1's and 0's) down the line too fast for the physical capabilities of the medium. So that's one more thing that complicates the process of making computers smaller and faster--getting the information out and transmitting it to other components.

    That's why I was mentioning a new paradigm... because I was thinking of Isaac Asimov's stories I'd read that mentioned his ultimate computer, Multivac, which filled up miles and miles of space underground. He extrapolated the ideas behind the cutting-edge computers of his time into what he thought the future's computer would be like--namely, huge. But of course he couldn't predict the advent of the transistor, and later the microprocessor, which changed everything and made everything shrink instead of getting bigger. By the way, some parts in computers, like the connectors and traces, are already becoming speed bottlenecks for some of the reasons mentioned...
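
    To make the lumped-element vs. transmission-line point above concrete, here is a minimal back-of-the-envelope sketch in Python (the lambda/10 rule of thumb, the 0.5c propagation velocity, and the 10 mm trace length are assumed illustrative numbers, not figures from the article):

    # Rough check of when an interconnect stops behaving as a "lumped element"
    # and has to be treated as a "transmission line".  Rule of thumb assumed
    # here: once the trace is longer than ~1/10 of the signal wavelength,
    # lumped-element analysis breaks down.
    C = 3.0e8              # speed of light in vacuum, m/s
    VELOCITY_FACTOR = 0.5  # assumed signal velocity in the dielectric, as a fraction of c
    THRESHOLD = 0.1        # assumed lumped-element limit: length < wavelength / 10

    def is_lumped(trace_length_m, frequency_hz):
        """Return True if the trace can still be treated as a lumped element."""
        wavelength = (C * VELOCITY_FACTOR) / frequency_hz
        return trace_length_m < THRESHOLD * wavelength

    # Example: a 10 mm trace at a few clock frequencies.
    for freq in (100e6, 1e9, 10e9):
        print(f"{freq / 1e9:5.1f} GHz: lumped? {is_lumped(0.010, freq)}")
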
  • by Anonymous Coward
    Williams's research is good stuff, and essential. Something for the ./ crowd to realize, however: the wires forming the crossbar are still macroscopic. Although the switching component is based on molecules, the interconnects are large, and will essentially limit the density of transistors per area.

    This means that Moore's law will still hold, unless the interconnects are molecules as well.

    IAAMEE
  • reality vs hype (Score:4, Interesting)

    by Garbonzo Pitts ( 249836 ) on Wednesday September 03, 2003 @02:05AM (#6856738)
    A huge element of Si technology's success is the way lithography allows mass production. The problem with molecular schemes is that they involve pieces that have to be added to the substrate. Williams's approach of using crossbars as the basic element gets around this problem somewhat. But Si + lithography is still going to be a more robust technology.

    There is also the problem that molecules are delicate objects. You simply can't make millions of molecular switches and expect them all to work. With Si all the switches work often enough that you can make chips. Williams plans on using fault tolerant architectures to get around this problem.

    So, HP's program isn't as crazy as a lot of stuff I see at conferences. But it is still far fetched, and I think it will fail because it is competing with Si VLSI instead of aiming for some niche.

    Si technology is damned good, and trying to compete with it has been a losing game for decades now. (e.g. GaAs and Josephson junction computers). "Novel" technologies pay off when used for an application for which Si is unsuitable (optics with GaAs, magnetic field detection with Josephson junctions).

    However, I will eat my hat if in 20 years (10 years after Moore's 'law' bottoms out) VLSI is done in anything other than Si.
  • I don't know why, but this paragraph jumped out at me while reading it:

    Once inside we make a beeline for a machine called a chemical vapor deposition reactor. It looks like a big steel cylinder on its side, encased in glass. "I have a special relationship with this machine," he says, and touches the glass with a gloved hand.

  • by StewedSquirrel ( 574170 ) on Wednesday September 03, 2003 @02:12AM (#6856761)
    *****"I think we've picked the winner, something that will allow this thing we call Moore's Law to continue on for another 50 years. I used to think it was impossible. Now I think it's inevitable."****

    This seems to be a stretch of the imagination. Moore's law states, specifically, that "the number of components per integrated function" doubles every 12-24 months (historically it's slightly more than 24 months), but it is also (perhaps improperly) used to say that the performance of processors doubles in that time.

    In any case, following the progression of Moore's law from 1965 to today and through for the next 50 years reveals a minor (perhaps major) flaw in this scientist's assertion.

    1971: 2,250 - Intel 4004
    1982: 120,000 - Intel 80286
    1993: 3.1 million - Intel Pentium
    2003: 55 million - Intel P4 Northwood
    2013: 1.76 billion
    2023: 56 billion
    2033: 1.8 trillion
    2043: 57.6 trillion
    2053: 1,840 trillion

    The atomic diameter of an average old atom of some metallic element that would be used in transistor fabrication is about 10^-10 meters. The atoms in their molecular "crossbar" technology would be much larger, plus inter-atom spacing of about 0.3nm... we can assume there would be an element every 1nm.

    With 1.84 quadrillion elements per component, we're talking 42 million elements on a side. Assuming uniform density and perfect 100% usage of space at the atomic level, these chips are just about half a meter in size.

    Ok, so I proved myself wrong! Moore's law has the TECHNICAL possibility of holding true for the next 48 years. Beyond that, atomic structures themselves make the process of shrinking the components all but impossible.

    Stewed Squirrel
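
    For anyone who wants to redo the arithmetic above, here is a minimal sketch (it assumes a clean doubling every 24 months from the 2003 figure and the hypothetical 1 nm element pitch mentioned above; note that with that 1 nm pitch the 2053 die works out to a few centimetres on a side in two dimensions, so the final size depends heavily on the assumed pitch):

    import math

    # Extrapolate the transistor counts listed above: start from the P4
    # Northwood's ~55 million transistors in 2003 and double every 24 months.
    BASE_YEAR, BASE_COUNT = 2003, 55e6
    PITCH_M = 1e-9  # assumed spacing of one element per nanometre, as above

    def projected_elements(year):
        return BASE_COUNT * 2 ** ((year - BASE_YEAR) / 2)

    for year in (2013, 2023, 2033, 2043, 2053):
        count = projected_elements(year)
        side_m = math.sqrt(count) * PITCH_M  # square 2D die, 100% packing
        print(f"{year}: {count:.2e} elements, ~{side_m * 100:.1f} cm on a side")
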
    • Who said it has to stay two-dimensional - bring on the processor cube!
    • by Anonymous Coward
      You are assuming a 2-dimensional chip. In 3 dimensions it would have about 122,000 atoms per side, which makes it a cube a little over 0.1mm on a side. You may ask how we access all those atoms on the inside, but really, who cares, we can let our grand-kids figure that out.
    • This is quite closed-minded; one must take into consideration the different chip-making techniques used, and how they have changed since the early 86's.

      With changes in techniques, and how one does it, many things are possible

      For one example, just look at the changes in building manufacturing. At one time they couldn't get something super big, because of stone's limitations, then because of iron's, and now we make buildings exponentially larger with improved steel, created with new techniques.

      It's all in
        • The irony of this whole sub-thread is that I did some very brief (and not so accurate) mental math and figured that the number would be on the order of several hundred meters, not half a meter. In that case, three dimensions would help, but wouldn't bring it down to a reasonable size.

        It wasn't until I had already done the math that I realized I had already proved myself wrong.

        But, remember that light only travels at.... the speed of light. And the current crop of chips using copper interconnects propagate
          • And the current crop of chips using copper interconnects propagate at maybe 1/2 or 2/3 that speed, meaning in a theoretical 1,000GHz chip (not too far off by Moore's law), electrical impulses can only travel about 50um.

          Moore's Law says nothing about operating frequency. As pointed out already [slashdot.org], Moore's law refers to the total number of components (switches) that can be fit on a die.

          Checking a layout I'm working on now (0.13um all-Cu process with 0.28um wire pitch), I see that I have a 4mm wire with
            • Actually, I turned up the 100um number, but realized that a "cycle" includes the rise AND fall of the clock, which makes the "half-cycle" time 0.5ps (500fs) and is where I pulled the number 50um.

            Squirrely
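
              For what it's worth, a quick sketch of where figures like 50um and 100um come from (the fraction of c is the big unknown here; the velocities below are just illustrative assumptions, not numbers from any particular process):

              C = 3.0e8            # speed of light in vacuum, m/s
              FREQ_HZ = 1000e9     # the hypothetical 1,000 GHz clock discussed above
              half_cycle_s = 0.5 / FREQ_HZ  # 0.5 ps

              # Distance a signal can cover in half a clock cycle at a few
              # assumed propagation velocities (fractions of c).
              for fraction in (1.0, 2.0 / 3.0, 1.0 / 2.0, 1.0 / 3.0):
                  distance_um = C * fraction * half_cycle_s * 1e6
                  print(f"{fraction:.2f}c -> {distance_um:.0f} um per half cycle")
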
          • Also, about Moore's law:

            I did some reading and found that Gordon Moore himself updated his "law" in 1975, revising the doubling time from 12 months (in his original 1965 observation) to 24 months (which, averaged, gives the oft-wrongly-cited 18 months), and he also commented that it seemed to include "performance" as a relative figure.

            My mention of clock speed in that context simply relies on a loose correlation between frequency and performance.

            Stewey
  • by dexamyl ( 703323 ) on Wednesday September 03, 2003 @02:28AM (#6856788)
    If universities around the world spent as much money researching operating systems and programming methods as a single big chipmaker spends on a new fab, we'd all be able to get a lot more power out of the chips we already own.

    The catch is, it's a lot easier to make money selling silicon (or diamond, or DNA, or nanotubes, or whatever...)

    • That is true, but there is only so much processing power one can squeeze out of a given CPU; as long as there are technology gains to be made, let's keep making them.

      I've only got another 60 or 70 years before I need to transfer my consciousness to a computer, you know.
  • by d3am0n ( 664505 ) on Wednesday September 03, 2003 @02:44AM (#6856827)
    You know, a lot of people talk about the death of Moore's law, but uh, has anyone ever considered the possibility that Moore's law might keep going and going and going ad infinitum?

    It isn't impossible. Theoretically, when you get down to quantum computers where you're using atomic matter itself, you're almost at the smallest possible size for computation, until you break down the individual pieces of the positrons and electrons into quarks and gluons, which could possibly be used for calculation. Then you think about creating an artificial black hole and stuffing ever more matter into a singularity, and you could calculate the universe from something the size of the head of a pin (especially if you adhere to the multiverse theory, which states there are infinite realities). If there are infinite realities, we could literally collapse our own reality, and possibly others nearby, into a singularity for calculation, and just keep on going and going and going.

    Truly, as we begin to see the emergence of quantum computers, we start to head down these paths toward higher and higher calculations, instead of knowing the universe around us a bit at a time. We could know it all at once, in all its enormousness. We could then know and create others (computation being equivalent, according to Babbage; a computer simulating a reality perfectly is in fact a new reality, as our reality is nothing but mathematical laws anyhow).

    While I know Moore's law can fail us at any time, being a theory and not a fact, dismissing it as casually as most do, after it has persevered time and time again for so many decades running, is really getting to be rather ridiculous.
    • Transistors might be based on quantum mechanical effects, but they are not purely QM devices; they are more like pseudoclassical devices. This means that you can describe a lot of their behavior using classical physics and a few quantum mechanical approximations.
      Moore's Law is a rule of thumb for transistor size. Our current computational technologies are based on a transistor-like gate. We get into a new computing paradigm.
      Transistor based machines (and even tube based computers, yes, the old dinosaurs) can be modeled by a
      • Well, generally we don't know exactly what future technologies can do; that's why I used the absolute smallest as an example, rather than thinking about what is potentially smallest with our technologies. I'm just saying that if it were possible to literally just have a pile of atoms and read and write to them while simultaneously taking advantage of quantum properties, we could go infinitely smaller, provided we could do this... and who knows if we could? There are a lot of things people said were impossible
    • then you think about creating an artificial black hole and stuffing ever more matter into a singularity and you could calculate the universe from something the size of the head of a pin (especially if you adhere to the multiverse theory, which states there are infinite realities).

      Wait--I'm confused. You're going to perform computations inside of a singularity (black hole). Okay. How are you going to get your result back out again?

  • It seems that this should be something we are rooting for, not against. Regardless of how far-fetched it seems, I am glad someone is making a run of it. What if something comes of it? After all, it's not our money, right?

    Wait, it's funded by DARPA?

    Oh well, anything that keeps them from putting energy into causing panic [darpa.mil] and classifying me as a terrorist [darpa.mil] can't be all bad, I guess.
  • by Gwala ( 309968 ) <.adam. .at. .gwala.net.> on Wednesday September 03, 2003 @04:30AM (#6857090) Homepage
    Has anyone considered how long we can keep stretching this? Sooner or later (I believe the latest estimates are 10 years), we are going to hit a bottleneck caused by electrons jumping paths if we keep minimising like this.
    Therefore, there are three options I see.

    First - we opt to double die size, and hence see an appropriate improvement with minimal heat issues. Although lag between outer sectors of the processor is an issue. (This same solution could be applied to building 3D chipsets, but heat would be an issue.)

    Second - we use optical based chipsets. This has the advantage of letting us minimise a lot more; however, the technology hasn't been perfected, it is VASTLY different to what we are currently using, and it could suffer from external interference caused by heat (contracting/expanding glass/plastic tubules will form a primitive lens).

    Third - we opt for more efficient systems. Hyperthreading is a good example of this, allowing a processor to use sections that are otherwise unused to do several operations at once. However, this requires a change in programming practices to allow for multithreaded applications as standard, something which most programmers are not willing to engage with or understand.

    Of course there are more solutions; however, I still think we are going to be very limited with copper, silicon or germanium[sp?] circuits in the next decade.

    -Gwala
    • "electrons jumping paths"? You're not in the semiconductor industry are you? :) Kindly accept the following reality check:

      1. Double die size? Heat's not the problem (bigger die surface area makes it easier to dissipate heat, not harder, assuming same transistor density and switching rate). Manufacturability (and thus price) is the problem. Currently, the largest silicon die anyone can make at a price anyone will pay is just under 20mm on edge (400mm^2). Yield for such dice is around 20-30% (so, 70-
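
      As an aside, the die-size vs. yield trade-off mentioned above is often approximated with a simple Poisson defect-density model; here is a minimal sketch (the defect density below is a made-up illustrative value, chosen only so that a 400 mm^2 die lands in the quoted 20-30% range, and is not a number from this thread or the article):

      import math

      # Simple Poisson yield model: yield = exp(-D0 * A), with D0 the defect
      # density (defects per cm^2) and A the die area (cm^2).
      D0_PER_CM2 = 0.35  # assumed illustrative defect density

      def poisson_yield(die_area_mm2):
          return math.exp(-D0_PER_CM2 * die_area_mm2 / 100.0)

      for area_mm2 in (100, 200, 400):
          print(f"{area_mm2:4d} mm^2 die: ~{poisson_yield(area_mm2) * 100:.0f}% yield")
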
    • Gwala said:

      Therefore, there are three options I see.

      First - we opt to double die size...

      Second - we use optical based chipsets...

      Third - we opt for more efficient systems, Hyperthreading is a good example of this...

      This kind of thinking is deeply (but poorly) rooted in the details of Moore's Law. Moore made an observation about specific manufacturing processes of the kind mentioned here. (Except Moore had a better understanding of the technical details - for example "doubling die size" would quadr

  • by Art Tatum ( 6890 ) on Wednesday September 03, 2003 @07:21AM (#6857519)
    Williams's group faces a monumental task: trying to make computers whose functionality rests on the workings of molecules.

    My computer is chock full of molecules already, and it's quite dependent on them for its functionality.

    • Actually, I'd rather like a PC that works without relying on molecules.
      They've considered photons, DNA and the third dimension - why not consider metaphysics?

      The new ZX Specter: Now with UDRAM (UnDead RAM; the data doesn't die when you turn off the power), Real Voodoo video card (speed dependent on the amount of chickens sacrificed on the GPU) and the processing power of a 2.5 GigaSoul FPGhA (Field Programmable Ghost Array). Available at your local Pagan store for the price of... your soul, of course.

  • It's nice to see the research, but HP is now mostly a producer of commodity IT products. There is practically zero chance that they will be able to effectively market a new processor architecture. Perhaps they want to go the IBM route and license the tech, who knows.

    Marketing and market share matters. An Intel chip with 20% improvement is likely to sell much better than an HP chip that doubles performance.

    • Did you RTFA? I don't think so, since it's not in any way about developing a new processor architecture. Rather, it's about developing a new type of physical switch to replace a silicon transistor. If they got it to work, they could make an x86 or any other type of processor out of it -- but that's really irrelevant this early in the game.
  • Anyone notice recently that, for all of these technologies that are supposed to keep Moore's law running well into the future all by themselves, if they were all added up, we'd be way beyond Moore? Could Moore be less than what's possible?

"Take that, you hostile sons-of-bitches!" -- James Coburn, in the finale of _The_President's_Analyst_

Working...