Science

The History of Moore's Law 29

An anonymous reader sent us linkage to a Scientific American article that is an interview with Gordon Moore discussing all sorts of things relevant to a guy who gets a cool law named after him.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • What happens when Moore's Law runs out is that people will actually have to think about what they're doing with computers and whether it's necessary.

    As an example of what I mean, I worked at a company that spent an enormous amount of money (and about two weeks) upgrading to Office 97 from Office 95. It was a painful transition for various reasons. A lot of 486's that had been fine for their current uses had to be replaced, because they couldn't run '97 properly. We got bitten by some strange incompatibility between our version of '97 and our Japan office's version. Several systems that had been running fine started having all kinds of odd problems that seemed completely unrelated to the upgrade, but hadn't shown up until after the upgrade was done.

    So I went to our IT department and asked why we had upgraded. I specifically asked what features we needed that the old software didn't have and the new version did. It turned out that there weren't any. They went through all this upgrade basically because it was available, and because company management didn't want to be seen as being behind the curve by sticking with older software. The fact that the old software worked fine had no bearing on the matter.

    Hopefully, when Moore's law begins to fail, people will stop having their computer use driven by marketing and start making decisions based on the actual utility they get from their machines. So when Microsoft comes out with Office 2010 with the new Active Kitchen Sink, people will actually check whether they need this new feature that's going to slow their computer down by 30%, instead of just buying it and a faster computer.

  • Posted by My_Favorite_Anonymous_Coward:

    A 100+ GB hard disk is very important; that's the minimum capacity for a hard-drive VCR, which is the true first generation of the hybrid of the PC and the "Joe" appliance (a quick capacity check follows below).
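
    A rough back-of-the-envelope check of that capacity claim, assuming MPEG-2-class video at roughly 4 Mbit/s (an illustrative bitrate, not from the comment):

        # How many hours of video fit on a 100 GB disk?
        # Assumption (illustrative only): video recorded at ~4 Mbit/s.
        DISK_BYTES = 100e9   # 100 GB
        BITRATE_BPS = 4e6    # 4 megabits per second

        seconds = DISK_BYTES * 8 / BITRATE_BPS
        print(f"~{seconds / 3600:.0f} hours of video")  # roughly 56 hours

    Fifty-odd hours of recording is the range where a disk-based VCR starts to compete with a shelf of tapes, which is presumably the sense in which 100 GB is the minimum interesting size.
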
  • SA: Assuming that the trend will continue for the next 10 years, what do you see happening with all those extra cycles? What are we going to do with that power?

    GM: That becomes an interesting question. Fortunately, the software industry has been able to take advantage of whatever speed and memory we could give them. They've taken more than we've given, in fact. I used to run Windows 3.1 on a 60 megahertz 486, and things worked pretty well. Now I have a 196 megahertz Pentium running Windows 95, and a lot of things take longer than they used to on the slower machine. There's just that much more in software, I guess.



  • "GM: I'm not sure. I feel the same way about my TV and hi-fi sets--all these damn remotes. I don't use them often enough to learn how, and I get so frustrated I could throw the thing through the set. My wife gets even more frustrated than I do."

    I mean, if this guy thinks that using a remote control is so darn difficult, it makes you start to wonder what is really going on in the hardware industry. It seems to be all a matter of luck and chance. I hope it's not really like that. I thought these guys were members of Mensa, or at least capable of becoming members. But the way it looks, they just try to stick with the curve instead of trying to strike out on a new road. Well, what do I know? I don't have a PhD, but I can use a remote control...

  • Maybe this law is sort of a hindrance that keeps companies from pushing the limit, without knowing what the limit "should be."
    Because if you believe that you just need to double your performance, most companies won't spend the extra money on R&D to innovate, because it's too risky. Some companies may try, but too many have been burnt in this wonderful "PC" market. Or should I say "WinTel" market? If it ain't MS, and it ain't Intel, then it ain't a PC. Got to love folks who believe that tripe.

  • So he doesn't like remote controls.

    BIG WHOOP. The man is still one of the veritable gods of computer technology. If you want to make yourself feel superior to this PhD'd genius by the fact that you love using a remote control and he doesn't, you have a real self-esteem problem.


  • by parallax ( 8950 ) on Sunday April 18, 1999 @01:39PM (#1928348) Homepage
    It's amazing that something as revolutionary as the single chip computer could come out of an engineer staring at thirteen separate schematics and saying, "Ok, but what about doing this with one chip?" And then being in the right circumstances to do it.

    The single-chip CPU is arguably the most important development of the late 20th century, and its exponential improvement (Moore's Law) is what drives the information economy. So what happens when Moore's law runs out?

    If current trends are projected forward, by 2020 a bit of memory will be a single-electron transistor, traces will be one molecule wide, and the cost of a fabrication plant will be the GNP of the planet (a rough extrapolation of the doubling curve follows this comment). The speed of light imposes practical limits on how large you can make a chip and how fast you can clock one. This is why we'll have GHz chips, but fundamental physical laws prevent THz chips.

    More importantly, the physical limits that shut down THz electronic computers apply to _any_ classical computing architecture; optical computing and other exotic technology can't beat the speed of light, or single-particle storage problems.

    You can't win by going to SMP, because at best you get a linear increase with each processor; exponential increases in power require exponential increases in processor number, which require exponential increases in space and power consumption.

    The only basis in physics for continuing Moore's law past classical computing is quantum computing. In a quantum computer, N quantum bits (qubits) can represent a superposition of 2^N classical states, which lets the machine's computational capacity scale exponentially with its physical resources. Quantum computing isn't a solved problem, but if and when it is, it will be a revolution as big as the first single-chip CPU.

    parallax
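
    A rough sketch of that extrapolation, assuming a fixed 18-month doubling period and a 1999 starting point of about 7.5 million transistors (both figures are illustrative assumptions, not from the article or the comment above):

        # Back-of-the-envelope Moore's law extrapolation.
        # Assumptions (illustrative only): transistor count doubles every
        # 18 months, starting from ~7.5 million transistors in 1999.
        DOUBLING_PERIOD_YEARS = 1.5
        START_YEAR = 1999
        START_TRANSISTORS = 7.5e6

        def projected_transistors(year):
            """Project transistor count under a constant doubling period."""
            doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
            return START_TRANSISTORS * 2 ** doublings

        for year in (2005, 2010, 2020):
            print(f"{year}: ~{projected_transistors(year):.2e} transistors")

    Under these assumptions the 2020 projection comes out around 10^11 transistors per chip, which is the scale of growth the physical limits discussed above would have to accommodate.
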
  • Would someone kindly explain to me why Moore's law has been taken seriously for so long? For anyone who is going to simply say "because it is right," I will answer right back with "have you ever heard of the concept of a self-fulfilling prophecy?" I also would like to know why this is a law, rather than just an accurate prediction. Hugo Gernsback predicted the proliferation of radio, especially for things besides communications, yet there is only an award in his honor rather than a "scientific" law.

    If this really is a valid law (which someone might explain to me), then I propose the Microsoft corollary to Moore's law, which dictates that it was Microsoft's ballooning software business (and their resource-hogging applications) that drove the law. I find it hard to believe that the law would have held up without such a push; it would have been considered "optimistic" throughout the eighties, as Intel and Motorola would have sat upon their work without any need to improve performance.

    Jesse
  • "More importantly, the physical limits that shut down THz electronic computers apply to _any_ classical computing architecture; optical computing and other exotic technology can't beat the speed of light, or single-particle storage problems."


    You can most certainly build a THz computer. Your signals can't make a round trip across the chip in one clock cycle; so what? You just have several much smaller parts of the chip running asynchronously with respect to each other, and you get power savings to boot. (A quick speed-of-light calculation follows this comment.)


    Re. single-particle storage, you are overlooking the fact that present chips are essentially two-dimensional. While there can be up to a dozen or so metal layers, there is only one diffusion layer in which transistors are fabricated. Build a three-dimensional structure, and you suddenly have a lot more room. I leave as an exercise the question of how to actually build chips like that and how to extract the heat generated by such chips.


    I agree that there will be a fundamental limit due to minimum feature size, and that at some point (barring exponential increases in efficiency), heat dissipation will become proportional to computing power, but I think that the limits are a lot farther away than you place them.


    Quantum computing is an interesting idea, and people now seem to be doing useful work on it. It remains to be seen how easily it can be adapted to the tasks that we currently need computers for (I realize that new tasks that can be solved will crop up suited to quantum computing capabilities, but that won't make the old tasks go away). At present, while interesting and having potential, it remains an idea.
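
    As a rough sanity check on the clock-domain point above: how far can a signal travel in one clock period? The sketch below uses the vacuum speed of light as an upper bound (real on-chip signals propagate considerably slower, so reachable distances are smaller still):

        # Upper bound on how far a signal can travel in one clock period.
        # Assumption: vacuum speed of light; on-chip wires are slower.
        C = 299_792_458.0  # speed of light, metres per second

        for freq_hz, label in [(1e9, "1 GHz"), (100e9, "100 GHz"), (1e12, "1 THz")]:
            distance_mm = (C / freq_hz) * 1000.0  # metres per cycle -> mm
            print(f"{label}: at most ~{distance_mm:.2f} mm per clock period")

    At 1 THz the bound is about 0.3 mm, which is why a THz-clocked chip would have to be organized as many small, locally clocked regions rather than one globally synchronous design.
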

  • There are 3 kinds of curves that occur in nature: sinusoid, decaying exponential, and S-curve (a rising exponential which eventually levels out).


    Chuck a Windows box out of a high window with substantial lateral velocity. Watch its flight path. Looks parabolic, doesn't it? :)


    I realize I'm nit-picking, but there are many cases in nature where you find polynomial curves, especially in relations between parameters (radiative heat emission comes to mind, among other things). Exponentials certainly exist, but I once had the misfortune to be in an argument with someone who thought that they were the *only* kind of curve that existed, and am still touchy about the subject.

  • What I thought was most interesting about the interview is Moore saying that now his law is partly self-fulfilling because the industry drives R&D and plant investment to meet the curve.

    Assuming that the law eventually collapses (I don't know enough of the hardware to guess why exactly...), what happens to long-range planning? What happens to all those Asian memory manufacturers that use Moore's law to make long-term plans? Seems like a lot of chaos is in the offing in the semiconductor industry if enough people come to believe that they can't depend on Moore's law any more...

    And what is Microsoft, and all the other makers of bloat-ware going to do when they don't have Moore's law to bail them out and make their products (almost) usable as they go through upgrade after bloated upgrade?
  • There are no exponential growth curves in nature. All apparently exponential growth curves are really sigmoids ("S curves") which have not yet reached their inflexion.

    This is obvious, really. Things cannot keep growing exponentially indefinitely; limits always intervene. (A small numerical comparison of exponential versus logistic growth follows below.)
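
    A minimal sketch of that claim, comparing a pure exponential with a logistic (S-curve) that has the same early growth rate; the rate and carrying capacity below are arbitrary illustrative values:

        # Exponential vs. logistic (S-curve) growth.
        # Early on the two are nearly indistinguishable; the logistic
        # later flattens out as it approaches its limit.
        import math

        RATE = 0.5      # growth rate per time unit (illustrative)
        LIMIT = 1000.0  # carrying capacity of the logistic curve
        X0 = 1.0        # starting value for both curves

        def exponential(t):
            return X0 * math.exp(RATE * t)

        def logistic(t):
            return LIMIT / (1.0 + (LIMIT / X0 - 1.0) * math.exp(-RATE * t))

        for t in range(0, 21, 4):
            print(f"t={t:2d}  exp={exponential(t):10.1f}  logistic={logistic(t):8.1f}")

    Up to about t=8 the two columns track each other closely; after that the logistic bends toward its limit while the exponential keeps climbing, which is the sense in which an apparent exponential can really be an S-curve that hasn't reached its inflection yet.
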

  • bleh..1997. This is news? :)
  • ... to the words "Read More" in the hyperlink on the home page. Heh heh. :-)
  • The March-April 1999 edition of Technology Review (MIT's Magazine of Innovation) has a series of articles on the competing technologies for patterning silicon features at the sub-micron level. Likely one or more of these technologies will succeed, and one result will be a continuation of Moore's Law.
  • Just to figure out how to make faster computers?! All right, they just need to keep up with id's law (they will release a game every 18 months that is so cool you have to buy a new, faster computer). Really, how many of you have upgraded just to keep playing id stuff? My last four major upgrades coincided with releases:
    386/16 --> 486/50 for Doom
    486/50 --> P100 for Quake
    P100 --> Cyrix 200 (sucked, but cheap at the time) for Quake II
    Cx200 --> Cel333 (+ overclocking) for Quake 3

  • Here are some predictions!

    One hundred years from now all the techie questions asked here of Mr. Moore will be moot because we'll be:

    - bionantech* computers and have to go see a doctor when it's time for an OS update ("Here for your new kernel, Mr. Clued? Sit down, jack into the wall and the doctor will be right with you. Would you like some coffee and a muffin? We have a newer model Sexy Susie if you'd like release before revision?")

    - permanently wired into a universally accessible network where the concept of privacy is a quaint thought which provides a great and continuous source of rather banal jokes traded amongst teenagers, exceptionally bored adults and certain breeds of bionanteched pets.

    - looking forward to vacations where the whole point is to crawl into a dark hole for two weeks and not be "wired" into anything, which often results in two-week vacations becoming two-year vacations as the unplugged run off into the hills of Peru to avoid the twenty-four/seven deluge of plastic-nosed, silicon-breasted babes selling cars, toasters, tweezers and "your very own set of custom-crafted twins -- no mate required!"

    - envious of those lucky enough to have a job that lets one use one's hands and body in the process of doing something constructive, destructive, deconstructive, instructive, obstructive....

    - wondering who the hell ever thought that having a life span of four hundred years was a good idea and if it's possible to find and kill the bastard in such a way as to fully express the totality of hatred for one's tormentors to which a human can aspire and actually achieve.


    bionantech:

    bio, or biosphere - refers to the fifty percent of your grey matter which has been harnessed to provide storage and processing power for your "personal" onboard computer;

    nan - refers to the artificially grown interface between the biosphere (see above) and the "real you" whatever that may or might have been. Includes all accoutrements for sensory enhancement.

    tech - a term once used to refer to external mechanical devices of relatively recent design, now indicative of one's status as belonging to either of the two human groups "tech" or "notech," (or in the vernacular, "tech" or "trash")



    Yikes, I knew I should've taken that nap earlier. *shake*

  • but not for more than 3 years.
  • Stanford's CS Building - Gates Building
    Cambridge's CS Building (in design) - Gates Building
    MIT's new LCS Building (just announced) - Gates Building

    Random trend? I don't think so.
