Forty Years of Moore's Law
kjh1 writes "CNET is running a great article on how the past 40 years of integrated chip design and growth have followed [Gordon] Moore's law. The article also discusses how long Moore's law may remain pertinent, as well as new technologies like carbon nanotube transistors, silicon nanowire transistors, molecular crossbars, phase change materials and spintronics. My favorite data point has to be this: in 1965, chips contained about 60 distinct devices; Intel's latest Itanium chip has 1.7 billion transistors!"
Keeping Count (Score:5, Informative)
That's the Montecito dual-core Itanium, with 24MB of cache (only about 120 million transistors actually per CPU core, with the balance largely that motherlode of cache), and one you could probably fry a steak on.
"We can keep Moore's Law alive just by stuffing the cache!"
"Brilliant!"
"Brilliant!"
Suddenly they were crushed by a giant can of Guinness containing not even an electronic sausage...
Re:Keeping Count (Score:4, Insightful)
If it actually works, then there's little to complain about. Unfortunately, I don't think that things are quite so easy...
Re:Keeping Count (Score:3, Insightful)
Re:Keeping Count (Score:5, Informative)
The problem with bigger & bigger cache is that it has diminishing returns. This is why Intel's "Extreme" chips are a waste of money.
The inability to do anything useful with all those transistors is why we're seeing the advent of multi-core chips, which are neat but fail to preserve the conventional single-threaded programming model. This places the burden of creating explicit parallelism on the programmer, and leads to more complicated code, which means it costs more to write and also contains more bugs.
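The parent's point about explicit parallelism is easy to see in a toy sketch (all names and the four-worker split here are illustrative, not from the thread): summing a list takes one obvious line single-threaded, but the multi-core version forces the programmer to partition the work, manage workers, and merge results, each a new place for bugs.

```python
# Toy illustration of the machinery explicit parallelism demands.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000_000))

# Single-threaded: one obvious line.
serial_total = sum(data)

# Multi-core style: partition the work, manage workers, combine
# partial results -- three new places for bugs.
def partial_sum(chunk):
    return sum(chunk)

n_workers = 4
step = len(data) // n_workers
chunks = [data[i * step:(i + 1) * step] for i in range(n_workers)]
chunks[-1].extend(data[n_workers * step:])  # any leftover elements

with ThreadPoolExecutor(max_workers=n_workers) as pool:
    parallel_total = sum(pool.map(partial_sum, chunks))

assert serial_total == parallel_total
```

Same answer either way, at several times the code and with partitioning and aggregation logic that the single-threaded version never needed.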
Re:Keeping Count (Score:5, Insightful)
It can only work for so long. The biggest problem that is keeping performance down is not the processor but the memory retrieval and writing system: only one memory location can be accessed at any one time. This is also known as the von Neumann bottleneck. Not even clustering can get around this problem because there is a need for inter-process communication that slows things down. If someone could come up with a system that allows unlimited random and simultaneous memory access, the physical limit to processor speed would not be such a big deal anymore. We would have found the holy grail of fast computing.
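The bottleneck the parent describes can be sketched with a roofline-style back-of-envelope model (a deliberate simplification; all the numbers below are made up for illustration): achievable throughput is capped by whichever is smaller, the ALU's raw speed or what the single memory channel can feed it.

```python
# Back-of-envelope sketch of the von Neumann bottleneck: no matter how
# fast the processor gets, one shared memory channel caps throughput.
def achievable_flops(peak_flops, mem_bandwidth_bytes, flops_per_byte):
    """Roofline-style model: you get the lesser of what the ALU can
    compute and what the memory channel can deliver operands for."""
    memory_bound_limit = mem_bandwidth_bytes * flops_per_byte
    return min(peak_flops, memory_bound_limit)

peak = 10e9        # 10 GFLOP/s of raw ALU speed (illustrative)
bandwidth = 4e9    # 4 GB/s single memory channel (illustrative)
intensity = 0.5    # 0.5 floating-point ops per byte moved

# Doubling processor speed does nothing for a memory-bound workload:
print(achievable_flops(peak, bandwidth, intensity))      # capped at 2e9
print(achievable_flops(2 * peak, bandwidth, intensity))  # still 2e9
```

Which is exactly why "unlimited simultaneous memory access" would matter more than faster processors.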
Re:Keeping Count (Score:2, Interesting)
Re:Keeping Count (Score:5, Funny)
Re:Keeping Count (Score:3, Funny)
they are just very, very small.
Actually they're rather large, but cleverly Intel have found a way to store them in an alternate universe using Portable Blackhole Technology(TM). Cross your fingers and hope nobody in that alternate universe stumbles across them.
Re:Keeping Count (Score:4, Funny)
Not to start a flamewar here, but AMD's Micro Singularity Architecture(TM) is vastly superior to Intel's PBT.
Re:Keeping Count (Score:3, Funny)
Re:Keeping Count (Score:3, Funny)
Do you have a source for the 120M transistors ? (Score:3, Informative)
The way I see it, 24 MB = 1024*1024*8*24 * 6 transistors/SRAM cell = 1.2B transistors for cache, still leaving 500M for logic. Well, we can factor in address storage and cache access logic, but I'd still like to see some harder data than this.
Paul B.
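The parent's arithmetic checks out: assuming a classic 6T SRAM cell (six transistors per bit, ignoring tags, ECC and decode logic), the data array alone for 24 MB of cache comes to:

```python
# Checking the parent's back-of-envelope SRAM math.
CACHE_MB = 24
TRANSISTORS_PER_SRAM_CELL = 6  # classic 6T cell; ignores tag/ECC/decode

bits = CACHE_MB * 1024 * 1024 * 8
cache_transistors = bits * TRANSISTORS_PER_SRAM_CELL
print(cache_transistors)  # 1207959552, i.e. ~1.2 billion

# which would leave roughly 500M of the 1.7B for everything else:
print(1_700_000_000 - cache_transistors)  # 492040448
```

So the 120M-for-logic figure only works if substantially more than 24 MB worth of cells (multiple cache levels, tags, redundancy) are counted as "cache".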
Re:Do you have a source for the 120M transistors ? (Score:5, Informative)
I don't know how many additional SRAM cells Intel is planning in each of the cache levels, so the 1.2B transistors for cache can climb up to 1.4-1.6B.
Someone posted a number of 1.47B transistors for the L3 cache at Real World Tech [realworldtech.com]. I'm not sure how credible or accurate that number is.
Another article on RWT shows an approximate die floor plan and other info at:
http://www.realworldtech.com/page.cfm?ArticleID=R
Re:Do you have a source for the 120M transistors ? (Score:2)
Re:Keeping Count (Score:3, Funny)
Number of transistors = 1.7 billion
Number of units sold = 1.7K
Money invested = gazillion dollars
Tasting dirt from your puny competition (read AMD) = priceless
Don't hold your breath... (Score:5, Informative)
Re:Don't hold your breath... (Score:2, Informative)
Maybe not, but there's certainly been a bit of a stall in progress recently: no notable new desktop CPUs, and certainly no increase in complexity, component count or speed (unless you want to count cache). Nothing in the last 18 months has fulfilled the criteria set out in Moore's Law. Having said that, this anomaly only applies to CPUs.
I would hazard a guess that the law still holds true in memory - major advances there in
Re:Don't hold your breath... (Score:2)
Re:Don't hold your breath... (Score:2)
Physics to your heart's content.
Re:Don't hold your breath... (Score:4, Interesting)
And what about Nvidia? Their last product jump, from the 5900 to the 6800, was absolutely amazing: a very clear 100% increase in performance. I'd be very surprised to see Nvidia match that leap sooner than Q4 2006.
Re:Don't hold your breath... (Score:2)
Maybe, maybe not.
Clock speeds of GPUs have been inching upwards just like those of CPUs, but the number of pipelines has been growing rapidly - from 8 to 16 in your example. They haven't hit a practical limit there yet, though power consumption is getting to be a worry. You might well s
Re:Don't hold your breath... (Score:2)
My beef is that I'm still running a dual PIII setup (1.2GHz). The latest and greatest desktops from Intel or AMD really don't offer enough of a performance boost to warrant _upgrading_ from a dual to a single chip. There are no desktop dual-chip boards for the P4. So, I can get more raw speed but sacrifice awesome multitasking performance. Wtf? And no, a single Hyperthreaded P4 most certainly does not compare to 2 dedicated chips. It's just a neat trick that can help in some cases. (Been using HT Xen
Re:Don't hold your breath... (Score:2)
The ARM was designed in the mid-80s as a general-purpose CPU for the British Acorn computer (originally ARM stood for Acorn RISC Machine).
Because of its very efficiently coded instruction set, it turned out to use very little power. This is why, in subsequent years, it started to find its way into embedded applications.
After Acorn went bust, ARM remained as the only profitable part of the company, focusing mainly on embedded applications of i
Re:Don't hold your breath... (Score:2)
You know what bothers me even more? (Score:3, Informative)
Sure, taking Moore's law literally, computers are 1 million times faster than 30 years ago. Arguably that should translate into _more_ than 1 million times more work per second, because compilers have evolved too, and expensive optimization techniques have become more affordable. (A compiler optimization technique that would have taken a week on a 70's mainframe, now takes seconds.) We also have better tools.
But are we doing 1 million times more with the
Kinda obvious.... (Score:3, Insightful)
Before anyone says: well, we've adjusted the length of time for doubling already, we'll do it again. For what it's worth, it's a bit silly calling X = 2^(Y/T) a law if you redefine T every time it doesn't fit.
Re:Kinda obvious.... (Score:4, Interesting)
Moore's original observation, that transistor density doubles every 18 months, will obviously cease to apply once it becomes impossible to make transistors. But as long as that feedback loop continues to churn, it continues to make sense to talk about Moore's law.
Re:Kinda obvious.... (Score:5, Insightful)
His observation was made to Electronics magazine, in the April 19th, 1965 edition.
He didn't mention transistor density.
He didn't mention processors (as microprocessors were still 6 years away from being invented).
He was describing component integration on economical integrated circuits.
He observed that component integration doubled approximately every 12 months. He increased that number to 24 months, in 1975. Since then, other people have split the difference to 18 months.
None of those figures, 12, 18 or 24 months, are accurate.
If the 18 month figure was accurate, today's chips would have 75 Billion transistors.
With his original 12 month figure, 27 Trillion.
With his revised 24 month figure, 37 Million...
Also, this isn't even a law... it's an observation.
Please note... I relied on Tom R. Halfhill's column in Maximum PC (April 2005) "The Myths of Moore's Law" for this reply.
The Lesser Known Part 2 of Moore's Law... (Score:5, Funny)
law? (Score:2, Insightful)
Re:law? (Score:3, Funny)
Re:law? (Score:3, Informative)
Re:law? (Score:2)
You checked the wrong book.
Mathematical theories are very different to scientific theories.
Educated people (Score:2)
This is how we get a better and more refined understanding.
Re:law? (Score:2)
Solving problems. (Score:2, Insightful)
What I think is more interesting is how far ahead we can solve them. The clock distribution problem was foreseen and solved years ahead of it biting hard. Nowadays the problems arise and we have shorter and shorter time to react before they cause serious problems.
This is the strongest proof I found that this te
Slashdot corollary (Score:5, Funny)
Re:Slashdot corollary (Score:3, Funny)
Is there already a Law that says... (Score:4, Funny)
If not I hereby proclaim it Goat's Law.
Re:Is there already a Law that says... (Score:2)
*sigh*
Data point? No, two points! (Score:5, Funny)
Uh, wouldn't that be two data points?
Nope. (Score:2)
The second point is a datum, the first point is a reference. If you say "this site is 150 meters above sea level", how many data points do you have?
Re:Data point? No, two points! (Score:2)
1.7 Billion? (Score:4, Funny)
No wonder they call it the Itanic! Both were huge and failed miserably.
Re:1.7 Billion? (Score:2)
It's not a law... (Score:5, Insightful)
By the way, if the Itanium has 1.7 billion transistors, (I'll take the poster's word for it) then one has to ask - are they all pulling their weight? It seems a hell of a lot for what it does. Surely one way to squeeze more out of Moore's Observation is to come up with more efficient architectures and use fewer devices, working more efficiently/smarter/harder. Just a thought.
Re:It's not a law... (Score:2)
That would be true, if it weren't for the fact that it is a law.
A law is just a general or universal statement of the way things are. Some are imposed by man, some are imposed by nature, and others are based on the observation of trends.
That's what makes Murphy's Law a law, what makes Godwin's Law a law, and, yes, what makes Moore's Law a law.
Laws don't even have to be right to be a law.
it's an observation
It's that too. These aren't mutually exclusive things.
Did you know the term 'l
Michael Moore's Law? (Score:3, Funny)
Re:Michael Moore's Law? (Score:2, Funny)
Moore's Law is Dead (Score:5, Funny)
Good reason for that (Score:2)
Self fulfilling (Score:5, Interesting)
Also, for the record as a physicist, quantum computers won't remove the need for conventional computers in most areas - a big thing is (as I understand it) that they're not programmable, and have to be built to a certain specification. Therefore, classical computers will always have their use.
It definitely has less than 300 - 400 years. (Score:5, Interesting)
Re:It definitely has less than 300 - 400 years. (Score:3, Insightful)
Re:It definitely has less than 300 - 400 years. (Score:2)
Re:It definitely has less than 300 - 400 years. (Score:5, Informative)
Re:It definitely has less than 300 - 400 years. (Score:2)
Thanks for the useful maths. However, why did you feel the need to correct the original poster's 300-400 year estimate? Your more refined estimate does not invalidate the OP. And besides, given the amount of (intelligent) guesswork in the calculations, I think it'd be more realistic to use 300-400 years than 335-356 years.
Anyway, app
Graphs???? (Score:4, Interesting)
Re:Graphs???? (Score:2)
Re:Graphs???? (Score:2)
Taking some general numbers:
Current Number of Transistors: 1.7G
Initial Number of Transistors: 60
Time Frame: 39 years.
So now to get from 60 to 1.7G, you have to double 60 approximately 24.755997 times. That means that for a 39 year period, we have the chip density double every 1.575376 years or every 18.9 months.
Looking at it another way, the actual time line is 22 months behind the prediction or the chip boys are off on their prediction by 4.7%.
Based on my experience, any proje
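The parent's doubling arithmetic can be re-run in a few lines (same inputs: 60 devices in 1965, 1.7 billion transistors 39 years later):

```python
import math

# Re-running the parent's numbers to check the implied doubling period.
initial, current, years = 60, 1.7e9, 39

doublings = math.log2(current / initial)
print(round(doublings, 2))          # ~24.76 doublings

period_years = years / doublings
print(round(period_years * 12, 1))  # ~18.9 months per doubling
```

Which confirms the ~18.9-month figure, within a month of the popular 18-month formulation.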
Re:Graphs???? (Score:2, Interesting)
http://www20.tomshardware.com/cpu/20041220/index.html [tomshardware.com]
Now to explain why you're asking the wrong question: Moore's observation says nothing directly about performance. He merely suggested that the complexity of ICs double every 18 months or so. In general, this has nothing to do with a comparable trend in clock speeds on CPU
Re:Graphs???? (Score:2)
Re:Graphs???? (Score:2)
Re:Graphs???? (Score:2)
Re:Graphs???? (Score:2)
Bugs (Score:5, Interesting)
And don't give me any crap about software being somehow inherently harder to keep bug-free. I develop both, and there really is little difference when it comes to complexity.
Sure, software performs more complex tasks, but when you add the parallel nature of hardware, as well as timing issues, temperature and manufacturing issues, clock distribution, leakage and crosstalk, hardware is definitely a pretty good match.
The simple truth is that vastly more testing goes into hardware than into most software (software in Mars rovers and lunar landers would be an exception). And I bet there are better design methods and safety guards too.
Re:Bugs (Score:5, Insightful)
- Software usually performs a more diverse set of operations
- The environment where hardware runs is more predictable than the software one
- Formal verification is probably easier to perform with hardware.
- It's easier to verify low level stuff than high level abstractions.
I'd add more, but I've got other things to do unfortunately...
Reactive Programming (Score:2)
True. And this is the reason that we should be writing software pretty much the same way logic designers design logic circuits. That's the basic idea behind synchronous reactive programming languages like Esterel, Signal, Occam and others. Also check out Project COSA at the link below.
Re:And most programmers are crap (Score:2)
Re:Bugs (Score:2)
The truth is not so simple. Given that the largest part of a modern CPU is cache, as opposed to logic, the transistor count does not reflect the net complexity. If one considers the ISA of a CPU to be its specification, a chip is a far less complex construct than a non-trivial piece of software. ISA evolution is measured in years and decades. An equivalent piece of software has a relatively small nu
Re:Bugs (Score:2, Insightful)
While you're right about most of the transistors being cache, the fact is that chip designs do go through a lot more testing (i.e. simulation) than most software.
Largely it's economics. It's been a few years since I was involved in chip design (0.25 um) stuff, but IIRC it cost a few hundred $k just to make the masks for a silicon rev. At least 90% of the effort went into simulations and testbenches that are run before you see first silicon. The only software that gets that kind of
Re:Bugs (Score:4, Insightful)
After the FDIV bug, they added a means of "patching" the instruction set in software as part of the BIOS boot procedure. Of course, there is no substitute for testing the hell out of it as much as possible before releasing.
Software can be just as reliable if you put the effort into it. Usually it isn't done, because it is usually easy to patch the software on the fly, but a bad ASIC bug means an expensive respin.
Hardware design is actually software design anyway - they have special languages for it, such as Verilog and VHDL. If you have a foot in both camps, you would be surprised how little difference there is between hardware and software design methodologies.
It must be a hardware issue (Score:2)
Re:Bugs (Score:2)
Austin Powers (Score:5, Funny)
"I demand the chip have...SIXTY TRANSISTORS!" (pinky lightly touches corner of mouth).
The guys at Intel start laughing hysterically...
"I've changed my mind...I demand the chip have...ONE POINT SEVEN BILLION TRANSISTORS!" (pinky lightly touches corner of mouth)
Intel guys gasp in shock...
Tracing it back... (Score:2, Funny)
Moore's Law is probably being exceeded at... (Score:5, Insightful)
Re:Moore's Law is probably being exceeded at... (Score:3, Interesting)
Or it means... (Score:2)
Code written for GPUs is inherently streaming code, and hence easily parallelisable, so many of the complex dependencies that make CPUs tricky to speed up go away.
I think you put the cart before the horse there. The task you're trying to solve must be easily parallelizable and thus free
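The streaming-versus-dependency distinction the two posters are arguing about can be sketched in a few lines (the pixel-brightening example is illustrative, not from the thread): a per-element map has no cross-element dependencies and parallelizes trivially, while a running accumulation forms a chain where each step needs the previous result.

```python
# Streaming (GPU-friendly): each output depends only on its own input,
# so every element could be computed on a separate pipeline at once.
pixels = [0.1, 0.5, 0.9, 0.3]
brightened = [min(1.0, p * 1.2) for p in pixels]

# Dependency chain (inherently serial): each step needs the previous
# result, so extra parallel hardware cannot shorten the chain.
acc = 0.0
running = []
for p in pixels:
    acc += p
    running.append(acc)
```

Whether that independence lives in the code or in the task itself is exactly the cart-and-horse question above.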
No cart before horse (Score:2)
Law of Accelerating Returns (Score:5, Interesting)
Basically, it has been observed that any evolutionary process (including technology) will progress exponentially as it builds on past progress, with barely perceptible slow-down/speed-up "S-curves" as paradigm shifts occur.
Moore's Law is certainly an important component of this trend, as it relates to computing power and eventual AI/IA accelerating to Singularity [singinst.org] in ~25 years, but there are many others in parallel: storage space, networking bandwidth, # of internet nodes, transportation speed, etc.
One thing that certainly ISN'T keeping pace with our technology is our old evolutionary psychology; hopefully we can fix [hedweb.com] some of the more disgusting aspects of human nature before it's too late [gmu.edu].
Re:Law of Accelerating Returns (Score:3, Insightful)
In addition... (Score:2, Informative)
Peak Oil folks take one valid idea (oil is finite, and running out will be painful), but then devolve into irrational fear-mongering about it. If thermal depolymerization c
Heard of it... (Score:3, Interesting)
Re:Heard of it... (Score:2)
So just shut up and don't pretend to understand things that you really don't. Thanks.
Re:For your information... (Score:3, Interesting)
Going back to your original post, the evidence that faster hardware means human and then more than human AI is as strong as it can be at this stage. We haven't found anything odd in the human brain that can't be simulated (and already simulated some p
Gates Law (Score:4, Funny)
Another Good Quotable (Score:4, Interesting)
From Popular Mechanics, march 1949:
"...computers in the future may have only 1000 vacuum tubes and perhaps weigh only 1 1/2 tons."
that would explain the puff piece in Time Mag. (Score:2)
007 (Score:2, Funny)
Then you will have Lazenby's, Connery's, Dalton's, then (perhaps) Brosnan's law fail as well. Some laws can be.....broken, and twisted, and....um suckey. That last illiterative is mine....all mine, Mr. Bond.
I bet there were a lot of nerds celebrating... (Score:2, Funny)
...in 1956, when they managed to fit one component on to a device.
1.7 billion? Hmmmm (Score:2)
And it's as hot as one vacuum tube!
(insert drum-roll and cymbal hit)
40? (Score:4, Funny)
28 doublings = 17 months (Score:2)
Re:Typical /. Subject. (Score:4, Funny)
Re:When was the last time Moore's law was correct? (Score:2, Interesting)
All it says is that the number of transistors you can fit in a fixed area doubles roughly every 18 months (or, expressed another way, the area of a transistor is halving every 18 months.)
Making transistors smaller does tend to mean you can run circuits faster because you can switch state faster (which, in turn, also reduces the dynamic component of your power consumption), but it's not just a simple linear relationship between size and speed.
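One detail worth spelling out from the parent's phrasing: if transistor *area* halves every 18 months, each linear dimension only shrinks by a factor of 1/sqrt(2) per period. A quick sketch (function name and periods are illustrative):

```python
import math

# Linear-dimension shrink implied by area halving every `doubling_months`.
def feature_scale(months, doubling_months=18):
    """Factor by which a transistor's linear dimension shrinks after
    `months` of area halving at the given cadence."""
    area_halvings = months / doubling_months
    return (1 / math.sqrt(2)) ** area_halvings

print(round(feature_scale(18), 3))  # one period: dimensions ~0.707x
print(round(feature_scale(36), 3))  # two periods: area 1/4, length 0.5x
```

So headline "transistor count doubled" translates to a roughly 30% feature shrink per period, which is part of why speed does not simply track count.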
Re:When was the last time Moore's law was correct? (Score:2)
It has to do with the number of transistors doubling every 18 months or so. This has held true.
Re:When was the last time Moore's law was correct? (Score:5, Informative)
First of all, Moore's Law implies that the number of transistors per integrated circuit will double every 18 months (which is not really what he said; see Understanding Moore's Law [arstechnica.com]).
Second of all, this has held true and is continuing to hold true.
Third of all, clock speed does not reflect transistor number or density, neither of which are the sole contributing factor to 'power' or 'performance'.
I don't know what's sadder; wondering if the parent was actually a joke, or wondering how it got +5 insightful. Damn.
Re:When was the last time Moore's law was correct? (Score:2)
Re:Electronics Magazine! (Score:3, Informative)