Knowledge Center

Moore's Law

Gordon Moore's observation about the growth in complexity of semiconductor devices.


In the semiconductor industry we talk about Moore's law all the time, and yet few understand the history of the so-called law or what it actually says. As a result, it has almost come to mean whatever people want it to mean, with the general assumption that it predicts future increases in size, complexity, or some other metric.

Gordon Moore was the director of R&D at Fairchild Semiconductor, one of the early leaders in the development of integrated circuits. In a 1965 paper published in Electronics, titled "Cramming More Components onto Integrated Circuits," he wrote: "Integrated circuits will lead to such wonders as home computers—or at least terminals connected to a central computer—automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today."

The observation came from previous designs that Fairchild had completed. When he drew a straight line through those data points, he projected that by 1975 the number of components per chip would be around 65,000, representing a doubling every 12 months.
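The arithmetic behind that extrapolation can be sketched in a few lines. This is an illustration, not Moore's own calculation: it assumes his 1965 plot effectively started from a single-component device in 1959, with complexity doubling every 12 months, which lands close to the 65,000 figure for 1975.

```python
# Illustrative sketch of the 1965 extrapolation (assumed starting point:
# roughly one component per device in 1959, doubling every 12 months).
def components(year, base_year=1959, base_count=1, doubling_years=1.0):
    """Components per chip at minimum cost under a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(components(1975))  # 2^16 = 65536, i.e. "around 65,000"
```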

But the paper did not just talk about increasing component counts. Cost was an essential part of the trend: the curves on which the observation was based were cost curves, which took into account considerations such as the expected yields of the devices. His actual statement was: "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year." So the 65,000 projection was the number of components at minimum cost. Interestingly, the pitch at that time was two-thousandths of an inch, die size was around 2 mil square, designs had just gone to two-layer metal, and wafer sizes were 1 inch.

Interestingly, power and heat were already considerations at that time. Moore wrote: "shrinking dimensions on an integrated structure makes it possible to operate the structure at higher speed for the same power per unit area." He also noted that not all circuits would benefit equally: "Integration will not change linear systems as radically as digital systems."

It was Carver Mead, a Caltech professor and co-author of the Mead-Conway VLSI design book, who first called it a law in the early 1970s.

At the time of the original extrapolation, the circuits in question were very dense circuits, such as memories. By 1975, while working at Intel, Moore changed the slope of the curve to reflect the realities of the circuits being designed at the time, notably processors: a doubling every two years. Some have called the gap between the memory projection and the line for digital designs the design gap.

Around 1975, David House, an Intel executive, revised the figure to a doubling every 18 months. However, Moore never formally agreed to that.
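The difference between these cadences compounds quickly, which is why the choice of doubling period matters. A minimal sketch, assuming nothing beyond simple exponential growth over a decade for each of the three figures mentioned above:

```python
# Compounded growth over a given span for a fixed doubling period.
def growth_over(years, doubling_months):
    """Multiplicative growth factor after `years` at one doubling per `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

for months in (12, 18, 24):
    print(f"{months}-month doubling: ~{round(growth_over(10, months))}x per decade")
```

Over ten years, 12-month doubling gives roughly a 1000x increase, 24-month doubling only 32x, with House's 18-month figure in between.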

In 2005, Moore began to play down his own law, saying in an interview that "the nature of exponentials is that you push them out and eventually disaster happens." At the same time, he put the timeframe for its breakdown at 10 to 20 years, at which point fundamental limits would be reached.




