Integrated circuits are huge and tiny

Moore's Law has proved to be remarkably accurate

Two score years ago, a young Silicon Valley executive observed in the pages of an industry journal that the integrated circuit eventually would become cheap enough to be embedded in our daily lives.

The term "Silicon Valley" hadn't yet been coined (that wouldn't happen until 1971). Integrated circuits, or chips, were so expensive they were considered suitable only for multimillion-dollar projects, like Apollo moon missions. The home computer wouldn't appear for nearly 20 years.

Yet Gordon E. Moore, then the research and development director of Fairchild Semiconductor, which he had co-founded, foresaw in 1965 that silicon chips were going to plummet in price. He even tried to approximate the rate, projecting a drop of 50 percent a year for the next 10 years.

"I never had any idea it was going to be at all precise," he said last month of his prediction. Yet it was accurate enough to have since gained the stature of holy writ: Moore's Law.

Moore's Law has become one of the most quoted insights in science and industry. Stripped of its technical details, it amounts to an expression of faith in the march of technology - the inexorable trend in electronics toward the smaller, the cheaper, the more powerful.

"Moore's Law is an example of a tangible belief in the future - that the future isn't just a rosy glow," says Carver Mead, Caltech's emeritus Betty and Gordon Moore Professor of Engineering and Applied Science, who joined Moore in a public discussion celebrating the law's 40th anniversary. (Mead, who is generally credited with the term "Moore's Law," believes he first so described Moore's insight during a casual chat with a writer many years ago. "It certainly wasn't anything premeditated," he told me.)

Unusually for a senior statesman of electronics, Moore was trained as a chemist. Armed with a doctorate from Caltech, he was recruited by William Shockley, soon to win the Nobel Prize as co-inventor of the transistor, to join his pioneering Palo Alto electronics company in 1956. How new was the field? "Hardly anybody knew anything about semiconductors," Moore related, "especially silicon."

Shockley's paranoid management style soon drove Moore and seven other executives to quit. They formed Fairchild, earning the lifelong enmity of Shockley, who labeled them "the traitorous eight." In 1968, along with Robert Noyce, another member of the original renegade group, Moore left Fairchild and co-founded what would become semiconductor giant Intel Corp.

At 76, Moore projects a wry, self-effacing sense of humor about his distinguished career. The hearing aids he wears as a concession to age, he says, are the consequence of his youthful infatuation with chemical experimentation in his parents' Redwood City home. A drop of homemade nitroglycerin placed on an anvil and slammed with a hammer, he had discovered, "made an absolutely superior firecracker."

Moore's Law first appeared in an article he published in the journal Electronics on April 19, 1965. Inelegantly titled "Cramming more components onto integrated circuits," the piece observed that (technically speaking) the complexity of the most economical chips being manufactured had approximately doubled every year for the prior four years, and that their cost per component had fallen at almost the same rate. Projecting the timeline out another decade, he estimated that by 1975 the model chip would hold 65,000 transistors on a wafer of silicon a quarter-inch square.

He was not off by much; there would be nine doublings of complexity in that 10-year span. Subsequently, however, the pace slowed, largely because chip designers had already exploited the easiest miniaturization strategies. Since 1975, the doubling has come every 21 months, on average. Even at that pace, the power of electronics has exploded. A version of Intel's Itanium 2 chip introduced last year holds 592 million transistors. Its forthcoming Montecito chip will comprise 1.72 billion.
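The compounding behind these figures is easy to check. Here is a minimal sketch in Python; the rates and dates are the article's, while the 64-component starting count is an assumed round number, not stated in the text:

```python
# Compounding arithmetic behind the article's Moore's Law figures.

# 1965 projection: one doubling per year for ten years.
# Starting from ~64 components (an assumed round figure), ten
# doublings land almost exactly on Moore's "65,000 transistors":
components_1975 = 64 * 2 ** 10
print(components_1975)  # -> 65536

# 1965 cost projection: a 50 percent drop per year for ten years
# shrinks cost per component to under a thousandth of its start:
cost_fraction = 0.5 ** 10
print(cost_fraction)  # -> 0.0009765625

# Post-1975 pace: one doubling every 21 months is equivalent to
# growth of 2 ** (12 / 21), roughly 1.49x, per year:
annual_growth = 2 ** (12 / 21)
print(round(annual_growth, 2))  # -> 1.49
```

Even the slower 21-month pace compounds dramatically: at roughly 1.49x per year, transistor counts multiply about a thousandfold every 21 years.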

Moore is careful to explain that he was not intending to draw a technological roadmap in 1965, but merely arguing that silicon chips would have more applications than anyone anticipated. "People thought integrated circuits were very expensive, that only the military could afford the technology. But I could see that things were beginning to change. This was going to be the way to make inexpensive electronics."

His foresight was exemplary; the 1965 article predicted that the drop in cost would "lead to such wonders as home computers ... automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today."

Baltimore Sun Articles