In Memoriam – Gordon Moore, who put the more in “Moore’s Law”


Gordon Moore, co-founder of Intel, has died at 94.

Academically, Moore was both a chemist and physicist, earning a Bachelor’s degree in chemistry from the University of California at Berkeley in 1950, and a Doctorate in physical chemistry and physics from the California Institute of Technology in 1954.

After a brief interlude as a researcher at Johns Hopkins University in Maryland, Moore returned to his native San Francisco in 1956 to work for the co-inventor of the transistor, William Shockley, at the startup Shockley Semiconductor Laboratory in Mountain View.

Although Shockley has been described by Jacques Beaudoin of Stanford University as “the man who brought silicon to Silicon Valley”, he was a controversial figure even in his own heyday (to be blunt, he was an unreconstructed racist), and was by many accounts an abrasive, divisive, perhaps even paranoid manager.

By 1957, Moore and seven other Shockley Semiconductor staffers had had enough of Shockley, and decided to break away to form their own startup instead, with what’s known these days as venture capital injected by a cash-rich East Coast camera company, Fairchild Camera and Instrument.

Startup breakaways may be routine in the technology industry these days, but they weren’t common at all in the 1950s, and Moore and his fellow entrepreneurs went down in history under the dramatic nickname of “The Traitorous Eight”.

The company that the Traitorous Eight founded, Fairchild Semiconductor, was quickly successful, and is officially recognised by the State of California as the producer of the “first commercially practicable integrated circuit.”

Based on patents granted and overturned over the years, credit for actually inventing the integrated circuit see-sawed between Jack Kilby of Texas Instruments and Robert Noyce of Fairchild, with both of them ultimately acknowledged as joint inventors. Sadly, by the time Jack Kilby was recognised with a Nobel Prize in Physics in 2000, Noyce had been dead for a decade, and Nobel prizes can’t be given posthumously, so Kilby received the award on his own.

What took you so long?

By 1968, Moore was ready for another breakaway, and he and Robert Noyce left Fairchild to form a new startup of their own, along with deal-maker Arthur Rock.

Rock, originally from New York, helped the Traitorous Eight get their seed money from Fairchild Camera and Instrument in the 1950s; he had moved to San Francisco in the early 1960s to go into hi-tech adventure capitalism (apparently, venture capital had a more exciting name in those days).

According to Walter Isaacson, writing in his book The Innovators, when Noyce called Arthur Rock in 1968 to ask for help attracting backers for the company that he and Moore wanted to create, Rock replied with a single question: “What took you so long?”

Apparently, Moore and Noyce toyed with the precise but unadventurous company name Moore Noyce, but soon realised that when said aloud, it was easily confused with “more noise”, an undesirable attribute in electronic circuits.

They incorporated, it seems, as NM Electronics, but quickly switched to Integrated Electronics.

Integrated Electronics was in turn a short-lived name, with the company soon known by the shortened form it has retained to this day: Intel.

Moore’s Law revisited

Ironically, perhaps, Moore is probably most widely known today not for the entrepreneurial enthusiasm, engineering excellence and business acumen that he brought to Intel during his long and storied career…

…but for a brief article that was published in Electronics magazine in April 1965, three years before he started Intel with Robert Noyce.

The article was enthusiastically entitled Cramming More Components onto Integrated Circuits, and its third sentence is preternaturally prescient (remember, this was written almost 60 years ago):

Integrated circuits will lead to such wonders as home computers – or at least terminals connected to a central computer, automatic controls for automobiles, and personal portable communications equipment.

Intriguingly, we now live in a cloud-centric computer ecosystem in which, for many of us, our most expensive single piece of personal computing equipment is neither a laptop for offline work, nor a terminal for hooking up to a powerful central “mainframe” computer service, nor a two-way radio for keeping in touch from afar…

…but a device that we still anachronistically refer to as a “mobile phone” that does all of these things, and much, much more. (No pun intended.)

Two famous graphs

Moore presented two simple graphs in his article.

The first, and perhaps the more important of the two, suggested that the only way to keep improving the performance of an integrated circuit would be to keep making the individual components in the circuit smaller.

You couldn’t merely keep making the chip itself bigger to give you more room for components.

Moore suggested, perhaps counterintuitively to many readers at the time, that given the same manufacturing process with the same component size, reliability falls (and thus cost starts increasing) as you try to integrate more components into a finished chip:

The lowest point of each U-shaped curve denotes the component count “sweet spot” for each manufacturing process. (The 1970 curve is a prediction, given that the graph was published in 1965.)

In other words, it’s not enough to add more components to a chip just by using more space, because you soon reach a natural limit imposed by the manufacturing process itself.

As the title of the article suggests, you need to change the process as well, so you can quite literally cram in the extra components you need, rather than simply letting them spread out around the edges.

The second graph in the article is the one for which Moore is probably best remembered, even though it has just four true data points on it.

Moore suggested that this price-performance sweet spot, based on the ongoing miniaturisation of component sizes, had increased exponentially from 1962 to 1965.

In other words, if you plotted a graph with a linear scale on the X-axis (time) and a logarithmic scale on the Y-axis (number of components in a chip, which we today loosely refer to as transistor count), you’d get a straight line.

The 2⁰ = 1 value for 1959, which happens to line up fairly nicely, denotes that although the company had invented a process for making integrated circuits at that point, the products it had to sell were all still individual, standalone transistors, each with a component count of one:

Looking ahead 10 years, Moore therefore conjectured that by 1975, we might reasonably expect chips with 2¹⁶ components (about 65,000) baked into them – an astonishing acceleration in potential computer power.
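
To make that arithmetic concrete, here’s a minimal Python sketch of the 1965 projection. The starting value of 2⁶ = 64 components in 1965 is an assumption made here for illustration (Moore’s own 1965 data point isn’t quoted above), chosen so that ten annual doublings land on the roughly 65,000-component figure:

```python
# A rough sketch of the 1965 projection: a component count that doubles
# every year traces a straight line on a log2 scale.
# The 1965 starting point of 2**6 = 64 components is an assumption made
# here for illustration, not a figure taken from Moore's article.

import math

start_year, start_count = 1965, 2**6

for year in range(start_year, 1976):
    count = start_count * 2 ** (year - start_year)   # one doubling per year
    print(f"{year}: {count:>6,} components (log2 = {math.log2(count):.0f})")

# By 1975 the count reaches 2**16 = 65,536 -- the "about 65,000" quoted above.
```

Because the log2 column goes up by exactly one step per year, the same data plotted on a logarithmic Y-axis falls on a straight line, just as described above.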

Not quite, but nearly so!

In real life, things didn’t quite turn out that way.

Intel’s own 8086 microprocessor, for example, released in 1978, had a transistor count of just under 30,000, close to 2¹⁵, but Moore’s original prediction was for chips to accommodate 2¹⁹ components by then, or more than half a million.

Indeed, by 1975, Moore had adjusted his estimate to a doubling of component counts every two years, rather than every year, along with the necessary reduction in size of each component in the integrated circuit.

That prediction of exponential growth became known as Moore’s Law, and although it isn’t in any literal sense a law, and although we haven’t quite kept up with it in the way he predicted…

…we’ve come surprisingly close.

The mark of a Mage

Although it’s not really comparing like with like, let’s line up a 1978-era Intel 8086 microprocessor against a 2022-era Apple M2 system-on-chip.

The M2 arrived 44 years after the 8086, which is time for 22 two-year doublings, as Moore’s Revised Law of 1975 would predict.

That would take the M2’s theoretical component count from 2¹⁵ to 2¹⁵⁺²² = 2³⁷, or just under 140 billion.

The M2 takes up 150mm² – that’s what’s known as its die size, the actual dimensions of the silicon chip inside the package that’s soldered to your new Mac’s motherboard.

Amazingly, that’s less than five times larger than the 8086, which was a more modest 33mm², but the M2 die has a component count of about 20 billion, or just over 2³⁴.

That might not be exactly what the Revised Law of 1975 predicted, but it’s hard to quibble with such a modest difference over such a long time.
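
For anyone who wants to reproduce that back-of-the-envelope comparison, here’s a short Python sketch using the same rounded figures as above (2¹⁵ for the 8086, and an approximate 20 billion transistors for the M2):

```python
# A back-of-the-envelope check of the revised (1975) two-year doubling rule,
# using the approximate transistor counts quoted above.

import math

count_8086 = 2**15            # the 8086's ~30,000 transistors, rounded to 2**15
count_m2   = 20_000_000_000   # roughly 20 billion transistors in the M2

doublings = (2022 - 1978) // 2           # 44 years => 22 two-year doublings
predicted = count_8086 * 2**doublings    # 2**(15+22) = 2**37

print(f"Revised-law prediction: {predicted:,} (2**{15 + doublings})")
print(f"Actual M2 count:        {count_m2:,} (about 2**{math.log2(count_m2):.1f})")
# Prediction: ~137 billion; reality: ~20 billion, or just over 2**34.
```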

Usually, when a technology commentator tells you that something “is growing exponentially” – whether that’s the hacking abilities of cybercriminals, the value of a new cryptocoin, or whatever they’re interested in talking up at the time – you know to treat their remarks as mere marketing metaphor.

True exponential growth is usually short-lived simply because you quickly run out of resources to keep up the regular doubling, so any growth that’s described as “exponential” is almost always either a flash in the pan, or plain old hype.

It is therefore a mark of Gordon Moore’s insight, importance, innovation, intellect and influence that when he predicted, almost 60 years ago, in what was published as a brief piece in a popular magazine, that transistor counts would grow in the way he described…

…his words were hailed as a Law, though in truth it was as much a case of The Moore Effect – a challenge as much as a calculation; a proposal as much as a prediction; an exhortation as much as an estimate.

Gordon Earle Moore, RIP.


Picture of Gordon Moore in featured image from a memorial collection provided courtesy of Intel Corporation.

