
From "Chip War"

Author: Chris Miller
Publisher: Simon & Schuster
Year: 2022
Category: Business & Economics

Chapter 2: Part II: THE CIRCUITRY OF THE AMERICAN WORLD
Key Insight 5 from this chapter

Intel's Revolution and the Dawn of the Digital Era

In 1968, Robert Noyce and Gordon Moore left Fairchild Semiconductor to found Intel (short for Integrated Electronics), driven by dissatisfaction with Fairchild's handling of stock options and by corporate interference. Their vision was to make transistors the cheapest product in history, consumed by the trillions worldwide, empowering humanity while making it fundamentally dependent on semiconductors. Intel's first product, launched two years later, was the dynamic random access memory (DRAM) chip. DRAM replaced magnetic cores, matrices of tiny metal rings strung with wires, which had become too complex to hand-assemble as demand for computer memory exploded. IBM engineer Robert Dennard had conceived the DRAM design: a transistor paired with a capacitor stores 1s and 0s, and because the capacitor leaks, the transistor must repeatedly recharge it to preserve the data.
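To make Dennard's mechanism concrete, here is a minimal sketch in Python (my illustration, not from the book) of a single DRAM cell: charge on a capacitor encodes the bit, that charge leaks over time, and a periodic refresh rewrites the value before it decays below the read threshold. All constants are invented for illustration; real DRAM refreshes on the order of every 64 milliseconds.

```python
# Toy model of one DRAM cell: a leaky capacitor plus periodic refresh.
# All constants are illustrative, not real device parameters.

LEAK_PER_TICK = 0.05     # fraction of stored charge lost each time step
READ_THRESHOLD = 0.5     # charge above this level reads back as a 1
REFRESH_INTERVAL = 8     # time steps between refresh cycles

class DramCell:
    def __init__(self):
        self.charge = 0.0

    def write(self, bit):
        # Charge the capacitor for a 1, drain it for a 0.
        self.charge = 1.0 if bit else 0.0

    def read(self):
        return 1 if self.charge > READ_THRESHOLD else 0

    def tick(self):
        # The capacitor leaks a little of its charge every time step.
        self.charge *= 1.0 - LEAK_PER_TICK

    def refresh(self):
        # Read the decayed value and write it back at full strength.
        self.write(self.read())

cell = DramCell()
cell.write(1)
for t in range(1, 41):
    cell.tick()
    if t % REFRESH_INTERVAL == 0:
        cell.refresh()       # without this, the 1 would leak away
print(cell.read())           # prints 1; skip the refreshes and it prints 0
```

The refresh loop is why the memory is called 'dynamic': the stored bit survives only because the transistor keeps rewriting it.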

DRAM chips, unlike magnetic cores, were carved into silicon, eliminating manual weaving, reducing malfunctions, and allowing far smaller sizes. Intel initially aimed to dominate the memory chip market because memory chips were standardized, mass-producible products rather than specialized designs, which permitted economies of scale. However, a 1969 request from the Japanese calculator firm Busicom presented a new challenge. Ted Hoff, an Intel engineer with a background in computer architecture, realized that instead of designing twelve specialized chips with 24,000 transistors for the calculator, Intel could pair a single standardized logic chip with powerful memory and program it with different software to compute different tasks. Intel's advances in memory chips were exactly what made this insight workable.
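Hoff's idea can be sketched in software terms (a toy illustration in Python, not the 4004's actual design): instead of a dedicated circuit per function, one fixed engine executes whatever program is loaded from memory, so the same hardware can serve many different products.

```python
# One general-purpose engine whose behavior comes from a stored program,
# in contrast to building a separate specialized chip for each task.

def run(program, x):
    """A tiny fixed 'processor': it knows only a few primitive operations,
    but any sequence of them can be loaded as software."""
    for op, arg in program:
        if op == "add":
            x += arg
        elif op == "mul":
            x *= arg
    return x

# Two different 'products' built from the same engine by swapping programs.
halve = [("mul", 0.5)]                       # hypothetical program 1
shift_and_scale = [("add", 10), ("mul", 2)]  # hypothetical program 2

print(run(halve, 100))            # 50.0
print(run(shift_and_scale, 100))  # 220
```

The economics follow directly: the engine is identical in every device, so it can be mass-produced like a memory chip, while the task-specific work shifts into software.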

In 1971, Intel launched the 4004, advertising it as the world's first microprocessor, a 'micro-programmable computer on a chip.' This generalized logic chip could be used in many devices, initiating a computing revolution in which general-purpose logic became mass-producible. Carver Mead, a Caltech professor and consultant for Intel, recognized the profound implications, coining the term 'Moore's Law' to describe the exponential growth in transistor density. In 1972, Mead predicted that 'every facet of our society will be automated to some degree' within a decade, envisioning 'a tiny computer deep down inside of our telephone, or our washing machine, or our car.' He calculated a '1,000,000 to 10,000,000' times increase in data processing and retrieval rates over the preceding twenty years, proclaiming an era of 'computer power coming out of our ears.' Intel's leaders, including Gordon Moore, saw themselves as the true 'revolutionaries' of the 1970s, ushering in a digital world where influence would accrue to those who could produce and manipulate computing power with software.
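As a quick sanity check on Mead's figure (my arithmetic, not the book's): a millionfold gain over twenty years implies roughly one doubling per year, since 2^20 ≈ 1,000,000.

```python
# Back-of-envelope check: how many doublings give a 1,000,000x to
# 10,000,000x gain, and how often must they occur over 20 years?
import math

for gain in (1_000_000, 10_000_000):
    doublings = math.log2(gain)          # ~19.9 and ~23.3
    print(f"{gain:>10,}x gain = {doublings:.1f} doublings, "
          f"one every {20 / doublings:.2f} years")
```

That cadence, a doubling roughly every year in those early decades, is the exponential Mead was naming when he coined 'Moore's Law.'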
