Highlights from Chip War by Chris Miller

Highlights from this book

  • It was popular to interpret the decline of GCA as an allegory about Japan’s rise and America’s fall. Some analysts saw evidence of a broader manufacturing decay that started in steel, then afflicted cars, and was now spreading to high-tech industries. In 1987, Nobel Prize-winning MIT economist Robert Solow, who pioneered the study of productivity and economic growth, argued that the chip industry suffered from an “unstable structure,” with employees job hopping between firms and companies declining to invest in their workers. Prominent economist Robert Reich lamented the “paper entrepreneurialism” in Silicon Valley, which he thought focused too much on the search for prestige and affluence rather than technical advances.

  • The USSR’s “copy it” strategy had actually benefitted the United States, guaranteeing the Soviets faced a continued technological lag. In 1985, the CIA conducted a study of Soviet microprocessors and found that the USSR produced replicas of Intel and Motorola chips like clockwork. They were always half a decade behind.

  • As Bill Perry watched the Persian Gulf War unfold, he knew laser-guided bombs were just one of dozens of military systems that had been revolutionized by integrated circuits, enabling better surveillance, communication, and computing power. The Persian Gulf War was the first major test of Perry’s “offset strategy,” which had been devised after the Vietnam War but never deployed in a sizeable battle. In the years after Vietnam, the U.S. military had talked about its new capabilities, but many people didn’t take them seriously. Military leaders like General William Westmoreland, who commanded American forces in Vietnam, promised that future battlefields would be automated. But the Vietnam War had gone disastrously despite America’s wide technological advantage over the North Vietnamese.

  • To many people in Silicon Valley, Sanders’s romantic attachment to fabs seemed as out of touch as his macho swagger. The new class of CEOs who took over America’s semiconductor firms in the 2000s and 2010s tended to speak the language of MBAs as well as PhDs, chatting casually about capex and margins with Wall Street analysts on quarterly earnings calls. By most measures this new generation of executive talent was far more professional than the chemists and physicists who’d built Silicon Valley. But they often seemed stale in comparison to the giants who preceded them. An era of wild wagers on impossible technologies was being superseded by something more organized, professionalized, and rationalized. Bet-the-house gambles were replaced by calculated risk management. It was hard to escape the sense that something was lost in the process.

  • Since his earliest days at Apple, Steve Jobs had thought deeply about the relationship between software and hardware. In 1980, when his hair nearly reached his shoulders and his mustache covered his upper lip, Jobs gave a lecture that asked, “What is software?” “The only thing I can think of,” he answered, “is software is something that is changing too rapidly, or you don’t exactly know what you want yet, or you didn’t have time to get it into hardware.”