21 October 2022

How Silicon Valley Lost the Chips Race: Money Alone Won’t Revive the U.S. Semiconductor Industry

Chris Miller

Thanks to the CHIPS and Science Act, signed into law in August 2022, the U.S. government has $52 billion in new funding to try to reinvigorate the country’s semiconductor industry. Although semiconductors were invented in the United States and their design and manufacture still generally require U.S. software and tools, most chips are now manufactured elsewhere, largely in East Asia. But in the face of escalating U.S.-China competition—and with Washington rolling out sweeping new restrictions on China’s access to advanced computing technologies—improving the U.S. position in chip-making has become a national security priority. That the most advanced processors can be fabricated only outside the United States, mainly in Taiwan, adds to the sense of risk.

Rebuilding the U.S. role in manufacturing will be expensive, as the CHIPS and Science Act recognizes. TSMC, Samsung, and Intel—the three biggest companies producing processor chips—are all likely to receive funds for new semiconductor manufacturing facilities in the United States. But an influx of new money alone can’t solve the core problem. A cultural change is needed, too, in Silicon Valley and in Washington, to prioritize the challenges faced by firms in advanced manufacturing, including chip makers and their key suppliers.

Silicon Valley has strayed too far from its manufacturing roots, focusing on apps and the Internet, while policymakers in Washington are fixated on consumer-focused Big Tech firms rather than on the hardware on which all computing depends. As the U.S. government tries to revitalize the semiconductor industry, it will succeed only if it learns lessons from Silicon Valley’s early days. This isn’t the first time U.S. chip firms have faced intense foreign competition amid fears that they were falling behind. In the 1980s, leading U.S. semiconductor manufacturers such as Intel stood on the brink of bankruptcy. Intel was rescued by Andrew Grove, who was driven by the realization that advanced technology depends not only on creativity and innovation but also on ultra-efficient precision manufacturing. To help the U.S. chip industry, policymakers in Washington need to start by adjusting their definition of “tech” to encompass advanced manufacturing, too.

HOW DID WE GET HERE?

The United States’ long decline in the production of processor chips has a complex set of causes. Multiple factors have driven up the cost of chip-making in the United States relative to other countries. Environmental regulations governing the toxic chemicals involved in chip-making have grown stricter in the United States. U.S. labor costs are higher than those in parts of East Asia, although labor represents a smaller share of the cost of fabricating semiconductors than it does of many other types of manufacturing.

Most important, however, is that other governments have offered substantial tax incentives for chip-making that the United States, until recently, has failed to match. China’s surge of subsidies available through its Made in China 2025 program and other government initiatives represents the latest step in a semiconductor subsidy arms race that has nothing to do with market competition.

Meanwhile, the chip industry has undergone relentless consolidation. Several decades ago, two dozen companies could fabricate advanced processor chips, but today only three firms produce the most advanced processors: Taiwan’s TSMC, South Korea’s Samsung, and the United States’ Intel. Each company keeps most of its production in its home country. For that reason, the fate of the United States’ domestic chip-making capabilities depends in no small part on the trajectory of a single company: Intel.

Grove, the Budapest-born refugee who led the company for several decades, saw manufacturing as the core of Intel’s identity. After becoming the firm’s president in 1979, he pulled it back from the brink of bankruptcy amid an onslaught of Japanese competition. Within a decade, Intel’s processor chips were inside more than half of all the computers that had ever been built. Since then, the company has earned over $250 billion in profit.

When Grove first became president of Intel, its primary business was selling memory chips used mostly in corporate mainframe computers. But Japanese firms had entered the sector in the mid-1970s, learning to fabricate chips that were less expensive to produce and had fewer defects than chips made by U.S. peers. Watching this, Grove knew the firm had to get out of commoditized memory chips and refocus on higher-value products such as the advanced microprocessors that IBM was putting in a new device it called “the personal computer.”

Exiting the memory-chip market felt impossible to many at Intel, like Ford deciding to stop making cars. But Grove eventually mustered the courage to jettison the memory-chip business, laying off a quarter of Intel’s workforce and shuttering multiple facilities. Alongside this restructuring, Grove adopted a second strategy: ruthlessly improve manufacturing quality. Grove described his philosophy in a bestselling book, Only the Paranoid Survive: “Fear of competition, fear of bankruptcy, fear of being wrong and fear of losing can all be powerful motivators.” After a long day of work, it was fear that kept Grove flipping through his correspondence or on the phone with subordinates, worried he’d missed news of product delays or unhappy customers.

At Grove’s reinvigorated Intel in the late 1980s and 1990s, workdays started at 8 a.m. sharp. A freewheeling Silicon Valley culture was replaced by drill sergeant discipline. Grove launched a new policy called “copy exactly,” by which the company would determine the best manufacturing process and then teach its engineers to replicate it in all their facilities. Many chafed at being told to implement rather than to invent. Yet as each of the company’s plants began to function less like a research lab and more like a finely tuned machine, productivity rose and costs fell.

Grove’s hard-driving management, however, explains only part of Intel’s resurgence during the 1980s. Intel succeeded not only by optimizing manufacturing, although this was crucial, but also by intertwining leading-edge manufacturing with top-notch chip design. Intel called this the “tick-tock” method: each “tick”—a manufacturing process improvement—was coupled with a “tock,” a more efficient chip design. The close interaction between manufacturing, software, and system design kept Intel atop the PC processor business for three decades.

After Grove retired from Intel in 2005, the company began to drift. Because Intel’s core business of building chips for personal computers was so profitable for so long, the company’s culture of discipline began to slip, according to interviews I conducted with former employees. Years of profits dulled the sense of Groveian paranoia that had once permeated the company. Longtime employees noticed that, with each passing year, executives’ shirts got whiter as chemists and physicists lost influence to the finance department. A company that had been an icon of American technology slid into a decadelong decline. After Grove’s departure, the company failed to make big, bold, risky bets. Its chip-making capabilities, which had been the world’s most advanced, fell behind TSMC and South Korea’s Samsung, which are now able to produce chips with more precision than Intel can. The company missed key industry shifts, failing to foresee smartphones and the rise of artificial intelligence. “It had the technology, it had the people, it just didn’t want to take the margin hit,” one former finance executive at Intel told me.

LESS LIKING, MORE BUILDING

After he retired, Grove voiced concern that the United States’ advanced manufacturing capabilities were eroding. “Abandoning today’s ‘commodity’ manufacturing can lock you out of tomorrow’s emerging industry,” he noted in 2010, warning that Silicon Valley’s fixation on software at the expense of hardware was misguided.

Grove saw the electric battery industry as a case study in how losing manufacturing capability risked eroding innovative capacity. The United States “lost its lead in batteries 30 years ago when it stopped making consumer electronics devices,” Grove said in 2010. Back then, American companies had failed to innovate in manufacturing batteries for personal computers; now, they are far behind rivals, notably in South Korea and China, in producing batteries for electric vehicles. “I doubt they will ever catch up,” Grove predicted, with depressing accuracy.

But Grove’s warnings about the importance of advanced manufacturing in the broader “tech” ecosystem were ignored. Most of Silicon Valley wrote him off as representative of a bygone era. After all, he had built Intel before the Internet existed. Facebook, founded in 2004, soon became several times more valuable than Intel, despite manufacturing nothing and selling little besides advertisements. Intel could retort that the Internet’s data was processed on its chips. Yet producing chips was less profitable than selling ads on apps. Over time, however, the United States has lost the ability to manufacture the most advanced processor chips. Chips crucial to applications from smartphones to artificial intelligence in data centers can now be made only offshore.

After a supply chain shock and amid an intensifying rivalry with China, U.S. political and business leaders are beginning to grasp what is at stake. The passage of the CHIPS and Science Act shows that, for the first time in decades, Washington is willing to spend substantial sums of money to support chip makers. This is a crucial first step, but political leaders also need to improve the business environment for manufacturing. Construction permits, environmental rules, and tax policies are critical determinants of the viability of a manufacturing facility. Innovation alone won’t revitalize U.S. chip-making; manufacturing chips must be economically viable, too.
