The answer to America’s techno-malaise is to force big corporations to compete more. And to open their patent vaults.
Under Arnold’s leadership, in 1941 the DOJ and the Federal Trade Commission (FTC) began to apply a variant of this policy. The government’s general approach was to start by bringing an antitrust suit against a firm that had captured undue control of some sector of the economy. It would then accept a settlement (in the form of a consent decree) by which the corporation promised to share its basic technologies with all comers, for free.
Until the Ronald Reagan administration killed the policy, the U.S. government applied this model to most of the nation's technologically dominant corporations. In the process it forced the people who controlled these companies to spill perhaps upward of 100,000 technological “source codes” into the world. A 1961 study counted 107 such judgments between 1941 and 1959 alone, resulting in the compulsory licensing of 40,000 to 50,000 patents.
One result was the greatest dissemination of industrial knowledge in human history. The world was treated to the secrets behind the televisions of RCA, the light bulbs of General Electric, the cellophane and nylon of DuPont, the titanium of National Lead, and the shoemaking technologies of United Shoe Machinery, among many others.
Another result was a new balance of power in the political economy of technology. By using antitrust law to trump patent law, Arnold and his team largely solved the traditional dilemmas of patent law. The big companies were less free to use patents to protect their bastions. The small firms, precisely because their size exempted them from antitrust oversight, could still fully exploit patent monopoly. Without breaking a single big industrial company, Arnold and his team helped foster a world in which engineers and scientists—no matter how small the company they worked for—could go about their work safe from predation, albeit not competition.
To get a sense of how Arnold’s team liberated the ingenuity of America’s citizens, consider what took place after the DOJ brought an antitrust suit against AT&T in 1949. By the late 1940s, AT&T had become notorious for its failure to integrate the most recent ideas of its subsidiary, Bell Labs, into the telephone system it controlled. The FTC, for instance, had cited the monopoly for sitting on such ready-for-market innovations as automatic dialing, office switchboards, and new handsets.
Even before settling the case, AT&T began licensing out key patents it controlled. One was for a then obscure device: the transistor. At the time, transistors were seen merely as a potential competitor to existing vacuum tube technology, and AT&T wasn’t much interested in disrupting its existing business lines by developing them. In 1952, AT&T licensed the technology for a small fee to thirty-five companies, twenty-five from the United States and ten from abroad.
Today, of course, transistors are the bedrock of all computer technology. The path to practical application was blazed not by AT&T or any other big firm; as business historian David Mowery has written, “the more aggressive pioneers in the application of the new transistor technology were smaller firms that had not produced vacuum tubes.” One of the smallest, Texas Instruments, introduced the first commercially viable transistor in 1954, just three years after its founding. Other early drivers were Motorola and Fairchild.
Consider, also, what happened inside the big, science-based industrial corporations after they were forced to compete with the fruits of their own scientists’ labors. In his close study of DuPont, business historian David Hounshell writes that “a particularly virulent attack” by the DOJ in the 1940s led executives to conclude that DuPont’s “generation-old strategy of growth through acquisition was no longer politically feasible,” and, further, “that the corporation’s growth would have to be based almost exclusively on the fruits of research.” Pointing to DuPont’s subsequent investments in R&D, Hounshell concluded that Arnold’s policy, although not necessarily best for DuPont’s short-term profits, “was good for the scientific community” at large.
We see much the same pattern in copier technology. Here the key action was a 1975 consent decree between the FTC and Xerox. In 1972, Xerox had been able to use patents to block Litton and IBM from entering the plain paper copier market. But the new agreement opened the market to new competitors and spurred Xerox to redouble its own development efforts. “The transition period” after the consent decree, Stanford economist Timothy Bresnahan has written, “saw a great deal of innovative activity from entrants and Xerox.” Faced with new competitors on all sides, he adds, “Xerox introduced new products in all segments.”
We also see this pattern in the software industry. In January 1969 the DOJ filed suit against IBM, charging the giant with retarding the growth of data-processing companies. In direct response to the suit, IBM decided to “unbundle” its hardware, software, and services. As then CEO Thomas Watson Jr. wrote, to “mollify” the Justice Department IBM abandoned its old marketing practice, by which it would “lump everything together in a single price—hardware, software, engineering help, maintenance, and even training sessions.”
One result, as Alfred Chandler observed, was to open up a market for “companies [including the Computer Sciences Corporation and Applied Data Research] hoping to sell independent software applications.” The other was to spur IBM to new and greater feats of science and engineering. In the years after the suit, Watson writes, IBM “prospered—which made the antitrust laws easier for me to accept.”
Now consider, in contrast, what happened within the walls of the giant science-based industrial corporations after the Reagan administration abandoned most of Arnold’s approach to regulating competition. We see a sudden collapse of investment by giant firms left free to govern entire realms of technology as they alone saw fit.
In the 1980s and ’90s, General Electric was run by Jack Welch, widely recognized as one of the brightest CEOs of the time. Almost as soon as the Reagan administration overturned Arnold’s antitrust regime, Welch embarked on what he called his “No. 1 and No. 2 strategy.” First was a campaign of buying up and selling off business units in order to insulate GE from competition in every industrial sector in which it operated. Second was a shift from a reliance on R&D to drive profitability to a reliance on exploiting Welch’s newly forged corporate power. The bottom line? In 1981, GE was the fourth-biggest U.S. industrial firm and one of the top spenders on research. By 1993, GE had fallen to seventeenth in spending on R&D but had become the most profitable big company in America.
For a more recent example, there’s Pfizer. Here the buying binge did not begin until 1999, but once it started executives pursued it with abandon. Over the next ten years they grabbed Warner-Lambert, Pharmacia, and many smaller companies. The culmination came in 2009 when they seized Wyeth. The executives cut 19,000 jobs. They also cut R&D by a phenomenal 40 percent, from $11.3 billion at the two companies to about $6.5 billion. The former president of Pfizer Global Research, John LaMattina, summed up the results in Nature. “Although mergers and acquisitions in the pharmaceutical industry might have had a reasonable short-term business rationale,” LaMattina wrote, “their impact on the R&D of the organizations involved has been devastating.”