
Thrown Out of Court
June/July/August 2014

How corporations became people you can't sue.

By Lina Khan

Two successive decisions accelerated what might have been a brief and quirky deviation into a major turning point. In 1984, the Supreme Court heard a case brought in California by 7-Eleven franchisees against their parent company, Southland, which had included in their contracts a binding arbitration clause. California outlawed these clauses, recognizing that the franchisees rarely had the power to negotiate these terms. Yet Southland boldly argued that its contract overrode the state’s law. Drawing on Brennan’s unusual interpretation from the previous year—that Congress had intended a “federal policy favoring arbitration”—a 7-2 majority on the Supreme Court ruled for Southland, eroding the power of states to regulate how companies use arbitration.

In a striking dissent, Justice Sandra Day O’Connor, a conservative, berated the majority for ignoring legislative history. “Today’s decision is unfaithful to congressional intent, unnecessary, and … inexplicable,” she wrote. “Although arbitration is a worthy alternative to litigation, today’s exercise in judicial revisionism goes too far.”

It would soon go farther. In 1985, the Supreme Court heard Mitsubishi v. Soler Chrysler-Plymouth, a case in which a car dealer had sued the Japanese manufacturer for violating antitrust laws, and Mitsubishi had pushed to arbitrate. Recalling the Federal Arbitration Act, the car dealer pointed out that companies could only use arbitration to settle contracts they had written, not interpret laws Congress had passed, like the Sherman Antitrust Act. Stunningly, a five-justice majority—riding its recent wave—sided with Mitsubishi. Arbitrators could now rule on actual law—civil rights, labor protections, as well as antitrust—with no accountability or obligation to the public.



Penning an impassioned dissent, Justice John Paul Stevens warned that there were great hazards in allowing “despotic decision-making,” as he called it, to rule on law like antitrust. “[Arbitration] is simply unacceptable when every error may have devastating consequences for important businesses in our national economy, and may undermine their ability to compete in world markets,” he wrote.

Three years, three decisions: the Supreme Court had drastically enlarged the scope of arbitration. The way the Court split didn't neatly map onto partisan ideology: liberal justices led the majority in two of the cases and dissented in others, while the conservative justices, who generally preferred to leave arbitration to the states, also jumped around.

Little evidence suggests that Brennan’s analysis followed congressional intent. “There was nothing in the legislative history that says Congress favored arbitration,” says Loyola Law School’s Margaret Moses. “The Supreme Court just stated it and then kept citing itself. It’s spurred a huge policy shift, with no basis in legislation.”

However baffling its reasoning, this drastic shift by the Court followed a decade during which the conservative legal movement had rapidly gained intellectual clout and political power. The infamous 1971 "Powell memo"—a call to arms to corporations written by then-corporate lawyer Lewis Powell, who would join the Supreme Court the following year—had galvanized the business community into organizing against liberal groups and consumer activists like Ralph Nader.

This Court's turn also accorded with a well-financed political campaign for "tort reform," a conservative cause backed by groups such as the Federalist Society and the Olin Foundation. George H. W. Bush campaigned on tort reform in 1991, while Vice President Dan Quayle headed up the Council on Competitiveness, which made eliminating class-action litigation against business a central aim. As one scholar of the movement put it, the prevailing belief at the time was that America suffered from "too much law, too many lawyers, courts that take on too much—and an excessive readiness to utilize all of them."

And true enough, by some measures litigation had increased. In 1962, for example, U.S. district courts conducted just under 6,000 civil trials; by 1981, they conducted more than 11,000. Public figures and the media tended to attribute all of this growth to “frivolous” lawsuits and zealous trial attorneys, but the rise also traced back to other factors, such as the civil rights wins of the 1960s, which meant that laws now protected a much larger segment of the population.

Nonetheless, it became received wisdom in many quarters that America had become an excessively litigious society. Over the 1990s, books like The Litigation Explosion: What Happened When America Unleashed the Lawsuit proliferated, shaping the climate in which the Court continued to restrict lawsuits and promote arbitration. In 1998 the Chamber of Commerce founded the Institute for Legal Reform, committed to reducing "excessive and frivolous" lawsuits. The Federalist Society convened discussions such as "Is Overlawyering Taking Over Democracy?"

Against the ongoing meme of superfluous litigation, the courts further expanded the realms in which companies could compel arbitration. In the 1995 case Allied Bruce, the Supreme Court approved the use of arbitration clauses by companies in routine consumer contracts. In 2001 the Court ruled against a group of Circuit City workers, holding that employers could use arbitration clauses in contracts with employees. In 2004 a court ruled that arbitration clauses were enforceable against illiterate consumers; another court ruled that they were enforceable even when a blind consumer had no knowledge of the agreement.

Yet the true watershed moment came in 2011, in the case of AT&T Mobility v. Concepcion. Vincent and Liza Concepcion had sued AT&T in California court, charging that the company had engaged in deceptive advertising by falsely claiming that their wireless plan included free cell phones—a practice that had shortchanged millions of consumers out of about $30 each. When they tried to litigate as a class, AT&T pointed to the fine print that prohibited consumers from banding together.

The Concepcions countered that these kinds of class-action bans violated California law as well as that of twenty other states. Moreover, scores of federal judges had forbidden this kind of class-action ban, on the grounds that people often had no practical way to make a claim unless they joined with other plaintiffs in sharing the cost. Allowing companies to wipe away this right in “take-it-or-leave-it” contracts for products like credit cards or phone service would effectively let corporations write themselves a free pass.

The district court and the Ninth Circuit Court of Appeals both supported the Concepcions, ruling that AT&T’s terms were “unconscionable,” a term of art historically used to describe contracts that so favored parties with superior bargaining power as to be unjust. When the case reached the Supreme Court, eight state attorneys general, as well as a legion of civil rights organizations, consumer advocates, employee rights groups, and noted law professors, also weighed in, arguing that allowing these kinds of class bans would enable companies to evade entire realms of law. But the Supreme Court, in a 5-4 split, blessed AT&T’s contract, opening the door for companies to ban class actions routinely in their fine print.

At this point, there was one slender thread of protection left: class-action bans still weren't enforceable if they eliminated the only way someone could bring a case. But in 2013, the Supreme Court gutted even this provision in a case pitting Italian Colors, a family restaurant in Oakland, California, against American Express. This time around, the same five-justice majority ruled that class-action bans in arbitration contracts were legal—even when they left citizens with no recourse at all.

Lina Khan is a reporter and policy analyst with the Markets, Enterprise and Resiliency Initiative at the New America Foundation.
