
July/August 2012: The Hole in the Bucket

For the last forty years, Americans have obsessed over personal finance as never before. So how come so many of us wound up broke? Here's the little-known story.

By Phillip Longman

Nearly half of all workers lacked access to 401(k)s, and only about a third of those whose employers offered the plans participated. This often meant, of course, that workers forfeited the free money offered through employer matching contributions. In addition, the workers who did join were often clueless about how to manage their portfolios, variously parking their nest eggs in money-market accounts that didn’t keep up with inflation or speculating in tech stocks.
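
To make that “free money” concrete, here is a minimal sketch of how a typical employer match works; the terms below (50 cents per dollar on the first 6 percent of pay, a $50,000 salary) are illustrative assumptions, not figures from the article:

```python
# Hypothetical illustration of a common employer match: 50 cents per
# dollar on the first 6% of pay. These terms and the $50,000 salary
# are assumptions for illustration, not figures from the article.

def annual_match(salary, employee_rate, match_rate=0.5, match_cap=0.06):
    """Employer match: match_rate on the employee's own contributions,
    counted only up to match_cap of salary."""
    matched_portion = min(employee_rate, match_cap)
    return salary * matched_portion * match_rate

salary = 50_000
print(annual_match(salary, 0.06))  # enrolls at 6% -> 1500.0 in free money
print(annual_match(salary, 0.00))  # never enrolls -> 0.0, match forfeited
```

Under those assumed terms, a worker who never enrolls walks away from $1,500 a year before any investment growth at all.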

In line with human nature, many also simply failed to save enough. In the years when the stock market was routinely returning 15 to 20 percent, it was easy to believe the hype and conclude that only a small amount of savings would be required to meet your retirement goal. Still others cashed out their 401(k)s every time they changed jobs or faced a short-term financial emergency. Others saw their retirement savings cut in half or worse by divorce settlements.
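
The hype was seductive because of how compounding arithmetic works. A minimal sketch, assuming an illustrative $1 million target and 30 years of monthly saving (neither figure is from the article), shows how drastically the required contribution shrinks if you believe 15 percent returns will last:

```python
# A sketch of the required-saving arithmetic. The $1 million target and
# 30-year horizon are illustrative assumptions, not the article's numbers.

def monthly_saving_needed(target, annual_return, years):
    """Monthly contribution needed to reach `target`, from the future
    value of an ordinary annuity: FV = P * ((1 + r)**n - 1) / r."""
    r = annual_return / 12   # monthly rate
    n = years * 12           # number of monthly contributions
    return target * r / ((1 + r) ** n - 1)

for rate in (0.15, 0.07):
    p = monthly_saving_needed(1_000_000, rate, 30)
    print(f"at {rate:.0%} per year: ${p:,.0f} per month")
# at 15% per year: $144 per month
# at 7% per year: $820 per month
```

At a sustained 15 percent, roughly $144 a month looks sufficient; at a more sober 7 percent, the same goal requires about $820 a month, more than five times as much.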

Meanwhile, the percentage of the population that took advantage of Individual Retirement Accounts was even lower. Partly this was due to inertia or lack of sophistication. But it was also because people with little or no income could gain little or no tax relief by investing in an IRA or other defined contribution plan. As the system came to operate, the federal government spent billions of dollars on tax subsidies for rich people who sheltered their income in retirement accounts, while offering virtually nothing to those who most needed to save. The whole system became, in effect, pure “Robin Hood in Reverse,” which of course didn’t much bother Republicans, and many Democrats went along as well.

Another big problem, easy to see in hindsight, was that while we were telling ordinary Americans to trust their savings to the sharks on Wall Street, we were also telling the sharks that there were no more rules. By the end of the 2000s, millions of humble Americans had invested their savings in mortgage-backed securities, believing the assurances of their broker, or of some raving head on cable, that these were safe. Many who bought even garden-variety mutual funds saw their returns eroded by hidden fees.
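
How much a “hidden” fee costs is easy to underestimate, because the drag compounds. A hypothetical sketch, assuming a 7 percent gross return and a 1 percent annual fee (both numbers are illustrative, not from the article):

```python
# Hypothetical sketch of fee drag: $10,000 compounding for 30 years at
# a 7% gross return, with and without a 1% annual fee. All numbers are
# assumptions for illustration, not figures from the article.

def final_balance(principal, gross_return, annual_fee, years):
    """Compound yearly, with the fee deducted from each year's return."""
    balance = principal
    for _ in range(years):
        balance *= 1 + gross_return - annual_fee
    return balance

no_fee = final_balance(10_000, 0.07, 0.00, 30)
fee = final_balance(10_000, 0.07, 0.01, 30)
print(f"without fee: ${no_fee:,.0f}")                    # ~ $76,123
print(f"with 1% fee: ${fee:,.0f}")                       # ~ $57,435
print(f"share lost to the fee: {1 - fee / no_fee:.0%}")  # ~ 25%
```

Over 30 years, that single percentage point consumes roughly a quarter of the final balance.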

In the meantime, financial markets devolved to the point where there were no longer many safe ways to invest that would keep up with inflation, so people took on more and more risk as they chased higher yields, often having no idea what they were doing. Many turned to investing in their houses or buying bigger ones. Many also, in effect, wound up using credit card debt to finance their stock market and real estate speculations—as when, for example, people put money into mutual funds or an upgraded kitchen and then found they needed to use their credit cards to cover routine household expenses, like gas, groceries, or the rising cost of health insurance.

This brings us to the other side of the deteriorating balance sheet of most Americans, one that was just as consequential in bringing us to where we are today: debt. It’s bad enough that American society blew the plan that was supposed to get Baby Boomers and younger Americans saving much more for their retirement; at the same time, we exposed that same population to an epidemic of predatory lending.

It’s almost impossible to exaggerate the drama of this story. To put it in some historical perspective, for as long as there has been credit flowing in human history—going back at least as far as the Code of Hammurabi, circa 1750 BC—there have been laws to prevent usury. The Old Testament tells of the Prophet Ezekiel, who included usury in a list of “abominable things,” along with rape, murder, robbery, and idolatry. Roman law capped interest rates at 12 percent. According to the Qur’an, “Those who charge usury are in the same position as those controlled by the devil’s influence.” Dante condemned usurers to the seventh circle of hell, along with blasphemers and sodomites. Martin Luther argued that any interest rate above 8 percent was generally immoral, and the Puritans who settled the Massachusetts Bay Colony agreed, adopting America’s first usury law 150 years before the ratification of the Constitution.

Most of America’s Founding Fathers thought the Puritans right to do so. Notes law professor Christopher L. Peterson, “Throughout the history of the American Republic, all but a small minority of states have capped interest rates on loans to consumers with usury law.” In the Progressive Era, reformers pushed a Uniform Small Loan Law that capped interest rates at 36 percent and limited such lending to specially licensed lenders adhering to strict standards. As late as 1979, all states had laws of some sort that capped interest rates.

This short history of usury laws puts into perspective just how bizarre the credit markets of the United States have become over the last forty years. Usury law is, in the words of one financial historian, “the oldest continuous form of commercial regulation,” dating back to the earliest recorded civilizations. Yet starting in the late 1970s, some powerful people decided we could live without it.

First to go were state usury laws governing credit cards. Before 1978, thirty-seven states had usury laws that capped fees and interest rates on credit cards, usually at less than 18 percent. But in 1978 the Supreme Court, in its fateful Marquette National Bank v. First of Omaha Service Corp. decision, ruled that the usury cap that applied to a credit card loan was the one in the state where the bank had its corporate headquarters, not the one in the state where the customer actually lived. Banks quickly set up their corporate headquarters in states that had no usury laws, like South Dakota and Delaware, and thus were completely free to charge whatever interest rates and fees they wanted. Meanwhile, states eager to hold on to the banks headquartered within their borders promptly eliminated their usury laws as well.

Later, in 1996, the Supreme Court handed usurers another stunning victory. In Smiley v. Citibank it ruled that credit card fees, too, would be regulated by the banks’ home states. You might think that market forces would set some limits on how high credit card fees and interest can go—after all, there are only so many creditworthy borrowers, and much competition for their business. But with shrewd use of “securitized” debt instruments and hidden fees, banks and other lenders found they could make more money from those who could not afford credit cards than from those who could.

And this was only the beginning. By the early 2000s, thanks to the combination of deregulation and “financial engineering” on Wall Street, middle- and lower-class neighborhoods across America were being flooded with what could be called financial crack. Between 2000 and 2003 alone, the number of payday lenders more than doubled, to over 20,000. Nationwide, the number of payday lending storefronts rivaled that of Starbucks and McDonald’s locations combined.

Phillip Longman is a senior editor at the Washington Monthly and a lecturer at Johns Hopkins University, where he teaches health care policy. He is also a senior fellow at the New America Foundation, where Atul Gawande is a board member.

Comments

  • Darsan54 on July 11, 2012 12:28 PM:

    I walk away with the feeling it's all hopeless.

  • DAFlinchum on July 11, 2012 1:50 PM:

    When 401(k)s first came out, I thought they were a Trojan horse: they would lull workers into thinking the accounts would be an addition to their defined benefit pension plans, while at the same time signaling to employers/companies that 401(k)s would be a fine substitute for defined benefit pension plans. As it happens, I was correct.

    First a few companies made the switch, then more, and more still, as the companies that tried to do best by their employees discovered it was hard to compete with those that went defined contribution. Employees who had been happy to have their companies kick in a small amount of matching funds soon found that this would be their company's only contribution in the future. Long-term employees saw their defined benefit pensions frozen; new employees got a 401(k) only.

    I now predict that the same thing will happen with the ACA and company-provided health care. First, the employee portions of fees, copays, premiums, etc., will start rising until employees' costs are nearly as much as if they went to the exchanges. Next, companies will start off-loading employees onto the exchanges and paying the cheaper penalty. The employees will see little difference, since their costs will have risen so much, and the federal government will shoulder more and more of the cost as it subsidizes families on the exchanges. Where will this money come from? Higher taxes on 'the rich'? Don't bet the farm.

    It is going to prove a boon to the biz interests, from insurance companies on down, as companies shift onto the government and community at large a cost of doing business that used to be theirs, namely health care benefits for employees, and pocket the extra profits.

    The ACA will not be so 'affordable' after all; it will instead be a monument to the law of unintended consequences.

  • Daniel on July 11, 2012 9:55 PM:

    What starts with an F, ends with a K, and means get screwed?

  • BenefitJack on July 12, 2012 6:02 PM:

    The author states: "Most Americans approaching retirement age do not have a 401(k) or other retirement account. Among the minority who do, the median balance in 2009 was just $69,127."

    Two issues with how that is positioned in the article:

    First, he is absolutely correct that the LACK OF a retirement savings plan is one of the main reasons why workers will not be prepared for retirement. The four top reasons are:
    (1) Most individuals do not have access to a plan at work,
    (2) Most individuals who do have access to a plan at work fail to enroll when first eligible,
    (3) When individuals do get around to enrolling, they typically contribute too little; many fail to contribute even enough to capture the entire company matching contribution, and finally,
    (4) The typical American changes employers 5, 6, 7 times in a working career, and too frequently, she takes a distribution of any such savings and uses them to pay accumulated debts (or to splurge).

    Second, the $69,127 number is misleading, positioned as it is here so close to the statement "as they approach retirement age." One Employee Benefits Research Institute study of millions of participants found that the average account balance in 401(k) plans as of December 31, 2010, was ~$60,000; however, workers who had 30 years of service with the same employer and were over age 60 had an average account balance of ~$202,000.

    The implicit suggestion that many workers, or even a majority of workers, once had access to a well-funded, non-contributory, final-average-pay defined benefit pension plan, perhaps with an automatic post-retirement COLA, is just fantasy.

    In terms of a precursor for how employers will respond to health reform, look no further than what has happened to employer-sponsored coverage over the last ten years, or, what happened to retiree health care coverage over the past 30 years.

  • Rugosa on July 12, 2012 7:00 PM:

    Whoa - you got me at "had all Americans been required to save for retirement." Did you not read what you wrote a few paragraphs up: "By the end of the 1970s, the wages of most young people paying into the system were stagnating"?

    How can you require us to save for retirement when our incomes increasingly can't keep up with the cost of living?

  • Lex on July 13, 2012 9:47 AM:

    Interesting, though, isn't it, how just about all Fortune 500 CEOs have defined-benefit pension plans AS WELL AS other retirement savings. Not to mention golden parachutes: The guy who was pushed out as CEO of Duke Energy a few days ago after serving in the position all of about 20 minutes (literally) gets an exit package valued at north of $40 million, financed, in all likelihood, by stockholders, ratepayers or both. That's just farking nuts and ought to be illegal.

  • Bill Bush on July 13, 2012 3:58 PM:

    Submit to your Galtian Overlords! Die immediately after retirement, leaving your entire estate to pay off the federal deficit! Kill the wounded! Cull the hospitals with a cost analysis tool favoring soonest future profitability!

    Or abandon the dog-eat-dog mentality and the current vicious variant of capitalism. Try doing things humanely, responsibly and kindly. Unless you think the first paragraph sounds like fun.

  • leo from chicago on July 13, 2012 5:47 PM:

    "How can you require us to save for retirement when our incomes increasingly can't keep up with the cost of living?"

    Exactly. I love how they tell you you can save up to such-and-such a percentage (tax free!) in 401(k) accounts and IRAs -- when you're living paycheck to paycheck.

  • brian t. raven on July 15, 2012 4:05 PM:

    1. We could close the door to immigrants.
    2. We could impose high tariffs on imports.
    3. We could make it easier for workers to unionize.
    4. We could even raise the income tax rate on the top 1 percent to 100 percent.

    It's a very good and informative article except for the four recommendations above. From the perspective of basic economics, they are fundamentally flawed. The Monthly needs to run these kinds of articles past a venerable economist before publishing. Even if the recommendations were merely musings, they still should have been excised, because they diminish the argument by undermining the author's credibility.

  • skeptonomist on July 20, 2012 8:03 PM:

    Generally good piece explaining the role of the finance industry in shrinking the assets of the 99%. The housing bubble was directly responsible for most of the loss since 2002, and it probably would not have been possible without credit-default swaps, which gave a false sense of security to all the bad mortgage debt. There is little sign that the dangers of derivatives have been restrained.