Last year there wasn’t a single fatal airline accident in the developed world. So why is the U.S. health care system still accidentally killing hundreds of thousands? The answer is a lack of transparency.
If the airline industry and its regulators had clung to the same attitude, the average rate of airline fatalities would likely be little better than it was in the 1950s, when flying was at least three times as dangerous, on average, as it is today. It’s only human nature to call average good enough, particularly when what you are doing is difficult. Moreover, when people are engaged in inherently dangerous activities that they believe bring great benefit to society—whether it is serving their country in combat, or moving passengers at 600 miles an hour in and out of the wild blue yonder—it’s understandable that they tend to overlook or dismiss any avoidable harm caused by their actions. Dr. Thomas Lee, an associate editor at the New England Journal of Medicine and a professor at the Harvard School of Public Health, notes how this same process of moral disengagement affects doctors and hospital administrators. They are reluctant to acknowledge patient harm, he says, because they’re too busy highlighting the diseases cured and lives saved.
To overcome this natural tendency toward moral disengagement—or what safety experts in other fields call “normalized deviance”—we need in health care what the airline and many other industries already have: a process for systematically recording specific errors and near misses and for making them widely known so that everyone can learn from them. Dr. Peter Pronovost, the safety expert from Johns Hopkins, recommends creating a similarly robust, nationwide system for spotting, measuring, and reporting instances or harbingers of harmful care, with spot audits of medical records to assure compliance. This was also a recommendation of the ground-breaking 1999 “To Err Is Human” report. Following the example of the aviation industry (and of the VA health system, incidentally), this system should also include a process that allows people who witness or commit errors and near misses to report them anonymously.
Public reporting will be bolstered, to a limited degree, under the fine print of Obama’s Affordable Care Act. The new law says that certain injuries and infections that take place in hospitals will be published on Medicare’s Hospital Compare website. Hospitals will also be rewarded or penalized according to how certain readmission rates and hospital-acquired injuries compare to national averages. (As this story was going to press, the Centers for Medicare and Medicaid Services were formulating regulations that go further than any previous efforts, using both carrots and sticks to get hospitals to make care safer.) But here again, the mindset is not zero tolerance of error, but merely a focus on how different hospitals compare to the mediocre safety performance that pervades the industry. Moreover, the new law applies only to acute care hospitals, leaving out nursing homes and other long-term care facilities. It will cover only harm to Medicare patients, a subset of the overall population. And the system will not be able to capture some of the most common types of injuries to patients, such as those caused by medication errors.
The provisions of the Affordable Care Act are a step in the right direction, but they don’t go far enough. Implementing and operating a nationwide system that captures all harm to patients also requires that the U.S. health care system at last move out of the nineteenth century and replace paper records with open-source, truly integrated information technology of the kind the VA has pioneered. Electronic medical records, if they are written in compatible, open-source computer languages, have the potential to form vast databases that researchers, regulators, and practitioners themselves can easily mine to spot dangerous or ineffective practice patterns. Unfortunately, though many health care providers are busy installing health IT using federal stimulus dollars, most are installing proprietary software that will leave data locked in “black boxes” and that has limited value in promoting transparency. (For more information on this subject, see Phillip Longman, “Code Red,” July/August 2009.)
Done right, a fully digitized and integrated medical record system would also by itself prevent many serious errors, such as the thousands that occur every year when pharmacists misread a doctor’s scribbled prescription. Lest you think such matters are no big deal, the Institute of Medicine estimates that the average hospital patient in the U.S. is subject to at least one medication error per day (wrong med, wrong dose, wrong time, wrong patient), and that the financial cost of treating the harm done by these errors conservatively comes to $3.5 billion a year. An integrated digital records system would also make it much easier to monitor and curb the overuse of treatments that are both costly and dangerous. For example, Americans are exposed to so many CT scans, many of them redundant, that, according to the New England Journal of Medicine, the resulting radiation exposure may be responsible for as much as 2 percent of all cancer deaths in the country.
With such a robust, data-driven system of safety promotion at last brought to bear in health care, average performance will no longer seem good enough. Health care providers, employers choosing health care for their workers, and patients seeking the best care will all demand more. The benchmark for any given hospital to meet would thus become what it should have been all along: the refusal to tolerate even one case of preventable harm to a patient. Without such demonstrable standards of performance, there is little hope that the quality of health care can improve—whether the system is “socialized,” “market driven,” or any combination thereof.
Some doctors and hospital administrators will object on principle. When O’Connell, aka “The Numerator,” asked his surgeon about the moral implications of billing patients for treatments made necessary by sloppy medical practice, the response he reports receiving was disheartening: “We’re like lawyers,” O’Connell recalls the surgeon saying. “We just provide services by the hour and sometimes it works and sometimes it doesn’t.”
Other medical providers live by a higher standard than this, yet many will still raise all kinds of methodological objections. They will say that their patients tend to be much sicker or older than those treated in other hospitals. Or that the reason their hospital has such high infection rates is that many of their patients come from nursing homes, where lethal bacteria are rampant. (In the case of our investigation, I always pointed out that we were reporting the infections that their own employees had marked as not present at the time the patient arrived, meaning they were acquired in the hospital itself.) And to be sure, certain risk adjustments do need to be made in comparing the performance of one hospital with another.