Study questions impact of quality report cards
As hard-to-understand public reports proliferate, there is a danger that patients will experience information overload.
Twenty-one states have mandated hospital infection reporting in the last four years, and 221 health care quality report cards are listed on a Health and Human Services Web site.
Last month alone saw new quality reports released in New Jersey, Minnesota and the Seattle area.
The premise behind this wave of public reporting is that transparency will spur doctors and hospitals to improve quality and safety while giving patients valuable data to help them decide where to seek care. The concept has widespread acceptance, yet it is also largely untested and unproven.
Since 1986, 45 studies have examined the impact of public reports on quality and safety. But while such reports appear to stimulate quality activity in hospitals, there is little evidence to show they improve the effectiveness, safety or patient-centeredness of care. They also can have unintended consequences, such as discouraging doctors from treating sicker patients.
That is the verdict of a systematic review of those studies in the Jan. 15 Annals of Internal Medicine. The review echoes what many experts have said about the rush to publicly report doctors' and hospitals' performance on quality metrics -- the scientifically valid measures that are available can be confusing to the public and represent only a small slice of what happens in hospitals and medical practices.
"Report card advocates are going to develop back problems because they are patting themselves on the back a little too hard," said David Dranove, PhD, an expert on public reporting and professor of management and strategy at Northwestern University's Kellogg School of Management in Evanston, Ill. "We need to slow this down. Public reporting is the most important movement in health care, and we need to get it right."
The information now being reported is "insufficient," and researchers "haven't done enough to learn about how to present quality data to consumers so they can become intelligent users of the data," Dr. Dranove said. "We run the risk that people will learn to ignore report cards."
Even doctors who support public reporting say implementation of reports needs to be tested just as any other health care intervention should be.
"One of the interesting ironies is that a lot of health care quality work is based on the notion of evidence-based medicine," said David A. Asch, MD, a general internist and professor of health care management and economics at the University of Pennsylvania School of Medicine. "Systems like public reporting should be evaluated for their impact on quality. We shouldn't take these well-meaning attempts for granted, and we shouldn't assume that public reporting is good."
Evaluation of report cards lacking
For the Annals article, researchers from RAND Health in Santa Monica, Calif., and the Veterans Affairs Greater Los Angeles Healthcare System examined peer-reviewed studies evaluating the impact of public reporting on the quality of health plans, hospitals and physicians.
The study found that "evidence is scant, particularly about individual providers and practices" and that "rigorous evaluation of many major public reporting systems is lacking." The authors said "the effect of public reporting on effectiveness, safety and patient-centeredness remains uncertain."
Public reports' negligible impact is unsurprising because quality measurement is still in its infancy, said Bruce Bagley, MD, the American Academy of Family Physicians' medical director for quality improvement.
"There are very few situations where you have a wide variety of measures to fully evaluate the practice of a physician," he said. "If you're trying to judge the value that a particular family physician brings to your overall health and well-being, we're not even close to that."
Public reporting advocates argue that the premise is sound but that the execution has been subpar.
"We haven't done a very good job of the way we have designed public reports," said Judith H. Hibbard, PhD, a professor in the University of Oregon's Dept. of Planning, Public Policy & Management. "To say [public reporting] doesn't work is premature. Let's get it right first."
Wisconsin's Employer Health Care Alliance QualityCounts hospital performance report shows that the idea can work, Dr. Hibbard said. She wrote an editorial in Annals responding to the review and co-authored a 2003 Health Affairs study of the Wisconsin effort.
Hospitals that report publicly through QualityCounts engaged in more quality improvement activities in cardiac and obstetric care, the study found. The Wisconsin report differed from many others because it was publicized widely and designed carefully to be "highly evaluable" by consumers -- illustrating variations in care, ranking hospitals by performance and highlighting the best-performing hospitals, Dr. Hibbard said.
"Usually, when these public reports are put together, the people around the table are mostly from the provider community, and their top concern is making sure that all the data are accurate and fully explained with lots of caveats," she said. "The whole idea of boiling it down, interpreting it and making it easy to take in does not end up high on the agenda in what gets produced."
But physician and hospital organizations said patients can be ill-served by public reports that oversimplify the quality picture. American Hospital Assn. spokesman Matthew Fenwick said the AHA does not oppose public reports but believes they "need to be done in a careful format" that does not discourage physicians and hospitals from reporting.
The AMA is involved with the AQA Alliance, a multi-stakeholder group working to develop physician performance measures for public reporting. In 2006, the Association adopted policy urging the AQA to "avoid a rapid development and implementation of its agenda to ensure that adequate time and consideration is allowed to evaluate and endorse performance measures that serve patients within the framework" of the AMA's principles on pay-for-performance and public reporting.
AMA policy states that "fair and ethical" pay-for-performance programs should "use accurate data and scientifically valid analytical methods." The programs also should allow physicians to "review, comment and appeal results for programmatic reasons and any type of reporting."