Latest Hospital Ratings Not as Straightforward as They Appear

Lebanon — Giving consumers a simple tool to guide them in seeking better health care was the purpose when the U.S. Centers for Medicare and Medicaid Services this week issued star ratings for 3,662 American hospitals.

At least CMS got the “simple” part right.

The star ratings are intended to “provide consumers with information about multiple dimensions of quality in a single score,” Pierre Yong, acting director of CMS’ quality measurement and value-based incentive group, said in a recent conference call.

So each of the hospitals for which CMS collected and analyzed data now sports a star rating, with five stars representing the best quality care and one star the lowest.

But consumers who try to understand just what the stars show, and find ways to use that information to shop among health care providers, may end up wondering where the simplicity went.

That’s certainly the view of the American Hospital Association, which sought to delay the awarding of the stars until CMS could further refine its data-gathering and analysis methods. “The new star ratings program is confusing for patients and families trying to choose the best hospital to meet their health care needs,” said AHA Chief Executive Rick Pollack.

Using the star ratings presents some special challenges and paradoxes in a rural area such as the Upper Valley, where Mary Hitchcock Memorial Hospital, a 396-bed teaching hospital in Lebanon, got three stars, while tiny Valley Regional Hospital in Claremont got four.

None of the 102 hospitals that received five stars, the highest rating, were in the Twin States.

And variations in hospital data collection and analysis systems, together with the complexity of the statistical tools used to parcel out stars, make it difficult for laypeople to assess just what is being compared.

Consider Valley Regional and Mary Hitchcock, which is the flagship hospital of the Dartmouth-Hitchcock regional medical system. Valley Regional’s rating was computed using no more than 12 of the measures that CMS relied on to assess the timeliness, effectiveness and safety of care, and the rates at which people with certain serious diseases died or had to be readmitted after being discharged from the hospital.

By comparison, CMS’ website shows 57 measures of Mary Hitchcock’s performance.

Spokespeople for both hospitals were at pains not to draw conclusions from the smaller hospital’s higher star score. “It’s really kind of an unfair comparison,” said Gaye LaCasce, Valley Regional’s senior director of community engagement.

The CMS analysis does not take into account “the complexity and acuity of cases, and penalizes tertiary care teaching hospitals like Dartmouth-Hitchcock, which sees the region’s most acutely ill patients and which serves higher numbers of the poor,” said Rick Adams, a D-H spokesman.

In part, Valley Regional out-starred its larger competitor by getting more favorable responses to nine of 11 questions from patient surveys at the hospitals. More patients at Valley Regional than those at Mary Hitchcock reported their doctors always communicated well (85 percent to 80 percent), their rooms were always clean (81 percent to 70 percent) and that it was always quiet outside their room at night (50 percent to 40 percent).

But 83 percent of the patients at three-star Mary Hitchcock said they would definitely recommend the larger hospital, compared with 73 percent at four-star Valley Regional.

CMS also awarded four stars to Cottage Hospital in Woodsville without even having data from a patient survey. In fact, CMS had only 11 measures of the quality of care at Cottage. In 10 of those measures, Cottage matched Mary Hitchcock. In the 11th — success in preventing C. diff infections — Mary Hitchcock did better than Cottage. But Cottage got four stars, and the larger hospital only three.

“I throw Dartmouth-Hitchcock out of the equation when I compare us” to other local hospitals, said Maryanne Aldrich, a Cottage spokeswoman. D-H is a “big brother” to Cottage and other small Upper Valley hospitals, and a close relationship with D-H helped Cottage garner its four-star rating, she said.

Other Upper Valley hospitals didn’t fare as well, including three of the four Twin State hospitals that got only two stars from CMS: Gifford Medical Center in Randolph, Springfield Hospital in Springfield, Vt., and New London Hospital in New London.

Alice Peck Day Memorial Hospital in Lebanon got three stars, while Mt. Ascutney Hospital in Windsor was not rated because CMS had insufficient data from the facility. Veterans Affairs Health System hospitals, such as the one in White River Junction, do not submit data to CMS.

Barbara Quealy, the chief operating officer at Gifford, said the star ratings are “an incomplete assessment of any hospital.” Gifford recently opened a new birthing center and is in the process of making technology upgrades and converting shared rooms into private rooms, she added.

“Two stars does not adequately represent the organization that we are,” she said.

Karen Zurheide, the vice president of community relations and development at New London Hospital, said the hospital voluntarily participated in the star rating program but did not consider the end product an accurate measure of quality. “The calculations are confusing and do not properly recognize key differences among institutions, such as hospital size,” she said. “In particular, several calculations for New London Hospital were based on very small numbers of patients, contributing to our two-star rating.”

Bob Demarco, the chief of quality and systems improvement at Springfield Hospital, said that the star ratings system and its 64 measures did not accurately portray care at the hospital. “Many critical access hospitals like Springfield Hospital are not included in many of these measures due to a naturally occurring lower patient census,” he said. “Comparing hospitals that might be participating in many of the 64 measures to many critical access hospitals who can only participate in few of these measures hardly provides an accurate rating.”

Critical access hospitals are rural facilities with no more than 25 beds that receive enhanced reimbursement from CMS when they care for Medicare patients. CMS administers Medicare, the federal health insurance program for seniors and some people with disabilities, and oversees the federal role in Medicaid, a health insurance program through which states provide coverage to some low-income residents.

In recent years, CMS has sought to use the power of the Medicare purse to prod hospitals to improve the quality of care and eliminate wasteful spending.

That has created a plethora of measures and at least three new reimbursement mechanisms designed to reward hospital quality. Many of those measures and mechanisms are laid out in publications and available online.

But although CMS created a website called Hospital Compare to give consumers access to that data as they seek out and assess health care providers, using it hasn’t been easy.

“The current information on (CMS’ www.medicare.gov/hospitalcompare/search.html website) can be fairly technical and can be intimidating to beneficiaries,” Yong said.

That’s where the star ratings came in. They were intended, according to Arjun Venkatesh, a physician and assistant professor at the Yale University School of Medicine, to “summarize hospital quality into a single star rating and convey information that’s already available on Hospital Compare in a straightforward and accessible manner for patients and consumers.”

The stars seem pretty straightforward.

Not so the statistical model CMS used to convert data gathered from hospitals and the results of patient surveys into a single score represented by one to five stars. That model — for those of you playing along at home — involves “winsorization,” a “latent variable approach” and a “clustering algorithm.”
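
For readers curious what those terms mean in practice, the toy sketch below, written in Python, is not CMS’ actual methodology: the measure groups, weights and simulated data are invented for illustration, and a simple weighted average stands in for the agency’s latent-variable model. It only shows the general shape of the pipeline the agency describes: winsorize the measures to blunt outliers, collapse them into one summary score per hospital, then cluster the scores into five groups that become star ratings.

```python
# Illustrative sketch only -- NOT the CMS methodology. Measure groups, weights
# and all data below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: rows are hospitals, columns are individual quality measures.
n_hospitals, n_measures = 200, 12
raw = rng.normal(loc=0.8, scale=0.1, size=(n_hospitals, n_measures))

# 1) Winsorize: pull extreme values in to the 5th/95th percentiles so outliers
#    do not dominate any single measure.
low, high = np.percentile(raw, [5, 95], axis=0)
winsorized = np.clip(raw, low, high)

# 2) Summarize: standardize each measure, average within (hypothetical) measure
#    groups, and combine the group scores with fixed weights. CMS uses a
#    latent-variable model for this step; a weighted mean is a crude stand-in.
z = (winsorized - winsorized.mean(axis=0)) / winsorized.std(axis=0)
groups = {"mortality": z[:, 0:4], "safety": z[:, 4:8], "experience": z[:, 8:12]}
weights = {"mortality": 0.4, "safety": 0.3, "experience": 0.3}
summary = sum(w * groups[g].mean(axis=1) for g, w in weights.items())

# 3) Cluster: a simple one-dimensional k-means splits hospitals into five
#    clusters; clusters are ranked by their mean score and relabeled 1-5 stars.
centers = np.percentile(summary, [10, 30, 50, 70, 90])
for _ in range(100):
    labels = np.abs(summary[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([summary[labels == k].mean() if np.any(labels == k)
                        else centers[k] for k in range(5)])

stars = 1 + np.argsort(np.argsort(centers))[labels]  # best cluster -> 5 stars
print(np.bincount(stars, minlength=6)[1:])           # hospitals per star level
```

One consequence of the clustering step, at least in this simplified sketch, is that a hospital’s stars depend on where its summary score falls relative to every other hospital’s, not on any fixed threshold.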

James O’Malley, a professor of biomedical data science at Dartmouth College’s Geisel School of Medicine, noted the challenges of constructing a rating scheme that is “statistically accurate and precise.” Hospital quality is not a physical object that can be measured one day, and then remeasured the next day with the same result, he said.

The CMS website shows lots of measures of the care at Mary Hitchcock. The local giant significantly outperformed hospitals statewide and nationwide in patients’ colonoscopy follow-ups, staff member influenza vaccination rates and in the survival and readmission rates of heart attack patients.

But there were problems in the Mary Hitchcock emergency department. The median wait time for pain medication for patients with broken bones at the local facility was 86 minutes, compared with less than an hour statewide and nationwide. Wait times for inpatient admissions from Mary Hitchcock’s emergency department were also much longer than those benchmarks, as was the average wait time to see a doctor or other professional.

O’Malley, the Dartmouth statistician, urged consumers to look beyond the stars: “You should always consider other factors.”

A CMS guide to choosing a hospital is posted online at www.medicare.gov/Pubs/pdf/10181.pdf. In addition to outlining how to use the Hospital Compare website, the guide lays out questions to discuss with a patient’s doctor and stresses the importance of reviewing insurance coverage.

And no hospital is perfect, LaCasce noted: “Unless you score 100 percent on everything, there’s always room for improvement.”

Rick Jurgens can be reached at rjurgens@vnews.com or 603-727-3229.