
August 01, 2016

The Emperor Has No Clothes

I’ve been studying Medicare’s new overall hospital star ratings on the Hospital Compare website. Lots of people have complained about this particular ranking, which gives only 102 hospitals in the country five stars, some of them fairly obscure institutions. I’ve argued in the past that rankings are often misleading, that institutions try to game the system, and that they are often based on measuring the wrong things. But I was curious about how Boston area hospitals, hospitals that I’m familiar with, actually performed. I was particularly interested in how they compared to other hospitals in the country in the domains that Medicare chose to examine. The bottom line is that they didn’t do very well.

Not a single hospital earned five stars. The only major teaching hospital to earn four stars was Massachusetts General Hospital (MGH). And I was curious about its weaknesses: CMS reports two, in readmission rates and in the timeliness of care. Now I’ve suggested that there may be an irreducible minimum readmission rate—the frailest, sickest patients are going to get sick again, no matter what kind of care they get, either in the hospital or after they return home. The only way they aren’t going to be readmitted is if they are offered, and agree to, care exclusively at home (for example, home hospice). And unless we provide more ways for the frailest and sickest to get care at home (aside from hospice, for which not all will be eligible and which not all of those who are eligible will elect), and unless we discuss their goals of care and how best to achieve them, they are going to return to the hospital when they get sick again. Which they will. But it’s nonetheless striking that MGH—and every single other major teaching hospital in Boston—did worse than the national average in readmissions. That’s not a problem of hitting an irreducible minimum. That’s a problem of achieving the achievable.

MGH’s other Achilles heel, timeliness of care, was also a problem for all the other leading Boston hospitals. Both these deficiencies suggest that the hospitals are not doing a good job of working with primary care doctors and community agencies to coordinate care, to make sure that whatever needs to get done is in fact done. That's a problem for geriatric care.

The other two principal teaching hospitals of Harvard Medical School, the Beth Israel Deaconess Medical Center (BIDMC) and the Brigham and Women’s Hospital (BWH), only managed to get three stars each. In addition to problems with readmissions and timeliness of care, they had assorted other difficulties. BIDMC’s “effectiveness” was on par with the national average, but no better. It did not demonstrate the efficient use of medical imaging. And BWH was below the national average in effectiveness and in safety. That’s disturbing.

The major teaching hospital of Boston University, the Boston Medical Center, also got three stars. It was the only large hospital that did worse than the national average in the domain of "patient experience," or how patients rated their stay. The principal teaching hospital of Tufts University, Tufts Medical Center (the former New England Medical Center), only got two stars, with problems in safety, readmissions, timeliness of care, and the efficient use of imaging. Not very impressive.

Two community hospitals, Faulkner and Newton Wellesley (both in the Partners orbit, the hospital system that owns MGH and BWH), got four stars. This result is a bit perplexing, as Newton Wellesley, for example, was actually at (not above) the national average in safety, readmissions, effectiveness of care, and efficient use of imaging, and below the national average in timeliness. Evidently a bunch of B’s and only one C is deemed better than a bunch of A’s and two C’s. The process of lumping all these measures together to get one final grade seems to me to lead to a misleading conclusion.
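To see how the lumping can play out, here is a toy example. The scores and weights below are invented for illustration (CMS’s actual methodology is a far more elaborate statistical model), but the moral carries over: once sub-scores are collapsed into a single number, the weighting scheme, which the consumer never sees, decides which profile comes out ahead.

```python
# Toy illustration: two hypothetical hospitals scored 0-4 on five domains
# (4 = "A", 3 = "B", 2 = "C"). All scores and weights are invented;
# CMS's actual star-rating methodology is different and more complex.

domains = ["safety", "readmissions", "effectiveness", "imaging", "timeliness"]

# "A bunch of B's and only one C" -- the community-hospital profile
community = {"safety": 3, "readmissions": 3, "effectiveness": 3,
             "imaging": 3, "timeliness": 2}

# "A bunch of A's and two C's" -- the teaching-hospital profile
teaching = {"safety": 4, "readmissions": 2, "effectiveness": 4,
            "imaging": 4, "timeliness": 2}

def composite(scores, weights):
    """Collapse the per-domain scores into one weighted-average grade."""
    return sum(scores[d] * weights[d] for d in domains) / sum(weights.values())

equal = {d: 1.0 for d in domains}
# A scheme that happens to lean hard on readmissions and timeliness
skewed = {"safety": 0.5, "readmissions": 3.0, "effectiveness": 0.5,
          "imaging": 0.5, "timeliness": 3.0}

for label, w in [("equal weights: ", equal), ("skewed weights:", skewed)]:
    print(label, f"community={composite(community, w):.2f}",
          f"teaching={composite(teaching, w):.2f}")

# equal weights:  community=2.80 teaching=3.20  (teaching hospital ranks higher)
# skewed weights: community=2.60 teaching=2.40  (community hospital ranks higher)
```

Same facilities, same data, opposite ranking. The published star count reveals nothing about which scheme produced it.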

So I still don’t think it’s reasonable to conclude very much from the conglomeration that goes into coming up with a single rating. But it is sobering that every single hospital in the Boston area was below the national average in at least one area, and that most of them, including the most prestigious institutions, were below it in several. The areas Medicare chose to focus on are important for quality. There is no good reason for Boston institutions to have more difficulty with any of these measures than the national average. Boston, shape up!

January 03, 2016

For Patient's Sake!

For the last few days I’ve been trying to identify an average skilled nursing facility. Just your typical “rehab,” the place you are likely to be sent after a hospitalization if you are over 65 and have had a hip replaced, or perhaps a heart attack or a stroke. Some place with dedicated “post-acute” beds, with or without a long term care section where people live out their lives once they can no longer live independently or in assisted living. Not a “teaching nursing home,” one of the handful of long term care facilities affiliated with an academic institution. Not a giant nursing home with 500 or more beds, nor a small ten-bed unit that’s within a hospital. Just a regular SNF, preferably in the Midwest. I’m working on a book about the American health care system, one section of which is about post-acute care, and I’d like a short vignette describing an average SNF. I have plenty of stories of patients’ experiences in a SNF, but for purposes of my narrative, I’d like to describe a run-of-the-mill SNF. I’d prefer it not be in the Boston area because my book is already too Boston-centric. And I’ve already featured a hospital in Florida and a physician group practice in California, so for geographic balance, I’d prefer a facility in the middle of the country, preferably in an urban location (most facilities are in cities). It should be a for-profit institution because 70% of SNFs are for-profit. A 150-bed free-standing building in Illinois (Chicago would be good), Michigan (Detroit would be excellent), or Ohio (Cleveland would be perfect), owned by one of the major national chains such as Genesis or HCR ManorCare or Kindred, would be ideal. I found quite a few that meet my criteria—but what’s really disturbing is that I can’t find out much about any of them.

I’m going to have to interview the director of nursing or the medical director or the administrator at Average Nursing Home. I may have to visit the facility. But I’d like to get some background information first. And I need to know whom to contact. My problem is that it’s almost impossible to find what I’m looking for, which means it’s almost impossible for prospective patients and their families, too. It means that accountability in these facilities, to which about 20% of older patients go after they leave the hospital, is largely absent. That’s disturbing.

I’ve looked through dozens of websites in the last few days and I have learned quite a bit about nursing home chains. I’ve learned that each chain comes up with a brief and none-too-informative description of its SNFs and essentially uses the same description for every one. They use the same photos, too: evidently there is a generic “dining room photo” and a generic “exercise gym photo.” I’ve learned that they believe that they are marketing the buildings and their equipment, not the people who run the buildings or who provide the clinical care. A bright and clean building with corridors wide enough to accommodate wheelchairs and walkers is nice, and modern exercise apparatus is desirable, but most important are the nurses and the certified nursing assistants who take care of the patients. And there isn’t a word about who actually works at the SNF. The only exception is Genesis Healthcare, the largest of the chains, which has a tab for “staff” on its websites and lists the administrator, director of nursing, admissions director, and sometimes the rehab director, along with some of their credentials. No email addresses, but the facility has an address and a phone number, so it’s possible to track these people down. Even Genesis doesn’t list the medical director, the physician who is required by law to be in charge of assuring that the facility meets certain standards of care.

Maybe I’m just spoiled—I’ve come to realize what an extraordinary wealth of information is readily available for other parts of our health care system, about hospitals and group practices and health insurance companies. Hospital websites, even though they are fundamentally about PR, include the names of the physicians on staff. You can look up how many cardiac surgeons and orthopedists are affiliated with a given hospital and you can find out where they went to medical school or did their residency. You can track down whether they have lost malpractice suits. Local newspapers often have articles about new developments at their community hospitals—new programs, new systems of care, new rankings, and of course new scandals. But about SNFs—hardly a word. When the Department of Justice accused several nursing home chains of bilking Medicare of billions of dollars by charging for “intensive” therapy services from which patients couldn’t possibly benefit—some of them were moribund—that rightly made national news. When a new SNF opened in a small town, that also made the news, principally because it was seen as a source of new jobs. But that’s it. Why? Why is there so little publicly available information about skilled nursing facilities?

If you look at a list of the largest nursing home chains in the US, you will find that Genesis Healthcare (#1) is now publicly traded—but only since February 2015 (it was taken private in 2007). HCR ManorCare (#2) is owned by a private equity firm, and both Golden Living (#3) and Life Care Centers of America (#4) are privately held. Kindred (#10) is publicly traded. The private corporations have no incentive to present anything other than a sanitized public image. The publicly traded firms are accountable to their stockholders rather than to patients. If you want to find a facility that is reasonably forthcoming about its operations, you have to look at the non-profits.

This little exercise in futility gave me a far greater appreciation for Nursing Home Compare, Medicare’s website that offers the consumer information about nursing home quality. In the past, I’ve made fun of the five star rating system used by the site and criticized the choice of quality measures: for short stay facilities, the five quality indicators used are the proportion of patients who received a flu shot, the proportion who received a pneumonia vaccination, the proportion with a new or worsening pressure ulcer, the proportion newly prescribed an anti-psychotic medication, and the proportion who report moderate to severe pain. I was impressed by a 2014 New York Times article detailing how nursing homes can game the system and win a five-star rating even when they offer abysmal care. But the latest version of the rating system, which went into effect in February 2015, relies on independent measures of things such as staffing ratios, rather than on the nursing home’s self-report, and is both more reliable and more accurate.

Nursing Home Compare doesn’t tell the whole story, but it provides an important piece of the story. We need investigative journalists shining a light on this industry and we need more transparency from the institutions themselves. We need to pay more attention to what goes on in skilled nursing facilities, for the patient’s sake.

October 19, 2014

Rating the Ratings

Earlier this month, Medicare announced that it is revising the 5-star rating system currently used to measure nursing home performance. The ratings are available on the Nursing Home Compare website, which allows consumers to learn how a facility they might be considering stacks up relative to others. The problem, and the reason the system is being revised (with funding from the IMPACT Act, legislation passed in September), is that it’s not so clear that the vaunted rating system measures anything meaningful.

An investigative piece in the New York Times this past summer dramatically demonstrated the gap between reality and the ratings. The reporter visited the Rosewood Post-Acute Rehab outside Sacramento, California, an attractive facility that had garnered the much-sought-after 5-star rating from Medicare. But it turned out that the ratings focused entirely on annual health inspections and on two measures reported by the nursing home itself—staffing ratios and a quality of care index. The rating left out data from state authorities, even though it is those authorities that supervise nursing homes. In Rosewood’s case, the state of California had fined the facility $100,000 in 2013 for a patient death attributed to inadequate medication monitoring. The state had also received 102 patient and family complaints about the facility between 2009 and 2013, far above the state average. And the facility had been the subject of a dozen lawsuits alleging substandard care. The revised rating system, by drawing on external audits of nursing home quality and electronically submitted staffing data, as well as by incorporating some new measures such as the proportion of residents taking antipsychotic medications, hopes to overcome the shortcomings of the current approach. But will it?

Nursing Home Compare is not the only attempt to come up with a single, composite rating for medical facilities, and nursing homes are not the only medical institutions to be graded in this way. Hospitals are also rated, and multiple organizations offer assessments. I recently stumbled on a fascinating case: in June of 2012, the Leapfrog Group, a non-profit devoted to measuring and improving safety in medicine, came out with its first hospital ratings. It awarded an A or B to 56% of the hospitals surveyed, a C to 38%, and grades below C to 6%. The UCLA Medical Center was given an F. At the very same time, US News and World Report came out with its annual hospital rankings. In this report, the UCLA Medical Center was ranked #5 in the country. How can the same hospital get an “F” and an “A+” for its performance? And if you think that maybe UCLA is an unusual case, it’s not. Consumer Reports, which also got into the hospital rating business, ranked MGH below average in the same year (2012) that US News ranked MGH as #1 in the country.

The answer to why different raters get different results is that the grade depends on the methodology used to compute it. Leapfrog assesses hospitals based entirely on performance in the realm of safety and does not adjust for the severity of illness. Consumer Reports uses a complicated mixture of a safety score, a patient outcomes score, a patient experience score, a hospital procedures score, and then a rating of heart surgery, with several factors going into each of the subscores. US News and World Report looks at outcomes (by which it means mainly mortality), process (which is largely nursing staffing levels), and other factors (a big part of which is reputation). US News also rates hospital departments (neurology, cardiology, oncology, etc.). I was particularly amused a number of years ago when US News ranked the Geriatrics Department of one of the Boston teaching hospitals among the top ten in the country. It so happened that the hospital in question didn’t have a geriatrics department.
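The severity-adjustment point alone can flip a verdict. Here is another toy example, with invented numbers rather than any rater’s actual data or formula: on raw mortality, the hospital that takes the sickest patients looks worse, but divide its deaths by the number its case mix would predict and the comparison reverses.

```python
# Hypothetical illustration of severity (case-mix) adjustment.
# All numbers are invented; no rater's actual data or formula is used.

hospitals = {
    # name: (observed deaths, admissions, deaths expected given patient severity)
    "Referral Center": (120, 2000, 150),  # takes the sickest patients
    "Community Hosp.": (60, 2000, 50),    # takes healthier patients
}

for name, (observed, admissions, expected) in hospitals.items():
    raw_rate = observed / admissions  # unadjusted mortality
    o_to_e = observed / expected      # observed-to-expected ratio (< 1 is good)
    print(f"{name}: raw mortality {raw_rate:.1%}, O/E ratio {o_to_e:.2f}")

# Raw rates: 6.0% vs 3.0% -- the referral center looks twice as deadly.
# O/E ratios: 0.80 vs 1.20 -- the referral center loses fewer patients than
# its case mix predicts; the community hospital loses more than expected.
```

A rater that compares raw rates will reliably mark down referral centers; one that risk-adjusts may reward them. Neither number is wrong; they simply answer different questions, which is exactly why a single grade misleads.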

Americans like report cards.  We rank toasters and washing machines and cars. We rate hotels and restaurants and auto mechanics. We have institutions devoted to product evaluation (think Consumer Reports) and thanks to the Internet, we now have a slew of informal, popular evaluations (think Yelp or Trip Advisor). I admit I find these reports very useful: when I was looking for a good bed and breakfast recently, I found it helpful to learn that 50 people gave one particular inn 5 stars. I could also read the individual comments to get a sense of whether the aspects of the inn that other travelers liked were of any particular concern to me.  But can we really come up with a report card for a hospital or a nursing home? Can we really reduce performance to a single grade?

Nursing homes and hospitals will inevitably game the system, just as colleges did when US News and World Report used the ratio of applications to offers of admission as a measure of selectivity. Colleges instructed their admissions officers to travel around the country encouraging students to apply, even if those students couldn’t possibly be accepted, because the more students applied, the more “selective” the college became. Some colleges created a huge waiting list and admitted many of their freshman class from the wait list—but only counted the initial acceptance letters in the computation of “offers of admission.” Some students and families have caught on and the media has started to downplay the annual US News numbers—for the past couple of years, the college rankings haven’t been front-page news in the NY Times when the new ones are released. But colleges continue to regard the rankings as important and use them in marketing. Similarly, I’ve noticed big banners in the vicinity of some of Boston’s hospitals proclaiming their latest ranking. And I learned from a terrific piece of investigative reporting produced by Kaiser Health News in collaboration with the Philadelphia Inquirer that the hospitals pay to advertise their rankings. US News, Leapfrog, and another rating organization, Healthgrades, charge licensing fees to hospitals for the privilege of trumpeting their “achievement.” These fees are not peanuts: Healthgrades charges $145,000, US News charges $50,000, and Leapfrog charges $12,500.
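The arithmetic of that selectivity gaming is worth making concrete. A minimal sketch, with invented numbers and a hypothetical college:

```python
# Hypothetical illustration of gaming an applications-to-offers metric.
# All numbers are invented.

def selectivity(applications, offers):
    """Applications per offer: higher looks more selective."""
    return applications / offers

# Baseline: 5,000 applicants, 1,500 offers
print(f"baseline: {selectivity(5000, 1500):.1f} applicants per offer")   # 3.3

# Recruit 3,000 extra applicants who will mostly be rejected;
# admit exactly the same class.
print(f"after recruiting drive: {selectivity(8000, 1500):.1f}")          # 5.3

# Waitlist trick: 300 more students are eventually admitted from the
# wait list, but only the initial 1,500 letters are counted as "offers."
print(f"reported: {selectivity(8000, 1500):.1f}  "
      f"actual: {selectivity(8000, 1800):.1f}")                          # 5.3 vs 4.4
```

Nothing about the education changed; only the denominator of a metric did. The nursing home analogue the Times described, rosy self-reported staffing and quality data, works the same way.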

There are now so many rating agencies, using very different rating scales and arriving at widely discrepant results, that there is even an organization, the Informed Patient Institute, that grades the raters. But the truth is that it is impossible to distill the performance of a complex institution such as a hospital or a nursing home into a single measure. Such efforts will inevitably hide the very real variability in performance, depending on just what is looked at. What you need to know depends on why you need to know it. Are you an insurance company, deciding whether or how much to reimburse a facility for a particular service? Are you a patient choosing a hospital (actually, you probably won’t have much say in the matter: in an emergency you will be taken to the nearest facility, and in less urgent situations, where you go is typically determined by who your doctor is)? Are you a patient or family member choosing a nursing home for long term care (you may have a fair amount of choice)? For short term rehab (you will have less choice in the matter)?


So will the revised ratings of nursing homes (coming in January 2015) make the grades meaningful? Probably not. Requiring nursing homes to report staffing data electronically will likely improve the accuracy of their reporting—but is the degree of improvement worth the millions of dollars that will be spent on it? Including the rate of antipsychotic prescribing as a quality indicator might tell us something about whether nursing homes are unnecessarily and inappropriately sedating their residents—assuming the measure is adjusted for the rate of serious psychiatric illness in the facility. The bottom line is that a single grade cannot capture all the features of a medical facility’s performance that are relevant to all the different individuals and groups for whom the ratings are intended. It’s time to abandon composite ratings.