July 17, 2017

The Secret to Staying Sharp

The last time that NIH requested a review of the data on preventing cognitive decline in old age (including Mild Cognitive Impairment, Alzheimer’s type dementia, and “usual” age-related cognitive deterioration) was in 2010. At that time, the systematic review of the published literature (performed by the Agency for Healthcare Research and Quality) and the associated state of the science conference (convened by NIH) concluded there was insufficient evidence to make any recommendations about interventions to prevent cognitive decline and dementia.

Now, the NIA has asked the National Academies of Sciences, Engineering, and Medicine to commission a new systematic review of the data and, based on that review, to issue recommendations about prevention. The resulting report, optimistically entitled “Preventing Cognitive Decline and Dementia: A Way Forward,” was just released. Alas, while the committee bent over backwards to find beneficial interventions, adding observational, non-experimental studies, risk factor analysis, and neurobiological work to the randomized controlled trials (RCTs) that were supposed to provide the evidence for their conclusions, it was forced to conclude, once again, that the review “identified no specific interventions that are supported by sufficient evidence to justify mounting an assertive public health campaign to encourage people to adopt them for the purpose of preventing cognitive decline and dementia.” The best the group could come up with was that the review did “find some degree of support for the benefit of three classes of intervention: cognitive training, blood pressure management in people with hypertension, and increased physical activity.”

If we examine these three domains, what we find is not entirely encouraging. The arena of cognitive training (brain games, crossword puzzles, studying a foreign language, etc.) had the greatest degree of evidence. There is good evidence that it can improve performance in a trained task—that is, if you work at generating synonyms for words over and over again, you will get better at finding synonyms, at least in the short term. What is less clear is whether the benefits are sustained, whether training in one domain yields benefits in other domains, and whether it translates at all into improvement in daily functioning, in areas such as shopping, cooking, or paying bills. The good news, such as it is, about cognitive training, derives principally from one study, the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE), which provides moderately strong evidence of effectiveness in the training domain after 2 years but low strength evidence after 5 or 10 years. The improvements that were found failed to translate into areas other than the one where training was provided.

Perhaps surprisingly, given the strong evidence that blood pressure control in people with hypertension is beneficial in preventing stroke and coronary artery disease, vigorous blood pressure treatment did not so readily translate into prevention or delay of any form of cognitive decline in old age. One British study did show efficacy. Given that blood pressure treatment is already recommended for other reasons, encouraging its use in the hope that it might also help fend off cognitive decline evidently seemed harmless enough to the committee.

The story on exercise is similar to that on blood pressure control: the RCT data are inconsistent, but at least some data show a positive effect. Exercise studies are problematic because they so often utilize different forms of exercise and prescribe varying duration and frequency of exercise. Nonetheless, given the evidence that exercise is useful to promote mobility and to prevent depression, and that some studies find it beneficial in preventing cognitive decline, the committee opted to include exercise in its short list of interventions for which there is “some degree of support.”

The main justification, it seems to me, for subtitling this report “A Way Forward” is the section on recommendations for future research. The areas that have shown some promise deserve further study. And that study, as well as all other avenues that might be pursued, should be methodologically sound. That means acknowledging the deficiencies of existing work and avoiding those flaws in the future.

I suppose that whether this report is encouraging or not depends on whether you are a glass half full or half empty sort of a person. I will certainly continue to exercise regularly and challenge my mind, as long as I am able to. If I develop high blood pressure, I’ll want it adequately controlled. But I won’t kid myself that any of these measures will get me off the hook. And I will continue to support ongoing research in preventing or delaying cognitive decline in old age. But I won’t hold my breath. So far, the secret to staying sharp is that there isn't one.

July 09, 2017

Transfers Redux

Last week, I praised MedPAC for devoting an entire chapter of its June report to Congress to strategies for decreasing transfers from nursing homes to acute care hospitals. Some of the pilot projects reported had successfully reduced potentially burdensome, unwanted, and costly hospitalizations of the frail, very elderly population who live in nursing homes. So I was dismayed to read this week about a follow-up study to one of these pilots—a large, randomized controlled trial that failed to produce any change whatsoever.
Robert Kane, the lead author, who sadly passed away several months before the publication of the article, was a prolific, influential, and thoughtful scholar of geriatrics in general and long term care in particular; Joseph Ouslander, the senior author, is likewise a giant in the field of geriatrics, who has similarly focused much of his research on long term care. Their report of the INTERACT (Intervention to Reduce Acute Care Transfers) trial begins by observing that the core principles underlying the study are that 1) early recognition and proficient management of acute conditions have the potential to prevent the progression of disease to the point where hospitalization is deemed necessary; 2) the availability of communication, documentation, and decision tools can facilitate care by advanced practice clinicians; and 3) an emphasis on advance care planning, hospice, and palliative care can lead to a higher frequency of “do not hospitalize” orders. Encouraged by the results of their non-randomized pilot study, which demonstrated a 24 percent decrease in all-cause hospitalization among residents of study nursing homes (facilities that volunteered to participate) during the study period compared to baseline, rather than a 6 percent decrease in control facilities, the authors developed a larger, randomized controlled trial to further test the effectiveness of their program. The new program relied on webinars and online courses to educate nursing home staff and monthly phone calls for support.
The authors reported on 227,140 person-years of observation in 264 nursing homes (randomized to intervention homes, usual care homes, and usual care plus phone contact). Careful statistical analysis revealed no difference in the overall hospital admission rate, the potentially avoidable hospitalization rate, or the rate of Emergency Department visits. No effect at all.
So was the conclusion I arrived at last week—that people living in nursing homes will forgo hospital care if they are offered a viable alternative and if they (or their surrogate decision-makers) understand both their overall health status and the perils of hospitalization—totally unjustified? Maybe. But maybe not.
All we know is that the essence of the project involved educating nursing home personnel to allow them, in principle, to provide more on-site care. We know that core staff members, who were obligated by the terms of the study to complete all training modules, in fact only attended 67 percent of the webinars and completed only 52 percent of the online courses; they also only participated in 52 percent of the monthly supportive/feedback phone calls. We know that when push came to shove, either patients or families wanted to go to the hospital, staff members wanted to send them, or both.
What is far from clear is whether the INTERACT intervention actually improved the quality of care available on-site, whether residents and families were aware of and trusted in the improvements, or whether any staff members in fact spoke to patients and families about their state of health, explored goals of care, or offered either hospice or palliative care services.
Before we abandon the effort, let’s be sure that the training truly “took,” both in the sense of better capabilities (on the staff side) and of heightened awareness (on the resident and family side). As the authors acknowledge, maybe distance learning is not the right way to teach new knowledge and skills.
It's premature to conclude that an approach to decreasing transfers is a failure just because the educational intervention on which it is based was unsuccessful. Before making that leap, we need to be sure that the educational effort truly translates into more advance care planning discussions, more widespread detection and treatment of acute medical conditions, and institution-wide familiarity with the changes. There's still hope that better on-site medical care and advance care planning will ultimately reduce the transfer rate from the nursing home to the hospital. But first we need to figure out how to provide reliable, competent, trustworthy nursing home-based treatment and then we need to develop a system of advance care planning that builds on the availability of this kind of treatment.

July 02, 2017

To Transfer or Not to Transfer

With the Senate’s attempt to repeal and replace the Affordable Care Act temporarily on hold, I turned my attention to the MedPAC report sent to Congress earlier this month, its annual report on “Medicare and the Health Care Delivery System.” MedPAC, I once said, is one of the most-important-organizations-you’ve-never-heard-of. It is an independent group of 17 commissioners appointed by the US Comptroller General that advises Congress on the Medicare program. Some of Medicare’s most influential programs in recent years, such as the readmissions reduction program and the hospital acquired conditions reduction program, had their origins in MedPAC recommendations.

One chapter that I found particularly intriguing was the one on hospital and SNF (skilled nursing facility) use by Medicare beneficiaries who reside in nursing facilities. Much attention has been paid to patients going from the hospital to home and back to the hospital, and a fair amount of attention accorded to patients going from the hospital to the SNF (rehab) and back to the hospital. But this section addressed a different population: the frailest 1.6 million people in America, those living in long term care facilities. It asked whether they were being appropriately transferred to the acute care hospital. As the MedPAC report noted, these are patients who often get into trouble when they are hospitalized: they are prone to falling, developing delirium, suffering from hospital-associated infections, and to experiencing the adverse effects of “polypharmacy,” the prescribing of multiple medications. These are individuals who live in an environment that provides nursing care 24/7 along with personal care, as well as access to physicians, prescription medications, and physical therapists. Surely it would be better for the nursing home residents—and for Medicare’s bottom line—if they could be treated where they live. Are they? If not, why not?

The answer is that they aren’t cared for in the nursing facility as often as they should be. The single most important factor determining if a person is treated in the nursing home or sent to the hospital is the availability of on-site medical care, both physicians (or advanced practice clinicians such as physician assistants or nurse practitioners) and diagnostic modalities (such as x-rays).

Suddenly this conclusion had a familiar ring and I remembered that 35 years ago, during my fellowship in geriatric medicine, I decided to study why nursing home residents were transferred to the acute care hospital. I spent many long hours in the emergency department of Boston City Hospital examining medical records—I didn’t stop until I had identified 100 patients who arrived in the ED from any of 22 area nursing homes. During the same period, 338 older individuals who lived in the community, in their own homes, had sought care, and these people served as controls. What I found was that the patients coming from the nursing home were remarkably similar to those coming from home in terms of their severity of illness. They were, on average, older (83 compared to 77), whiter (92 percent compared to 56 percent), and more likely to be female (64 percent compared to 51 percent). They were more apt to present with fever or a change in their mental status, both common problems with increasing age. But otherwise, the two groups looked very similar from a medical perspective. I concluded that we could increase the efficiency of medical resource utilization and promote better care if we simply improved on-site care in nursing homes. Almost exactly what MedPAC found in its analysis today.

To be sure, some nursing homes have programs in place that go a long way to rectifying the situation, and CMS has supported several pilot programs designed to avoid hospitalizing nursing home residents. These programs have several features in common: they enhance the treatment available in the nursing home by using advanced practice clinicians or providing in-service training to other staff members; and they encourage advance care planning by residents and their families to promote discussions of prognosis, preferences, and planning for future illness.

So why, after over three decades, do we still transfer many patients from the nursing home to the hospital? Why don’t we provide more on-site medical care? The reasons are complex and include an historical lack of interest by physicians in the frailest, oldest patients as well as poor reimbursement for nursing home medical care. But fundamentally, what the enduring problem shows is that we continue to fail to recognize that people in nursing homes—and their families—do want treatment of their medical problems. They may be willing to forgo the most invasive and burdensome forms of treatment—such as ICU care, ventilator care, and major surgery—but that doesn’t mean they are satisfied with a focus exclusively on comfort. If all the nursing home can offer is Tylenol and oxygen, perhaps along with morphine or other opioids, then nursing home residents will want to go to the hospital when they become acutely ill. 

We need to offer nursing home residents a viable alternative to the extremes of comfort care only, on the one hand, or maximally aggressive care on the other. And we need to explain what the various approaches to treatment would mean for them. Only then will we stanch the flow from the nursing home to the hospital.

June 25, 2017

The Worthy and the Unworthy

One of the most illuminating and insightful articles I ever read was written by historian of medicine David Rosner and entitled “Health Care for the ‘Truly Needy’: Nineteenth Century Origins of the Concept.” I read it when it was first published and I’ve remembered it since—and that was 35 years ago. The nineteenth-century concept of the “worthy poor” or “deserving poor,” and its Reaganesque reformulation, is sadly reflected in the Republican health care bill revealed today.

Rosner points out that at a time of relative ethnic homogeneity in pre-industrial, pre-Civil War America, the poor were often seen, in the light of Christian teaching, as individuals who would be rewarded with salvation. As an added bonus, the presence of poor people gave the wealthy an opportunity for charity, which would likewise be rewarded. But then, in the second half of the nineteenth century, millions of destitute immigrants arrived on American shores. At the same time, Americans suffered from tremendous economic dislocation related to urbanization. As a result, “a general consensus developed among the native-born equating poverty...sinfulness, and individual failure with foreign birth. Conversely, wealth, American nativity, and material success were equated with righteousness and moral behavior.”

The Surgeon General of the US in 1891, Dr. John Shaw Billings, remembered for introducing the collection and maintenance of “mortality and vital statistics” records, also accepted the notion of a meaningful distinction between the worthy and unworthy poor, saying “there is a distinct class of people who are…almost necessarily idle, ignorant, intemperate, and more or less vicious, who are failures…and who for the most part belong to certain races,” by which he meant Catholics, Jews, Irish, Italians, and Eastern Europeans. He accepted the need to provide medical care for this group—but only to prevent the spread of infectious diseases to the remainder of the population.

And then we have Dr. Stephen Smith, another public health giant, who cautioned that medical charity can be “the inlet through which the habit of pauperism first creeps into the poor man’s house.” That is, helping people who are poor fosters dependency and is to be avoided. Remember Romney’s 47 percent? The people who are “dependent on the government” and who should simply “take personal responsibility” for their lives?

After discussing the way that concepts of the worthy and unworthy poor evolved in tandem with the growth of the hospital in the early part of the twentieth century, Rosner concludes by arguing that “although the language used today is significantly different from the angry, moralistic, and class biased rhetoric of the nineteenth-century debates, there is a similarity of meaning and analysis in arguments over definitions of the ‘truly needy,’ over the proper eligibility criteria for a variety of health programs like Medicaid and Medicare, and for the scope of other social service programs such as food stamps and welfare.” He was writing in 1982, but he could equally well be writing today, as we learn who it is that the Republican senators, or at least those who crafted the latest version of the health care bill, deem worthy. Full-time employees of well-heeled companies are worthy, and older people, provided they don't live in nursing homes, are worthy. It's unclear if fetuses are worthy: health plans may be excluded from the insurance exchanges if they cover abortion, but health plans may also be allowed (through a waiver) even though they fail to cover maternity care. Everyone else, the senators assume, could purchase health insurance—or better yet, not get sick—if only they had the necessary moral fortitude.

This isn’t how any other democratic nation in the world views health, medical care, or its citizens. These countries assume that everyone is "worthy" of basic medical care. They regard it as the responsibility of government to promote the health of their citizens, just as it is government's responsibility to keep them safe and educated. Tell your senator that enshrining archaic concepts of worthiness into law by severely restricting access to medical treatment is not the way to keep America great.

June 18, 2017

The Other American Drug Problem

With all the attention paid to the opioid epidemic, another drug overuse problem has gone relatively unnoticed—the widespread use of antipsychotic medications in nursing home residents. A perspective article in JAMA this week focuses on this other drug problem—and an intervention that the authors think might just have solved it.
Interestingly, antipsychotic medications were a problem in an earlier era, too. Then along came OBRA '87, the Nursing Home Reform Act, mandating a variety of strategies limiting the use of drugs to sedate patients with dementia who had behavioral problems: nursing home patients were to be free of “chemical restraints”; staff were supposed to try non-pharmaceutical approaches before resorting to drugs, and they were expected to taper the medication after several months. The regulations seemed to be effective: the proportion of nursing home residents receiving an antipsychotic fell from 34 percent pre-OBRA to 16 percent several years afterwards.
But after the atypical antipsychotics were introduced in the early 1990s, beginning with risperidone and then going on to a variety of other agents such as quetiapine and olanzapine, the rate of use began climbing again. By 2011, it had reached 24 percent among nursing home residents. Today, however, it’s back down to its historic low of 16 percent.
In their article, Gurwitz et al regard the turning point as the Office of Inspector General report of 2011, “Medicare Atypical Antipsychotic Drug Claims for Elderly Nursing Home Residents.” In response to this alarming report, the Centers for Medicare and Medicaid Services (CMS) developed a multi-pronged strategy to combat the problem. It launched its “National Partnership to Improve Dementia Care in Nursing Homes,” which combined public reporting, educational resources, and renewed regulatory enforcement. Gurwitz et al assume that it was this partnership that led to the fall in use of antipsychotic medications.
But that’s not the whole story.
If we look at why the use of antipsychotic medications began to rise again in the 1990s, what we see is a massive push by Big Pharma to peddle these drugs to nursing homes, even though they are not FDA approved for the treatment of the symptoms of dementia. Not only have studies failed to demonstrate that the antipsychotics (whether “typical” antipsychotics such as haloperidol or the “atypicals” such as risperidone) work in dementia, but the FDA also issued a black box warning indicating that they have been associated with sudden death. The drug companies were undeterred. They employed various strategies to achieve spectacular sales of atypical antipsychotics in the nursing home.
Janssen, a subsidiary of the mega-company Johnson & Johnson, went so far as to create what it called “ElderForce,” a special group of drug reps who were deployed to market the antipsychotic Risperdal (risperidone) to doctors in nursing homes. Now it’s perfectly legal for doctors to prescribe an FDA-approved drug “off label,” that is, for some other non-approved use. But it’s not legal to advertise drugs for non-FDA-approved indications. What Janssen did was to pay its ElderForce reps a commission for every prescription the doctors wrote. J&J was not alone in promoting antipsychotics to nursing home physicians for use in their troublesome patients with dementia. Eli Lilly did the same for its atypical antipsychotic, Zyprexa (olanzapine). It was evidently a winning strategy: AstraZeneca followed suit with its drug, Seroquel, and, not to be left out, Bristol-Myers Squibb tried it with Abilify. The leading distributor of prescription drugs to nursing homes, Omnicare, got a piece of the action when it instructed its pharmacists to provide disinformation to nursing home doctors—in return for a kickback from Abbott, the company that manufactured the drug it was pushing for treating the behavioral symptoms of Alzheimer’s disease, the anti-seizure medication Depakote (which, like the antipsychotics, is not approved for this indication).
Slowly and methodically, the Department of Justice reacted. And what followed was a dramatic series of investigations that ultimately resulted in penalties for the malfeasants. Sometimes the payouts were probably too small to have much of an effect—the $520 million that AstraZeneca paid in 2010 to settle charges of illegally marketing Seroquel (quetiapine) in nursing homes could be viewed as just the cost of doing business. But even for Eli Lilly, the $1.4 billion it paid to settle civil and criminal charges relating to the marketing of Zyprexa (olanzapine) was substantial. And when Johnson & Johnson paid $2.2 billion in criminal and civil fines in 2013 to settle accusations that it improperly promoted Risperdal (risperidone) for use in nursing home residents, all the drug companies took notice.
So yes, I think CMS is onto something when it acknowledges that the problem of the overuse of antipsychotics in nursing homes is multifactorial, and it’s right to look to nursing home chains and physicians, as well as to educational tools and regulatory incentives in its quest for reform. But let’s not forget that one of the “stakeholders” is the drug companies and that the legal system can be a powerful change agent.

June 11, 2017

Parachuting through Life

Last week I saw the play “Ripcord” at the Huntington Theater in Boston, a hilarious comedy by David Lindsay-Abaire, and one of the rare plays that features life among the older set. Ignore the unrealistic depiction of assisted living—the playwright does not seem to distinguish between assisted living facilities and nursing homes—or the mischaracterization of who lives in assisted living—the play features two women who are entirely too vigorous to require assisted living, let alone the nursing-home-like facility in Ripcord. It’s nonetheless a vivid, if exaggerated, portrait of some of the poignant struggles of older life. Ripcord introduces us to two of the zaniest and most memorable elderly fictional characters in recent memory, Abby and Marilyn, forced by circumstance to become roommates.

Both Abby and Marilyn, in their own very different ways, need to come to terms with troubled relationships. Marilyn was married to an alcoholic and perhaps abusive man; Abby’s only son is a drug addict from whom she has long been estranged. Both women find themselves in a new phase of life and have to adapt to straitened circumstances, a task that Marilyn performs with grace and Abby with vitriol. But redemption comes for both of them, as Marilyn’s ability to see the good in everyone, from the aide at the facility to her lugubrious roommate, finally rubs off on Abby, and Abby’s insistence on telling-it-like-it-is allows Marilyn to acknowledge and accept the flaws in her marriage.

In its eccentric and sometimes over-the-top fashion—the “ripcord” of the title refers to the cord the two older women must pull to open their parachute while skydiving—this play brings to life one of the major insights of contemporary geriatrics: at least as important as pills and procedures for a good quality of life in old age is a robust social network. In the end, it is not fame or fortune that mark a life as having been worth living, but the relationships we forge with others.

June 04, 2017

One of the major milestones in biomedical ethics was the passage of legislation that today’s medical residents—and I daresay most Americans—have never heard of, the Patient Self-Determination Act of 1990. Certainly all those who inveighed against “death panels” and who balked at the idea that the Affordable Care Act might include provisions allowing Medicare to reimburse for advance planning conversations never heard of it. This act, as its title indicates, was intended to put patients in the driver’s seat, to allow them to weigh in on the approach to medical care they would get at the end of life, even if they were unable to participate in decisions at the time those decisions needed to be made. It officially sanctioned advance directives by enshrining them in federal law.

What the PSDA says is that any health care institution receiving government funds, which is to say virtually all health care institutions, is obligated to ask patients if they have an advance directive, to offer them the opportunity to complete one if they don't, and to prominently display a copy of that directive in the medical record if they do. The PSDA put advance care planning on the map. It also put advance care planning squarely in the legal domain and that, as the authors of a new article in the New England Journal assert, was a big mistake.

What “Delegalizing Advance Directives—Facilitating Advance Care Planning” argues is that a major reason that advance directives haven’t caught on is that they typically have to be signed by two witnesses (or a notary), and in some states (North Carolina, South Carolina, Virginia, and Missouri) a notary is required; some states also require use of a specified form. I agree that these requirements are an impediment to widespread use of advance directives. I agree that the POLST (Physician Orders for Life-Sustaining Treatment) model, which uses a medical order rather than a legal document and just requires the signature of the patient and the physician, puts advance care planning unambiguously within the medical sphere. But I don’t think that simply allowing patients or prospective patients to designate a health care agent, or surrogate decision-maker, without use of a mandated form and without witnesses would solve the problem.

The real problem is not just that people don’t bother with the forms and that the forms don’t always make it into the hands of clinicians. The real problem, as Charlie Sabatino of the American Bar Association put it in a 2010 article, is that advance directives are based on a transactional view of advance care planning rather than a communications model. And what we now understand is that advance care planning has to be founded on dialogue between a clinician and a patient.

The problem with advance directives is not that they have to be witnessed or written on special forms. If that were the problem, we’d expect to see much higher utilization rates in Idaho, where there are no witness or notary requirements, and somewhat higher utilization rates in Utah, where only one witness is required. The problem is that they reduce advance care planning to completing a form, to checking off boxes on a list.

In Idaho, for example, individuals have the opportunity to say that if they are ever unable to communicate and “have an incurable or irreversible injury, disease, illness or condition,” and that a medical doctor, based on a physical examination, has concluded that the “injury, disease, illness or condition is terminal” and that “death is imminent” no matter what is done, and that the “application of artificial life-sustaining procedures” would only artificially prolong life, then they would (or would not) want medical treatment and procedures, nutrition and hydration, hydration but not nutrition…

The lawyers who design such forms believe they allow people to indicate with great precision just what they would want and under what circumstances they would want it. But in fact, as many others have observed, it is far from clear what exactly it means for a condition to be terminal. I would argue that dementia is a terminal disease—but with a typical time course of about five years from the time of diagnosis to death. Ah, you might say, but the advance directive forms include the qualifier that death must be imminent. But that's not good enough. How imminent? In a matter of hours? Days? Months? And if we could agree, based on a careful reading of the text of the directive (which I’m not so sure we can), that what is meant is that the person has a disease that in the normal course of things results in death within six months and that the person's disease has progressed to the point where death will occur within at most days, then is it really so useful to specify that in those very limited circumstances we wouldn’t want treatment that won’t make a difference anyway? Is that all advance care planning is about—stopping futile treatment in the last 72 hours of life? And what about “treatment” that is symptomatic, that is intended to ameliorate suffering rather than to prolong life, though it might, as an unintended consequence, prolong life? Are such “medical treatment[s] and procedures” to be rejected?

Advance care planning, as we have come to understand it over the last several decades, is not about procedures or treatments—or checking boxes. It is about reflecting on what’s important, in the context of a realistic understanding of a person’s medical condition. It's about figuring out what medical treatments are most consistent with achieving whatever it is that the patient deems important in life. 

Making it easier to complete a form will not transform advance directives. Conceptually, advance directives are legal documents, whether or not they must be witnessed or notarized or completed on special paper. What people need is not a better document. It’s a different process, a process that is built on communication and that deals with the purpose of medical treatment, not the technical means of achieving those ends.