June 25, 2017

The Worthy and the Unworthy

One of the most illuminating and insightful articles I ever read was written by the historian of medicine David Rosner and entitled “Health Care for the ‘Truly Needy’: Nineteenth Century Origins of the Concept.” I read it when it was first published and I’ve remembered it since—and that was 35 years ago. The nineteenth-century concept of the “worthy poor” or “deserving poor,” along with its Reaganesque reformulation, is sadly reflected in the Republican health care bill revealed today.

Rosner points out that at a time of relative ethnic homogeneity in pre-industrial, pre-Civil War America, the poor were often seen, in the light of Christian teaching, as individuals who would be rewarded with salvation. As an added bonus, the presence of poor people gave the wealthy an opportunity for charity, which would likewise be rewarded. But then, in the second half of the nineteenth century, millions of destitute immigrants arrived on American shores. At the same time, Americans suffered from tremendous economic dislocation related to urbanization. As a result, “a general consensus developed among the native-born equating poverty...sinfulness, and individual failure with foreign birth. Conversely, wealth, American nativity, and material success were equated with righteousness and moral behavior.”

Writing in 1891, Dr. John Shaw Billings of the US Army Surgeon General’s Office, remembered for introducing the collection and maintenance of “mortality and vital statistics” records, also accepted the notion of a meaningful distinction between the worthy and unworthy poor, saying that “there is a distinct class of people who are…almost necessarily idle, ignorant, intemperate, and more or less vicious, who are failures…and who for the most part belong to certain races,” by which he meant Catholics, Jews, Irish, Italians, and Eastern Europeans. He accepted the need to provide medical care for this group—but only to prevent the spread of infectious diseases to the remainder of the population.

And then we have Dr. Stephen Smith, another public health giant, who cautioned that medical charity can be “the inlet through which the habit of pauperism first creeps into the poor man’s house.” That is, helping people who are poor fosters dependency and is to be avoided. Remember Romney’s 47 percent? The people who are “dependent on the government” and who should simply “take personal responsibility” for their lives?

After discussing the way that concepts of the worthy and unworthy poor evolved in tandem with the growth of the hospital in the early part of the twentieth century, Rosner concludes by arguing that “although the language used today is significantly different from the angry, moralistic, and class biased rhetoric of the nineteenth-century debates, there is a similarity of meaning and analysis in arguments over definitions of the ‘truly needy,’ over the proper eligibility criteria for a variety of health programs like Medicaid and Medicare, and for the scope of other social service programs such as food stamps and welfare.” He was writing in 1982, but he could equally well be writing today, as we learn whom the Republican senators, or at least those who crafted the latest version of the health care bill, deem worthy. Full-time employees of well-heeled companies are worthy; older people are worthy, provided they don’t live in nursing homes. It’s unclear whether fetuses are worthy: health plans may be excluded from the insurance exchanges if they cover abortion, but health plans may also be permitted (through a waiver) to omit maternity care. Everyone else, the senators assume, could purchase health insurance—or better yet, not get sick—if only they had the necessary moral fortitude.

This isn’t how other democratic nations view health, medical care, or their citizens. They assume that everyone is “worthy” of basic medical care. They regard it as the responsibility of government to promote the health of its citizens, just as it is government’s responsibility to keep them safe and educated. Tell your senator that enshrining archaic concepts of worthiness into law by severely restricting access to medical treatment is not the way to keep America great.

June 18, 2017

The Other American Drug Problem

With all the attention paid to the opioid epidemic, another drug overuse problem has gone relatively unnoticed: the widespread use of antipsychotic medications in nursing home residents. A perspective article in JAMA this week focuses on this other drug problem—and on an intervention that the authors think might just have solved it.

Interestingly, antipsychotic medications were a problem in an earlier era, too. Then along came OBRA ’87, the Nursing Home Reform Act, mandating a variety of strategies to limit the use of drugs to sedate patients with dementia who had behavioral problems: nursing home residents were to be free of “chemical restraints”; staff were supposed to try non-pharmacologic approaches before resorting to drugs; and they were expected to taper the medication after several months. The regulations seemed to be effective: the percentage of nursing home residents receiving an antipsychotic fell from 34 percent pre-OBRA to 16 percent several years afterwards.

But after the atypical antipsychotics were introduced in the early 1990s, beginning with risperidone and then a variety of other agents such as quetiapine and olanzapine, the rate of use began climbing again. By 2011, it had reached 24 percent among nursing home residents. Today, however, it’s back down to 16 percent, its post-OBRA low.

In their article, Gurwitz et al. regard the turning point as the 2011 Office of Inspector General report, “Medicare Atypical Antipsychotic Drug Claims for Elderly Nursing Home Residents.” In response to this alarming report, the Centers for Medicare and Medicaid Services (CMS) developed a multi-pronged strategy to combat the problem. It launched its “National Partnership to Improve Dementia Care in Nursing Homes,” which combined public reporting, educational resources, and renewed regulatory enforcement. Gurwitz et al. credit this partnership with the subsequent fall in the use of antipsychotic medications.

But that’s not the whole story.

If we look at why the use of antipsychotic medications began to rise again in the 1990s, what we see is a massive push by Big Pharma to peddle these drugs to nursing homes, even though they are not FDA-approved for treating the symptoms of dementia. Not only have studies failed to demonstrate that antipsychotics (whether “typical” agents such as haloperidol or “atypicals” such as risperidone) work in dementia, but the FDA has also issued a black box warning indicating that they are associated with an increased risk of death in older patients with dementia. The drug companies were undeterred. They employed various strategies to achieve spectacular sales of atypical antipsychotics in nursing homes.

Janssen, a subsidiary of the mega-company Johnson & Johnson, went so far as to create what it called “ElderForce,” a special group of drug reps deployed to market the antipsychotic Risperdal (risperidone) to doctors in nursing homes. Now, it’s perfectly legal for doctors to prescribe an FDA-approved drug “off label,” that is, for some other, non-approved use. But it’s not legal to advertise drugs for non-FDA-approved indications. What Janssen did was pay its ElderForce reps a commission for every prescription the doctors wrote. J&J was not alone in promoting antipsychotics to nursing home physicians for use in their troublesome patients with dementia. Eli Lilly did the same for its atypical antipsychotic, Zyprexa (olanzapine). It was evidently a winning strategy: AstraZeneca followed suit with its drug, Seroquel, and, not to be left out, Bristol-Myers Squibb tried it with Abilify. Omnicare, the leading distributor of prescription drugs to nursing homes, got a piece of the action when it instructed its pharmacists to provide disinformation to nursing home doctors in return for kickbacks from Abbott, the maker of the drug Omnicare was pushing for the behavioral symptoms of Alzheimer’s disease: the anti-seizure medication Depakote (which, like the antipsychotics, is not approved for this indication).

Slowly and methodically, the Department of Justice reacted. And what followed was a dramatic series of investigations that ultimately resulted in penalties for the malefactors. Sometimes the payouts were probably too small to have much of an effect—the $520 million that AstraZeneca paid in 2010 to settle charges of illegally marketing Seroquel (quetiapine) in nursing homes could be viewed as just the cost of doing business. But even for Eli Lilly, the $1.4 billion it paid to settle civil and criminal charges relating to the marketing of Zyprexa (olanzapine) was substantial. And when Johnson & Johnson paid $2.2 billion in criminal and civil fines in 2013 to settle accusations that it improperly promoted Risperdal (risperidone) for use in nursing home residents, all the drug companies took notice.

So yes, I think CMS is onto something when it acknowledges that the problem of the overuse of antipsychotics in nursing homes is multifactorial, and it’s right to look to nursing home chains and physicians, as well as to educational tools and regulatory incentives, in its quest for reform. But let’s not forget that the drug companies are among the “stakeholders,” and that the legal system can be a powerful change agent.

June 11, 2017

Parachuting through Life

Last week I saw the play “Ripcord” at the Huntington Theater in Boston, a hilarious comedy by David Lindsay-Abaire and one of the rare plays that features life among the older set. Ignore the unrealistic depiction of assisted living (the playwright does not seem to distinguish between assisted living facilities and nursing homes) and the mischaracterization of who lives there (the play features two women who are entirely too vigorous to require assisted living, let alone the nursing-home-like facility of “Ripcord”). It’s nonetheless a vivid, if exaggerated, portrait of some of the poignant struggles of later life. “Ripcord” introduces us to two of the zaniest and most memorable elderly characters in recent theater, Abby and Marilyn, forced by circumstance to become roommates.

Both Abby and Marilyn, in their own very different ways, need to come to terms with troubled relationships. Marilyn was married to an alcoholic and perhaps abusive man; Abby’s only son is a drug addict from whom she has long been estranged. Both women find themselves in a new phase of life and have to adapt to straitened circumstances, a task that Marilyn performs with grace and Abby with vitriol. But redemption comes for both of them, as Marilyn’s ability to see the good in everyone, from the aide at the facility to her lugubrious roommate, finally rubs off on Abby, and Abby’s insistence on telling it like it is allows Marilyn to acknowledge and accept the flaws in her marriage.

In its eccentric and sometimes over-the-top fashion—the “ripcord” of the title refers to the cord the two older women must pull to open their parachutes while skydiving—this play brings to life one of the major insights of contemporary geriatrics: at least as important as pills and procedures for a good quality of life in old age is a robust social network. In the end, it is not fame or fortune that marks a life as having been worth living, but the relationships we forge with others.

June 4, 2017

One of the major milestones in biomedical ethics was the passage of legislation that today’s medical residents—and I daresay most Americans—have never heard of, the Patient Self-Determination Act of 1990. Certainly all those who inveighed against “death panels” and who balked at the idea that the Affordable Care Act might include provisions allowing Medicare to reimburse for advance care planning conversations never heard of it. This act, as its title indicates, was intended to put patients in the driver’s seat, to allow them to weigh in on the medical care they would receive at the end of life, even if they were unable to participate in decisions at the time those decisions needed to be made. It officially sanctioned advance directives by enshrining them in federal law.

What the PSDA says is that any health care institution receiving government funds, which is to say virtually all health care institutions, is obligated to ask patients whether they have an advance directive, to offer them the opportunity to complete one if they don’t, and to prominently display a copy of that directive in the medical record if they do. The PSDA put advance care planning on the map. It also put advance care planning squarely in the legal domain, and that, as the authors of a new article in the New England Journal of Medicine assert, was a big mistake.

What “Delegalizing Advance Directives—Facilitating Advance Care Planning” argues is that a major reason advance directives haven’t caught on is that they typically have to be signed by two witnesses (or a notary), and in some states (North Carolina, South Carolina, Virginia, and Missouri) a notary is required; some states also require use of a specified form. I agree that these requirements are an impediment to the widespread use of advance directives. I agree that the POLST (Physician Orders for Life-Sustaining Treatment) model, which uses a medical order rather than a legal document and requires only the signatures of the patient and the physician, puts advance care planning unambiguously within the medical sphere. But I don’t think that simply allowing patients or prospective patients to designate a health care agent, or surrogate decision-maker, without a mandated form and without witnesses would solve the problem.

The real problem is not just that people don’t bother with the forms and that the forms don’t always make it into the hands of clinicians. The real problem, as Charlie Sabatino of the American Bar Association put it in a 2010 article, is that advance directives are based on a transactional view of advance care planning rather than a communications model. And what we now understand is that advance care planning has to be founded on dialogue between a clinician and a patient.

The problem with advance directives is not that they have to be witnessed or written on special forms. If that were the problem, we’d expect to see much higher utilization rates in Idaho, where there are no witness or notary requirements, and somewhat higher rates in Utah, where only one witness is required. The problem is that they reduce advance care planning to completing a form, to checking off boxes on a list.

In Idaho, for example, individuals have the opportunity to say that if they are ever unable to communicate and “have an incurable or irreversible injury, disease, illness or condition,” and if a medical doctor, based on a physical examination, has concluded that the “injury, disease, illness or condition is terminal,” that “death is imminent” no matter what is done, and that the “application of artificial life-sustaining procedures” would only artificially prolong life, then they would (or would not) want medical treatment and procedures, nutrition and hydration, hydration but not nutrition…

The lawyers who design such forms believe they allow people to indicate with great precision just what they would want and under what circumstances they would want it. But in fact, as many others have observed, it is far from clear what exactly it means for a condition to be terminal. I would argue that dementia is a terminal disease—but with a typical time course of about five years from the time of diagnosis to death. Ah, you might say, but the advance directive forms include the qualifier that death must be imminent. But that’s not good enough. How imminent? In a matter of hours? Days? Months? And if we could agree, based on a careful reading of the text of the directive (which I’m not so sure we can), that what is meant is that the person has a disease that in the normal course of things results in death within six months, and that the person’s disease has progressed to the point where death will occur within at most days, then is it really so useful to specify that in those very limited circumstances we wouldn’t want treatment that won’t make a difference anyway? Is that all advance care planning is about—stopping futile treatment in the last 72 hours of life? And what about “treatment” that is symptomatic, that is intended to ameliorate suffering rather than to prolong life, though it might, as an unintended consequence, prolong life? Are such “medical treatment[s] and procedures” to be rejected?

Advance care planning, as we have come to understand it over the last several decades, is not about procedures or treatments—or checking boxes. It is about reflecting on what’s important, in the context of a realistic understanding of a person’s medical condition. It's about figuring out what medical treatments are most consistent with achieving whatever it is that the patient deems important in life. 

Making it easier to complete a form will not transform advance directives. Conceptually, advance directives are legal documents, whether or not they must be witnessed or notarized or completed on special paper. What people need is not a better document. It’s a different process, a process that is built on communication and that deals with the purposes of medical treatment, not the technical means of achieving those ends.
