
January 01, 2021

Looking Forward

            Once the 1918-1919 influenza pandemic finally came to an end—after killing somewhere between 50 and 100 million people worldwide—Americans did their best to forget about it. Later tragedies such as AIDS and 9/11 figured prominently in much American fiction, but influenza was seemingly forgotten by American writers: Katherine Anne Porter’s short story, “Pale Horse, Pale Rider” and William Maxwell’s novella, “They Came Like Swallows,” are rare exceptions. Historians and journalists writing about the 1918 flu have hypothesized that the pain and suffering inflicted by the flu paled by comparison with that attributable to World War I, which came to an end at the same time, even though ten times more Americans died of the flu than died in combat. Or perhaps Americans were so optimistic about scientific medicine, which was just coming into its own in the twentieth century, that they chose to ignore medicine’s great failure, its inability to diagnose, treat, prevent, or cure influenza. Maybe Americans simply repressed this traumatic episode that killed people in the prime of life, leaving families without a means of support and children without a mother or father. Will the Covid-19 pandemic similarly be forgotten, or will it have a profound and enduring effect on us as individuals and on us as a society?

            The pundits are already speculating about the long-lasting effects of the pandemic on the real estate market and on the workplace, on professional conferences and the movie industry. But what I would like to address are the life lessons we should take away from this devastating and unexpected year. The first is that our lives are tenuous. We in the developed world have come to expect a long healthy life, especially if we are white and middle class. Life expectancy at birth in the US is just under 79 years; if you make it to age 65, you can expect to live another 20 years. Covid-19 showed us that we should not take those years for granted: 80 percent of the Covid deaths have been in people aged 65 or older, which means that 20 percent have been in people under 65. As of the end of December, 2020, 346,000 Americans had died from the disease, which translates to 69,000 younger people. There’s nothing like awareness of our own mortality to concentrate the mind and encourage us to live life well and to the fullest. This is the first lesson and the one we are perhaps most likely to forget.

            The second lesson is that what matters most to us as human beings is our relationships with other people. That’s what made “social distancing” so painful; it’s why eliminating family visits to nursing homes was so devastating; it’s why Zoom, FaceTime, and other video chat programs have been such lifesavers. We need to cultivate our friendships, to nourish them, to work to improve them. The pandemic made us believe that other people are the enemy, which runs counter to our essence as social creatures.

            The third lesson that I want to emphasize is of a different sort: it is that to make decisions about most anything important and certainly to make medical decisions, we need to understand something about risk. How to behave during the epidemic was all about how to evaluate risk, how to think about risk. Just because most people who don’t wear masks and who go to group gatherings won’t get sick doesn’t mean that these are safe activities. It means that you markedly increase the chance that you will contract the virus if you go around without a mask or attend a group meeting. And understanding risk is more complicated still: how much you increase your risk depends on how widespread the virus is in the surrounding community. If very few people in the vicinity of where you live are sick, then your likelihood of getting the disease is low, even if you fail to take precautions. But as the virus begins to circulate more widely, then precisely the same behavior pattern that was only slightly unsafe before will become far more dangerous. 

            Understanding risk is tricky because the epidemiological measures designed to protect individuals, whether wearing a mask, practicing social distancing, or getting vaccinated, are not perfectly effective. Some people who wear a mask will nonetheless contract the virus; ditto for people who stay six feet away from others. Individuals who received either the Pfizer or Moderna vaccine in the clinical trials were one-twentieth as likely to get sick as those who received a placebo. But that means that how safe you can feel if you are vaccinated (even if the effectiveness holds up in a much larger population than was tested in the trials) also depends on how widespread the virus is: while vaccination lowers your relative risk of getting sick, if the number of infectious people in the community suddenly increases, say by a factor of ten, your chance of getting the disease also goes up by a factor of ten, even if you've been vaccinated. Grasping the concept of risk is essential—not just to dealing with an infectious disease, but also to deciding whether to undergo screening for prostate cancer, whether to take medication for borderline high blood pressure, and whether to invest in the stock market.
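To make the arithmetic concrete, here is a minimal sketch in Python. The only number taken from the trials is the roughly one-twentieth (about 95 percent) relative risk reduction; the baseline risk and the prevalence multipliers are hypothetical, chosen purely for illustration.

```python
# A minimal sketch of relative vs. absolute risk. Only the ~95 percent
# (one-twentieth) relative risk reduction comes from the vaccine trials;
# the baseline risk and the prevalence multipliers below are hypothetical.

def infection_risk(baseline_risk, prevalence_factor, vaccinated):
    """Crude model: risk scales with how widespread the virus is,
    and vaccination cuts whatever that risk is to one-twentieth."""
    risk = baseline_risk * prevalence_factor
    return risk / 20 if vaccinated else risk

baseline = 0.02  # hypothetical 2 percent chance at current prevalence

for factor in (1, 10):
    unvax = infection_risk(baseline, factor, vaccinated=False)
    vax = infection_risk(baseline, factor, vaccinated=True)
    print(f"prevalence x{factor}: unvaccinated {unvax:.2%}, vaccinated {vax:.2%}")

# prevalence x1: unvaccinated 2.00%, vaccinated 0.10%
# prevalence x10: unvaccinated 20.00%, vaccinated 1.00%
```

The point of the sketch is simply that a relative risk reduction is multiplied against whatever the background risk happens to be; it does not fix your absolute risk.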

            Americans, along with people across most of the globe, have lost much from our encounter with the coronavirus. We have also gained something: an appreciation for life’s fragility, a recognition of the importance of relationships, and a deeper understanding of risk. It is up to us to remember both those we have lost and what we have learned.

April 01, 2020

Venting About Ventilators

Yesterday, the New York Times published a short article I wrote about what family caregivers can do to try to keep vulnerable older family members safe during the coronavirus epidemic. We as individuals and as a society should do our utmost to keep everyone healthy; my article suggests a few strategies to help those older people who live in the community but need help with personal care or other basic daily functions. 

In many cases, our strategies will succeed, but we have to be realistic and think about the possibility that, despite our best efforts, some older adults—those in their 70s, 80s, or 90s—will get sick. A minority will get so sick that physicians will propose transferring them to the intensive care unit (ICU); most of those brought to the intensive care unit will be breathing so poorly that doctors will advise a ventilator, or breathing machine. 

The popular press makes it sound as though older patients infected with Covid-19 will live if they receive ICU treatment in general and a ventilator in particular, and will die without this form of treatment. The reality may be quite different. A report of the experience of nine Seattle-area hospitals just published in the New England Journal of Medicine sheds some light on the question.

The authors report on 24 patients with Covid-19 who were sick enough to be admitted to the ICU. Five of them were over age 80 and five were between 70 and 80. This is a very small sample, but the paper is one of the few published reports that included detailed information about each patient. The outcomes were sobering.

In this group of 10 very sick older Covid-19 patients, 8 died, for a mortality rate of 80 percent. By comparison, among the 14 very sick Covid-19 patients under age 70, 5 died, or 36 percent. A subset of the 24 extremely ill patients received mechanical ventilation—a tube was inserted into their lungs that was connected to a machine that breathed for them. Among the 7 patients over 70 who were both in the ICU and intubated, 6 died, or 86 percent, compared to 4 out of the 11 intubated patients under age 70 (36 percent). The sole case of an older patient with Covid-19 who was intubated and lived was notable for the complete absence of underlying chronic conditions (comorbid conditions, as defined by the study, include asthma, chronic obstructive pulmonary disease, obstructive sleep apnea, infection with human immunodeficiency virus, immunosuppression, diabetes mellitus, chronic kidney disease, and ischemic or hemorrhagic stroke).
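For readers who want to check the arithmetic, the percentages above follow directly from the counts quoted in this post, as this small Python sketch shows (the group labels are mine; the counts are those reported above):

```python
# Mortality rates computed from the counts quoted above
# (the Seattle-area ICU report, as summarized in this post).
outcomes = [
    ("ICU, age 70+", 8, 10),        # (group, deaths, total patients)
    ("ICU, under 70", 5, 14),
    ("Intubated, age 70+", 6, 7),
    ("Intubated, under 70", 4, 11),
]

for group, deaths, total in outcomes:
    print(f"{group}: {deaths}/{total} = {deaths / total:.0%} mortality")

# ICU, age 70+: 8/10 = 80% mortality
# ICU, under 70: 5/14 = 36% mortality
# Intubated, age 70+: 6/7 = 86% mortality
# Intubated, under 70: 4/11 = 36% mortality
```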

An earlier study from China found that among 52 patients admitted to the ICU with Covid-19, the survival rate for people over 70 was 10 percent compared to 45 percent among those under 70.  

Data from the National Health Service in England reporting on the British experience through March 27 found that of 157 patients admitted to an intensive care unit with Covid-19, 73 percent of those aged 70 or older died compared to 35 percent among those under 70. 

In summary, in these three reports, survival rates were low for older patients admitted to the ICU, particularly for anyone who was put on a ventilator. That doesn’t mean survival never happens. But it strongly suggests that if you are over 70 and if, despite all the best efforts at prevention, you do get the virus, and if you are one of the minority who become extremely ill with the infection, the outlook is poor.

Many people, though by no means all, if they know the end is likely to be near, do not want aggressive medical treatment that offers little or no benefit. This goes for people with advanced cancer, severe heart disease, or any of a variety of other conditions that are usually fatal. They’d rather receive medications such as morphine to ease their shortness of breath and medications such as lorazepam to ease their anxiety than undergo extremely uncomfortable treatment that has only a small chance of prolonging their lives. Severe Covid-19 is another condition for the oldest Americans to consider adding to the list.

We all hope we won’t get the virus and that if we do get it, we’ll have a mild case. We hope that if we have a more serious case, we won’t be sick enough for doctors to propose transferring us to the ICU and using a ventilator. But if you are over 70 and you become severely ill with Covid-19, you will be facing a situation that may be as dire as advanced cancer. To be sure, if you survive the coronavirus infection, you might have a good quality of life (though this, too, is uncertain as we know little about life-after-the-virus for those who have been in an ICU) and you might live for some time. If you benefit from treatment of advanced cancer, on the other hand, the benefit may be short-lived. But in both cases, you have a choice. You can decide that you want any and all treatments, however burdensome and however likely or unlikely they are to improve your condition. Or you can opt for a more palliative approach. You don’t have to accept treatment that you regard as excessively burdensome. You don’t have to spend what might be—but might not be—your last days in an ICU with a machine breathing for you, unable to eat or speak. You can choose instead to be treated with intravenous fluids, oxygen, assorted medications, and other forms of supportive care but to decline admission to an ICU and intubation. Your general state of health (before coming down with a coronavirus infection) and your personal preferences should guide your decision.

Most people with Covid-19 infections do not become so desperately ill that they are admitted to an ICU and intubated. Specifying in advance whether you would want this kind of treatment by signing a simple advance directive and discussing your wishes with your health care proxy is a type of insurance policy. Like flood insurance and fire insurance, you hope you will never need to make use of it. But it’s good to have it, just in case.

August 19, 2019

Loss

There will be no more email messages with requests for data about the median age of legislators in Western European countries. No emails with provocative observations about the association between the rise in the overall suicide rate and the growing legalization of physician-assisted suicide. They all began with “Dear Muriel,” never with “Hi Muriel,” never without a salutation. He is gone, I learned in July. Died just before his 88th birthday. It was the emphysema that got him in the end.

I realized last night that I haven’t written a single blog post since then. For some time, I’ve been finding it difficult to identify new findings in medicine worth writing about, that is, sufficiently interesting to me to write about. Vitamin D is the panacea for aging; Vitamin D is out—useless, or worse. Calcium prevents osteoporosis; calcium doesn’t prevent osteoporosis. At long last, Congress passes legislation supporting caregivers; the new legislation won’t achieve much of anything. There’s a new, promising test to identify pre-clinical Alzheimer’s disease; there’s no point taking the test unless you want to enter a research study—or if you want to make yourself miserable sooner than necessary. None of this seemed to matter enough for me to write something about it. And now it matters even less because there is a hole in the fabric of the universe.

For years, I’ve been writing about meaning in old age: the importance of figuring out ways to remain engaged with the world despite age-associated limitations, despite encroaching frailty. I’ve consistently seen the role of geriatrics as facilitating a good old age, where 'good' implies a time to cultivate relationships and to contribute to the net goodness in the world. Geriatrics is the means to an end, not the end itself. Sure, preventing or reversing frailty would be nice, but what is even more critical is adapting to whatever life has in store for us, and that's the domain where geriatrics, like palliative care, can make contributions. I’ve also written about accepting mortality, which applies both to people who themselves are facing the end of life and to those who care about them. But I don’t think I’ve said anything about how to cope with loss after death.

I suppose that as a physician, I’ve seen my role as ending when life ends. Dealing with what comes next, whether for the person who has died or for everyone else, that’s someone else’s domain. That’s for religion or psychology or social work. But now I’m facing a hole. Yes, I recognize that life is finite. Yes, birth and death, growth and loss are all natural, normal. But we humans, we are meaning-makers. We need to make meaning out of life even where there isn’t any. We do this with rituals, with ceremonies, with reminiscences. We immortalize the mortal through our memories. So, I will do what I usually do when confronted with something that I see as important: I will write. My writing will not patch the hole, but perhaps it will serve as a sort of ornamental curtain.

Rest in peace, my friend.


Daniel Callahan, pre-eminent bioethicist and a true mensch, died on July 16, 2019.

March 17, 2019

What Does Dying Have to Do With It?

What Katy Butler gets spectacularly right in her new book, The Art of Dying Well, is that if we want life's last chapter to be a good one, there’s a great deal more to talk about than death and dying. She understands, which so many writers about aging do not, that maintaining function—the ability to walk, to see, to hear, and a host of other verbs describing the actions that are critical for a fulfilling life—is of paramount importance in this phase of life. She understands that medical tests, procedures, and treatments often do more harm than good, and that this danger becomes greater as the number of underlying medical problems grows, which happens more and more often with advancing age. So why, then, does she call her book the “art of dying well”?

At first, I speculated that the title had been chosen by the publisher’s marketing division, as often happens, chosen perhaps because books about dying are in vogue, or at least more so than are books about frailty or chronic disease. Then I wondered whether the problem was merely semantic—after all, the formative experience that awakened Butler to the issue of “dying well” was that of her father, which she poignantly describes in her previous book, “Knocking on Heaven’s Door.” Her father had a stroke, only to spend the next seven or so years declining, his life prolonged by medical technology such as a pacemaker. From his daughter’s point of view, that entire period of decline could be viewed as “dying,” even though it was measured in years, not days or months. But Butler says that her goal in her new book is to provide readers with “a step-by-step guide to remaining as healthy and happy as possible, and as medically and unafraid, through the predictable health stages of late life, from vigorous old age to final breath.” Although I would argue with the implication that everyone goes through “predictable health stages”—some people plunge headlong into frailty, for example, whereas others move towards it gradually and others go directly from being robust to dying with virtually no time between the two—she does acknowledge that there’s more to old age than dying. In the very next breath, however, she says that “the goal of each chapter is to help you thrive and keep you on a path to a good end of life.” In other words, a major part of the point is to act today to assure a good death tomorrow. I would emphasize optimizing each day, rather than assuming that the purpose of your behavior today is to prevent a bad death.
The same phenomenon of grasping what old age is all about but not quite getting it is evident in Butler’s misconception about the “goals of care.” In Chapter 4, “Awareness of Mortality,” she asserts that discussing the goals of care is “medical shorthand for exploring what matters most to you [yes!], and how medicine can help you accomplish it [yes!], when time is short and cure is not in the cards [no!].” I think that patients and their physicians need to clarify the goals of care at every stage of life, not just when the end is near. It’s true that most people who are vigorous and are not afflicted with a fatal illness will choose life-prolongation as their main goal. But it is also true that many people who suffer from multiple chronic conditions but who do not have a terminal diagnosis and who can anticipate another ten years of life may choose as their principal goal of care “maximizing function.” Butler is right that for some physicians, discussing the goals of care is a euphemism for moving from treatment that seeks to cure to treatment that seeks to comfort, but goals of care discussions ought to be far more than that.
Then there's Butler's curious discussion of why you should cultivate a network of friendships in old age and find ways to remain engaged with life. Both are decidedly beneficial, as Butler asserts, but not just because they will prove useful “later on.” Relationships and engagement are ways to find meaning in life after the children have grown up and moved away and after retirement. This is yet another instance of the author seeming to understand what’s important as people age but then backsliding into thinking it’s important only as a means to assuring a good death. Befriending your neighbor can be rewarding in and of itself, not just so she will buy groceries for you when you are too ill to do so yourself.
Butler does an admirable job of conveying some of the main insights of geriatrics and palliative care. She understands, for instance, that the hospital is often a perilous environment for an older person, leading to loss of some of the functions most critical to remaining independent. She recognizes that physicians often focus on the benefits of medical technology, whether an implanted cardiac defibrillator (ICD) or an artificial heart valve, and fail to consider their risks. She rightly identifies home care programs, advance care planning, and enrollment in hospice as potentially life-enhancing strategies. But then she makes statements about medicine that are at best misleading and at worst simply wrong. For example, she says that “Benadryl and the sleeping pills are…anticholinergics, an insidious group of commonly prescribed drugs that befuddle thinking and substantially increase the likelihood of developing dementia.” Yes, anticholinergics can result in delirium, a form of acute, reversible confusion. But dementia? 
Butler goes on, a few pages later, to report on a “landmark study” that found that people who used anticholinergics heavily were 50 percent more likely to develop dementia than those who took few such drugs. What she doesn’t say is that it’s very misleading to cite relative risk rather than absolute risk: going from a risk of 1 in 100 to a risk of 1.5 in 100 constitutes a 50 percent increase in risk, but the outcome in question remains very rare. She doesn’t say that this study lumped many different medications with anticholinergic activity together, including a variety of drugs that are no longer in widespread use, such as the tricyclic antidepressants. She also does not mention that drugs that block acid production (the proton pump inhibitors such as Prilosec and Prevacid) have also been associated, statistically, with developing dementia, and so have anti-anxiety agents. Is it really the case that all these drugs “cause” dementia? Or might it be that people who take certain kinds of drugs—perhaps because they are already exhibiting the earliest signs of dementia—are more likely to go on and develop the full-blown disease? Before we jump to conclusions, observational studies of the kind Butler cites (as opposed to a randomized controlled trial) need to be replicated or, ideally, followed up with a study in which some people are given anticholinergics for a given condition and others, chosen at random, are given something else.
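A minimal sketch of that relative-versus-absolute distinction, using the illustrative 1-in-100 baseline from the sentence above (not a figure taken from the study itself):

```python
# Relative vs. absolute risk, using the illustrative 1-in-100 baseline
# mentioned above (not a number taken from the dementia study itself).
baseline_risk = 1 / 100      # hypothetical baseline risk
relative_increase = 0.50     # "50 percent more likely"

exposed_risk = baseline_risk * (1 + relative_increase)

print(f"baseline risk:     {baseline_risk:.1%}")                  # 1.0%
print(f"risk if exposed:   {exposed_risk:.1%}")                   # 1.5%
print(f"relative increase: {relative_increase:.0%}")              # 50%
print(f"absolute increase: {exposed_risk - baseline_risk:.1%}")   # 0.5%
```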
To be fair, the author of the “landmark study” has gone on to carry out many other observational studies. Her most recent report on this subject, which appeared in the British Medical Journal last year, partially confirmed her earlier findings. But expert analysis of this paper is mixed. At best, it is reasonable to conclude that anticholinergic medications might be a risk factor for dementia.
The Art of Dying Well has much to offer. I only wish it had been more scrupulously reviewed by geriatricians before publication.

July 08, 2018

Where We Die

Honoring patient preferences, which is shorthand for providing an approach to medical treatment consistent with what patients say they want, has become a fundamental tenet of American medicine. And one preference that the vast majority of Americans share, according to multiple studies, is the wish to die at home. 

Whether dying at home is actually what patients want when they are faced with impending death, rather than an abstract preference expressed when they are healthy, is another matter—and I’ve previously argued that as hospital-based palliative care improves and home-based palliative care places an ever-growing burden on families, hospitals have become more attractive as a site for dying. But what patients tend to agree on is that they don’t want to suffer as they are dying.

To the extent that hospital care entails interventions such as ventilators or ICU treatment or chemotherapy, patients are reluctant to subject themselves to this type of care, especially if its likelihood of prolonging life is remote. To the extent that fewer hospital deaths and more home deaths serve as a marker of less suffering at the end of life, change in this direction is very desirable. A new study by Teno and colleagues shows that we are continuing to make progress.

A few years ago, Teno et al performed a similar study comparing the experience of patients in 2000 to the comparable experience in 2009. What they found then was a marked decrease in the percentage of elderly Medicare fee-for-service decedents (health-policy-speak for people over age 65 with conventional Medicare who died) who expired in the hospital (32.6 percent vs 24.6 percent). Over the same period, however, they found ICU use increased among decedents in the last month of life (from 24.3 percent to 29.2 percent), as did the percentage of dying patients who underwent a transition of care (nursing home to hospital, for example) in the last 3 days of life (10.3 percent to 14.2 percent). The current study updates these findings by extending the period of analysis to 2015 and by adding data from older patients enrolled in Medicare Advantage programs, who now account for 30 percent of the Medicare population.

What they discovered this time was that the proportion of hospital deaths among the fee-for-service group has continued to fall, going from 32.6 percent in 2000 to 24.6 percent in 2009 to 19.8 percent in 2015. ICU use in the last 30 days of life, which had risen between 2000 and 2009, remained stable at the 2009 level in 2015. Transitions to another site of care in the last 3 days of life, which had also risen between 2000 and 2009, went back down in 2015 to the same level as in 2000. And the chance of being enrolled in hospice at the time of death rose from 21.6 percent in 2000 to 50.4 percent in 2015. When the investigators looked at a sample of Medicare Advantage patients, they found these individuals had the same experience in 2015 as their fee-for-service counterparts.

What does all this mean? I suspect what it means is that when we know with a high degree of certainty that someone is going to die in the very near future, we tend to focus on comfort care. If physicians, patients, and families recognize that death is imminent, hospitalization is relatively unlikely, as are ICU care and transfers from home or nursing home to another site of care. However, physicians often cannot be so certain that death is likely to occur in the next few weeks or months. As long as the usual strategy is to pursue maximally aggressive care until death is virtually sure to occur in the immediate future, and then to abruptly transition to care focused exclusively on comfort, the picture we see today is likely to continue.

There is another approach. That approach involves opting for a goal that is neither comfort only nor life-prolongation at any cost. Instead, maximization of function is paramount; treatment aimed at prolonging life is also acceptable, provided it will not affect quality of life in a major way. So, too, is comfort a goal, but only to the extent that it does not conflict with maintaining function. For people who are frail, extremely old, or both, this alternative strategy translates into fewer hospitalizations, fewer ICU stays, and fewer transitions of care in the final stage of life, whether that period is measured in weeks, months, or even years.

Medical treatment does not have to be all or none; there is something in between. It’s quite possible that many people would opt for this type of care—if only they knew it existed.

August 27, 2017

A Shot of Irish Whiskey

I recently stumbled across The Way We Die Now in the new books section of my local library. I hadn’t heard of it or its author, the Irish gastroenterologist Seamus O’Mahony, and I couldn’t find any reviews in American publications. Intrigued, I checked it out. It’s one of the more insightful—and simultaneously annoying—entries in the long list of books about death and dying.
The author makes several observations that are worth thinking about. First, he says that dying is inherently messy and distressing; our attempts to sanitize it with what he calls the “syringe-driver” (Britishese for a “pump,” a way to deliver opioids such as morphine intravenously or subcutaneously or even directly into the central nervous system in a continuous, steady fashion) or with physician-assisted suicide are vain attempts at controlling the uncontrollable. Second, and on a related note, he mocks the insistence by some that dying should be an occasion for “personal growth;” there’s nothing uplifting about dying and it is seldom an opportunity for repairing longstanding personal rifts. Third, he derides all self-proclaimed “death experts,” by which he principally means palliative care physicians, although he regards proponents of “narrative medicine” as similarly tainted. While acknowledging some of the contributions of palliative medicine, such as better pain control and the development of inpatient hospices, he feels strongly that the medical care of the dying should remain in the hands of primary care physicians.
Death is messy: O’Mahony discusses the writings of several public intellectuals, Philippe Aries, Ernest Becker, and Ivan Illich, on this subject. Aries, writer of the monumental history, Western Attitudes Towards Death, O’Mahony describes as a “romantic reactionary who looked back to an idealized, pre-industrial past” because he yearns for a peaceful death, at home, surrounded by family. Becker, author of The Denial of Death, argues that fear of death is the essence of being human and that each individual must sublimate his or her “fear of extinction with heroic projects designed to transcend death.” And Illich, iconoclastic writer of Medical Nemesis, laments the medicalization of death, exhorting us to “learn to cope” with the external constraints on the human condition.
I also was deeply influenced by these three writers: Aries because the historical perspective helps us see that the way things are today isn’t necessarily the way things have to be; Becker because I agree that the idea of an “immortality project” is tremendously useful for those of us who are aware of our mortality and don’t believe in an afterlife; and Illich because he opened my eyes to the notion that our world is excessively medicalized. Thanks to Aries, I have sections on history in many of my books—including my forthcoming Old and Sick in America, where I use an historical perspective to demonstrate the power of the Medicare program to shape the experience of illness. In homage to Becker, I called my last book, The Denial of Aging. And one of my earliest articles, a critique of the modern nursing home, decries its medicalization. I share with O’Mahony the view that death today is over-medicalized, that we shouldn’t expect to control the exact time and course of our dying, and that the clergy (including secularly oriented chaplains) and social workers are as necessary as medical doctors.
Death isn’t an occasion for personal growth: It’s not just dying that isn’t an opportunity for personal growth. The whole idea of “personal growth” rubs me the wrong way, at least as embodied in the work of Abraham Maslow, with his claims about self-actualization, or the “desire to become more and more what one is, to become everything that one is capable of becoming.” I’ve tended towards the view that each person should make the most of his or her own talents and abilities, and should seek satisfaction by applying those talents and abilities to improve the world. But whether or not self-actualization is ever a desirable goal, surely it is too much to ask of someone who is dying that he or she continue to “grow.”
We shouldn’t leave death experts in charge: Palliative care increasingly sees itself as the specialty that is uniquely able to communicate, to break bad news, to help patients fill out advance directives, and to control pain and other symptoms. I am not as cynical as O’Mahony about the importance of communication and the possibility of physicians learning to be better at it—he drips with disdain as he asserts that “one of the more pernicious myths of modern medicine is the notion that a doctor with ‘communication skills’ and a sympathetic manner can somehow magically transmute bad news into something palatable…” Yet at the same time, he acknowledges that the “Liverpool Care Pathway,” an algorithm for caring for dying patients in British hospitals, was done away with, despite its many successes, because of lapses in communication: “poor communication was at the root of virtually all complaints about the LCP.” And I think O’Mahony is mistaken when he ridicules the idea that breaking bad news is a special skill—the issue isn’t that, when done well, patients accept the news with good grace; rather, it’s that, when done badly, patients feel abandoned, frightened, and angry. Finally, I share O’Mahony’s concerns about instructional advance directives—documents that purport to dictate exactly what medical procedure will be done in particular circumstances. He writes that “advance directives perpetuate an illusion that we can control, in minute detail, our treatment of an unpredictable illness at some unknown time in the future.” But to confound advance care planning, which can focus far more broadly on goals rather than on the specifics of treatment, with advance directives, which are either uselessly vague or excessively specific, is profoundly misguided.
That said, I agree that palliative care principles—a view of the end of life as necessitating far more than just medical care, a belief that patients and families need information and guidance to be provided by a kind and compassionate professional, and a recognition that physical symptoms can often be ameliorated if not ablated—should be an essential part of what all doctors do.
So ignore the misunderstandings about the American health care system—O’Mahony says that nearly half of Americans die in hospice care because insurers discovered that “it saves money” and is seemingly unaware that this is predominantly home hospice, not institutional hospice, and that the work from Mass General showing that early palliative care prolongs life in advanced cancer said nothing about cost. He likewise thinks that “in the US” death with dignity “has become a euphemism for euthanasia” (it hasn’t). Try not to mind the ridicule he heaps on both dignity therapy and narrative medicine, which he accuses of advocating that physicians take on a “quasi-sacerdotal role.” Appreciate instead the nuggets of truth: a physician’s job is the treatment of illness (not spiritual malaise or existential angst); the syringe-driver (or morphine pump) “allows for a softer, less frightening, final agony;” “palliative care should be at the center of what all doctors do;” and physicians should treat patients with kindness, courtesy, and yes, dignity.

August 30, 2016

The Real Advance Planning

Michael Kinsley’s Old Age: A Beginner’s Guide isn’t exactly a guidebook to “life’s last chapter,” as the author promises. The book does talk quite a bit about Parkinson’s disease, which has been Kinsley’s diagnosis for the last 23 years, even though Kinsley assures us that it isn’t really about Parkinson’s disease. And his comments about going through “deep brain stimulation,” a surgical technique that can be very helpful to people with Parkinson’s, as well as his discussion of accepting limitations—giving up driving, realizing you’re not going to be promoted—are illuminating. His suggestion that the baby boomers redeem themselves for posterity by erasing the national debt is whacky. But he does deal with something tremendously important, and that is coming up with an immortality project.

I first learned about immortality projects when I read Ernest Becker’s The Denial of Death, which was published in 1973. It had such a profound effect on me that I called my book about aging The Denial of Aging in homage to his. Becker’s point, at least as I remember it, was that it is the awareness of our mortality, more than anything else, that distinguishes us from other mammals. 

Now I don’t know if it’s really true that apes are totally oblivious to the prospect of death. But regardless of whether we are unique in this respect, I do think it’s fair to say that our recognition of our finitude profoundly shapes our existence. Some moral philosophers have even suggested that the prospect of further life extension is bad for us as it would induce a kind of ethical laziness—we would keep on putting off doing good because we figured we’d have plenty of time later. That may be a bit of an exaggeration, but I think there’s truth to the claim that mortality is a great motivator. I don’t think it’s necessary to invoke heaven and hell, some kind of post-mortem day of judgment, to induce people to lead a good life. It’s sufficient to realize that our time on earth is limited: if we want to make something of our lives, we better go ahead and do so. And built into the fabric of our being is a desire to live on after our death, to be remembered, and in that way, to triumph over our mortality. Which is where immortality projects come in.

What Kinsley’s book is about is finding an immortality project. He recommends that the baby boomers undertake a joint project with all the other baby boomers (eradicating the debt), which is more daunting and, in my view, less likely to succeed than embarking on an individual project. But Kinsley’s point is that being diagnosed with a chronic, progressive (and I would add, ultimately fatal) disease brought home to him the recognition that he had better get started. It made him think about what was really important to him—was it material possessions? Was it fame? Or was it something more durable?

What Kinsley is telling us is that we need to get cracking. We better define our immortality project, our legacy, and start working on it. For Kinsley, it was the diagnosis of a serious disease that helped him figure out that he ought to have such a project. But for most people, that’s a little late. The real message of his book is not to wait. Don’t wait until you already know what disease is going to kill you. Don’t wait until you have dementia or widely metastatic cancer or advanced heart disease. We’re human: we are mortal and we know it. We should all be working on our legacy for much of our lives, where “legacy” may simply mean being the best person we possibly can be.

July 05, 2015

Shocking News

Much has been written lately about over-treatment of older patients. Only rarely does anyone suggest that older patients are getting too little treatment, but a new study in JAMA does just that. The reality isn't quite so clear.

The treatment is the implantable cardioverter defibrillator (ICD) and the patients are people over the age of 65 who have had a heart attack and are found afterwards to have a weak heart (defined as an ejection fraction less than or equal to 35%). These patients are at risk of sudden death from an irregular heart rhythm such as ventricular tachycardia, and the ICD is designed to deliver an electric shock if that happens, effectively bringing the patients back from death. By looking at the National Cardiovascular Data Registry, which keeps track of heart attack patients, the authors of the article found that only 8.1% of “eligible” patients actually received an ICD. As a result, they claim, the 92% of patients who didn’t get an ICD were more likely to die than their counterparts who did.

This is a surprising finding in light of the persuasive and cogent argument made by Sharon Kaufman in her recent book, Ordinary Medicine: Extraordinary Treatments, Longer Lives, and Where To Draw The Line. Kaufman makes the case that many high tech treatments come to be seen by physicians and patients as normal and necessary once Medicare agrees to pay for them. The end result for many marginally beneficial, burdensome, and expensive treatments, including the ICD, is that patients just can’t say no. If that's true, why are so few older people getting an ICD? 

Now it wouldn't be the first time that ageism or misinformation prevented older people from getting beneficial treatment. Many years ago, patients who were over a certain age were precluded from receiving clot-busting drugs (thrombolytic therapy) because it was widely assumed that in older age groups, the risks outweighed the benefits. It turned out that clot-busting drugs were actually more beneficial in older patients, basically because their heart disease tended to be more severe, which meant they stood to gain a great deal from treatment. Elevated systolic blood pressure was likewise once assumed to be normal in the geriatric population, or even desirable in order to improve blood flow to the brain. Studies eventually showed that elevated systolic blood pressure, even in older patients, predisposed them to stroke and other unfortunate outcomes, and warranted treatment—though the recommendations about just how much blood pressure should be lowered have evolved over time. Is the ICD implantation rate just another case of bias or ignorance at work?

Dr. Robert Hauser of the Minnesota Heart Institute, writing in an editorial published alongside this article, blames our fragmented health care system. He speculates that primary care physicians may not realize that their patients were supposed to get an ICD. The fact that there's supposed to be a 40-day waiting period between the onset of the heart attack and implantation of the ICD contributes to the problem. Hauser suggests that the primary care physician is so frazzled and overburdened that he is apt to neglect to send his patient to a cardiologist. Is this the explanation?



It can’t be the whole story. While patients who saw a cardiologist after hospital discharge were more likely to wind up with an ICD than patients who didn’t, only 30% of the patients who saw a cardiologist had an ICD implanted. Recall that 100% of them were, technically speaking, “candidates” for an ICD. So what else is going on?

Hauser hints at another explanation: “It is possible that some older patients may refuse ICD treatment for personal reasons or because comorbidities such as endstage kidney disease or advanced frailty were considered in the decision regarding ICD implantation.” He doesn't accept this explanation as sufficient, rightly recognizing that patients are very likely to accept whatever technological intervention their physician recommends and that shared decision-making, if it takes place at all, is apt to reflect the physician’s preferences as well as the patient’s. So the problem, if it is a problem, must lie with doctors, too. Physicians are not systematically and emphatically recommending ICD implantation to their older patients. Even the most technologically sophisticated academic medical centers only implanted ICDs in 16% of their eligible older patients. But is this a problem that needs fixing, like under-treatment of heart attacks with clot busters and inadequate treatment of high blood pressure in the past?

Dr. Hauser believes it is, saying “even though the use of ICD for primary prevention may not seem to make as much sense for an 80 year old patient as it does for a patient in his 50s or 60s, an older patient at risk for sudden cardiac death should have the same opportunity to choose potentially lifesaving therapy.” But the benefits of ICD in those over 80 are far from clear. The studies include very few people in this age group. What data there is indicates that there is little if any survival benefit. Moreover, ICDs implanted in older people fire erroneously half the time. That means they deliver a very unpleasant electric shock to the hapless patient. In addition, if the ICD does work as intended, what that means is the abolition of sudden death.

Maybe, just maybe, the low rate of ICD implantation in older people is a refreshing instance of massive civil disobedience—of both patients and doctors refusing to abide by prevailing clinical guidelines. We all have to die of something. An ICD virtually guarantees that the something will involve a protracted period of decline and suffering. If you had to choose between cancer, Alzheimer’s disease, and sudden death, which would you pick?

March 01, 2015

The Age-Cost Connection

It’s been well known for a long time that the amount Medicare spends on patient care every year increases with age. That’s not entirely surprising—after all, 80-year-olds are in general less healthy than 70-year-olds, so they need and receive more medical care. But is there an age when per capita spending stops going up, or even falls? If you think there is, guess at what age that happens. Why? What would cause spending to level off? A recent article in Health Affairs gives some of the answers, at least about the facts.

Examining Medicare data from 2000 to 2011 for fee-for-service beneficiaries, the authors confirm that as recently as 2011, Medicare per capita spending rose with age, peaking at age 96 and then gradually declining. Spending for 96-year-olds averaged $15,145, compared to less than half that, or $7,566, for 70-year-olds. What’s really fascinating is that in 2000, the age at which Medicare per capita spending peaked was 92, and that peak age has been rising steadily ever since.

Before we can speculate about why, we need to understand what the money is being spent on. The study answers this question as well. For nonagenarians, much of the spending goes to skilled nursing facilities (that doesn’t mean long stay nursing homes, which aren’t covered by Medicare, but rather short term, post-acute or rehabilitative care). This finding doesn’t imply that hospital spending goes down; on the contrary, spending on inpatient hospital services remains a relatively constant share of per capita spending until patients reach their late 90s.

Translation: older people use a lot of medical services. They use more and more until they are close to 100, and that includes hospital care, along with hospice and skilled nursing facility care. Evidently our view of what constitutes reasonable treatment has been shifting over time—we used to think that it was all right to treat octogenarians aggressively, but we drew the line at nonagenarians. Now we’re treating nonagenarians aggressively and drawing the line at centenarians.

There is one bit of promising news, one hint that at least some patients and doctors are thinking twice about subjecting the oldest and frailest to all the technology we can muster. If we look at per capita spending in the year people die, we find that Medicare spent $43,000 on 70-year-olds but only $20,000 on centenarians. The difference was due almost entirely to a disparity in hospital use. Apparently, it’s easier to recognize, or perhaps to accept, that a 100-year-old is dying, and to tailor treatment accordingly, than to accept that a 70- or even a 90-year-old is dying. But once we do acknowledge the inevitable, we restrain our impulse to try to prolong life, whatever the cost both to individual dignity and to the nation’s pocketbook.


Maybe, just maybe, we will come to accept that there is a price to pay for invasive treatment even when death is not quite so imminent, and that a different kind of treatment may be more humane for those who are physically frail or demented, regardless of age.