September 14, 2021
This weekend, the New York Times uncovered new, seemingly damaging information about nursing homes, this time unrelated to their mishandling of the Covid-19 epidemic. “Phony Diagnoses Hide High Rates of Drugging at Nursing Homes” is a detailed investigative essay by three reporters revealing that nursing homes regularly under-report the frequency with which they prescribe sedating, antipsychotic medications for residents with dementia. Such medications, while useful for controlling paranoia and delusions (which may afflict people with dementia), have not been shown to be more generally helpful in controlling the behavioral symptoms of dementia. They are, however, associated with a two-fold increase in mortality and other adverse effects. As a result, nursing homes have been under pressure for years to limit their use of “chemical restraints,” medications that suppress agitation--including general sedatives and antipsychotics.
The campaign against the use of antipsychotics in dementia began with the Nursing Home Reform Act of 1987 (OBRA-87), legislation asserting that residents have the right to be free from physical and chemical restraints that are “not required to treat specific medical symptoms.” Then, nearly a decade ago, CMS announced a new approach, the National Partnership to Improve Dementia Care and Reduce Unnecessary Antipsychotic Use in Nursing Homes. This strategy provided training modules for nursing home staff on how to handle residents with dementia; it also included as a “quality indicator” the proportion of long-stay nursing home residents receiving an antipsychotic medication. In 2015, this quality indicator was added to the list of measures that comprise the overall rating of nursing homes that CMS publishes on its website, Nursing Home Compare.
The last 34 years have seen a marked decline in the use of antipsychotic medications in nursing home residents. The Nursing Home Reform Act of 1987 led to a 27 percent reduction in antipsychotic use. The introduction of psychoactive drug use as a quality indicator led to a further ten percent fall in prescribing. Today, CMS statistics assert that 15 percent of long-stay nursing home residents regularly receive antipsychotics for some problem other than Tourette’s syndrome, schizophrenia, or Huntington’s Chorea, for which antipsychotics are approved. The new report by the NY Times suggests that the correct figure is more like 21 percent, with the excess accounted for by bogus diagnoses of schizophrenia: the implication is that doctors want to control their demented patients by sedating them, but are discouraged from doing so by CMS regulations, so they get around the rules by falsely labeling their patients as schizophrenic.
The Times argues that “caring for dementia patients is time- and labor-intensive. Workers need to be trained to handle challenging behaviors like wandering and aggression. But many nursing homes are chronically understaffed and do not pay enough to retain employees, especially the nursing assistants who provide the bulk of residents’ daily care.” All true. The Times goes on to argue that nursing homes with poor staffing ratios (facilities that get a 1 or 2 star rating for the adequacy of their staff-to-patient ratios) dole out more antipsychotic medications than those with better staffing ratios (facilities with 4 or 5 star ratings for staffing).
While this graph strongly suggests an inverse relationship between staffing and antipsychotic use, what is equally striking is that all the facilities, regardless of staffing, administer antipsychotics to upwards of 15 percent of residents, without appropriate justification. What this suggests, contrary to the NYT implication, is that nursing homes find that no amount of training and no number of staff members can consistently and reliably treat all the behavioral symptoms of dementia.
The unfortunate reality is that we know very little about how to care for people with moderate to advanced dementia who exhibit problematic behaviors. The behaviors I am referring to are not merely inconvenient to staff, such as “wandering” off the nursing unit unattended. They include aggression, paranoia, and delusions. That means hitting or spitting; it means slugging or biting the well-intentioned caregiver who tries to bathe or feed a resident. These behaviors are disturbing to persons with dementia, to their caregivers, and, in a nursing home setting, to other residents.
The Alzheimer’s Association, the American Psychiatric Association, and others have come up with guidelines for managing these symptoms. Their recommendations advise beginning with psychosocial interventions: first trying to determine the cause of the behavior (perhaps the person slugged the aide who tried to give him a shower because the water was too cold) and then either addressing the root of the problem (for example, adjusting the temperature of the shower water) or engaging in distraction (taking the individual to a “quiet room” with a box of trinkets and other treasures to examine). But then comes the caveat: “Unfortunately, large population-based trials rigorously supporting the evidence of benefit for non-pharmacological therapies are presently lacking.” The evidence for these approaches is largely anecdotal and commonsensical--as is true for antipsychotic medication.
What is the solution? The Times seems to suggest we should penalize physician prescribers, fine nursing homes that over-prescribe, add further regulations, or some combination of the three. Another alternative is to substitute “Green Houses” for conventional nursing homes. The Green House is a model of nursing home care that seems to be successful in many domains where traditional nursing homes have failed abysmally, so there is some reason to believe it may be a good way to deal with dementia. The Green House project restricts homes to no more than 12 residents, employs a home-like focus, and cross-trains staff members to meet any and all resident needs rather than using a rigidly siloed model. Green House nursing homes may be the answer: they are associated with high rates of family satisfaction as well as a superior track record in controlling the Covid epidemic. I hope Green Houses have the solution, but I can find no published data on their ability to manage severe dementia. In fact, it’s far from clear that Green Houses, which currently provide care to a total of 3000 people (out of a total nursing home population of 1.3 million), in 300 facilities (out of a total of 15,600 nursing homes nationwide), actually care for people who have dementia and troublesome behavior. Their limited numbers, small size, and high cost (45 percent of residents pay out of pocket compared to 22 percent of residents in usual nursing homes) suggest they may be able to cherry-pick their residents.
I am not advocating blanket use of psychotropic medications in nursing home residents. Nor am I endorsing mislabeling patients with psychiatric diagnoses they do not have. Antipsychotics should only be used in people with dementia in well-defined and relatively rare circumstances. Perhaps informed consent by the patient's health care proxy should be a prerequisite. But blaming nursing homes or pointing the finger at nursing home doctors or devising new regulations are not the best ways to serve people afflicted with dementia. What we need is to find better ways to alleviate the suffering of people with dementia. We should recognize that the doctors who prescribe antipsychotics are not necessarily lazy or devious, though any who are should be disciplined; many of them are simply desperate, desperate to provide relief to their patients. Medicare is poised to spend billions of dollars on aducanumab, a drug recently approved by the FDA for treatment of Alzheimer’s disease despite the paucity of evidence that it works and the abundance of evidence of its association with severe side effects. Perhaps we should instead devote resources to what is likely to be the more tractable problem, the symptomatic relief of Alzheimer’s disease and other dementias.
July 13, 2021
When the FDA approved Biogen’s new Alzheimer’s drug, aducanumab (brand name Aduhelm) on June 7, the reaction was surprise, dismay and, in some quarters, enthusiasm. But everyone was shocked by the drug company’s audacity in setting the price for the medication at $56,000 per year. As one STAT article put it, the only question about the consequences for Medicare, the insurer for close to 97 percent of Alzheimer’s sufferers, was whether the impact would be big, huge, or catastrophic.
Pharmaceutical manufacturers figured out some time ago that they could bring out “specialty drugs,” typically targeted against a single relatively rare disease, if they charged ten or even a hundred times more than for an average medication. The list price of crizotinib (brand name Xalkori), for example, used against a relatively uncommon type of lung cancer found in non-smokers, is just under $20,000 for a one-month supply—and patients usually take the drug until they die or develop resistance to it. But aducanumab is intended for all people with Alzheimer’s disease, and according to recent Alzheimer Association estimates, that means 6.2 million people over age 65.
The high price is especially disturbing because it’s not even clear that the drug works. A number of studies have been carried out with similar drugs, other monoclonal antibodies that, like aducanumab, were designed to mop up abnormal brain amyloid deposits, which are the hallmark of Alzheimer’s disease—but none of those drugs proved helpful in practice. The trials of aducanumab were likewise discontinued because interim analysis showed the drug was ineffective. Then, in a surprise move, the manufacturer nonetheless applied to the FDA for approval after a reanalysis of the data showed some evidence of benefit when the drug is given in high doses. The independent scientific review panel convened by the FDA to evaluate the data was not convinced, however, with 10 out of 11 members rejecting approval and one abstaining—but the FDA nonetheless approved the drug.
Even if aducanumab does work, “work” means slowing the rate of decline slightly, not stopping or reversing the disease process. And the potential side effects of the drug are considerable: 40 percent of patients experienced brain swelling, in some cases of sufficient magnitude to cause nausea, vomiting, confusion, or visual changes.
Patients and their families, who are desperate for an effective drug against this progressive, ultimately fatal disease, are eager to try something with promise, anything. But they are worried about the side effects of aducanumab, about the need for regular MRI scans to monitor for those effects, and about its high cost.
Most critiques of the new drug—and there are many; the NY Times alone has published eight articles on the subject between June 7, when the FDA announced approval of the drug, and July 9, and STAT has published at least 16—assume that since it has been approved by the FDA it will necessarily be paid for by health insurers. In the case of aducanumab, that will principally be Medicare. In fact, CMS is not obligated to provide coverage for the drug just because the FDA approved it.
Medicare, like other health insurers, can decide what tests, procedures, and treatments it will cover. The part of Medicare that will be responsible for paying for aducanumab, if Medicare covers the drug, will be Part B: oral medications fall within the jurisdiction of Medicare Part D plans (prescription drug plans), but medications administered intravenously in a physician’s office, such as aducanumab, fall under Medicare Part B. Most determinations of whether to provide coverage for this kind of treatment are made locally, by the private carriers that process Medicare claims. But the decision about coverage can be made nationally if requested by CMS, by the manufacturer, or by members of the medical profession, in which case the decision becomes binding on all the private carriers. Such “National Coverage Decisions” are reviewed by an internal arm of CMS, the Special Coverage and Analysis Group. For particularly controversial decisions, especially those with social or ethical implications, CMS may request the guidance of a quasi-independent committee, MEDCAC, the Medicare Evidence Development and Coverage Advisory Committee. This is a group of 100 experts including economists, ethicists, physicians, scientists, and others, from whom a subgroup of 15 is selected to provide in-depth analysis of the particular test or treatment under consideration.
Since its inception, MEDCAC (or its predecessor, the Medicare Coverage Advisory Committee) has weighed in on 348 National Coverage Decisions. These comprehensive analyses have addressed topics as diverse as cardiac pacemakers, Pap smears, and lipid testing. On rare occasions, they have dealt with drugs, for example, Nesiritide, an intravenous medicine used in the treatment of heart failure. When MEDCAC deliberates Medicare coverage for a particular intervention, it can recommend covering the intervention, not covering it, or restricting its use in specific ways. For instance, it advocated coverage of the Left Ventricular Assist Device, an invasive treatment that is almost but not quite an artificial heart, but required a detailed informed consent process that included a social worker and palliative care expert along with the patient, family, and cardiac surgeon. Ultimately, Medicare approved coverage for the device but set reimbursement at $70,000 (the manufacturer’s price was closer to $200,000) and limited insertion of the device to a handful of medical centers across the country.
Medicare is required by federal law to provide coverage for anything that is “reasonable and necessary” for “the diagnosis or treatment of an illness or injury.” Despite years of often contentious debate, there is no precise definition of what this means. The FDA, by contrast, approves drugs and devices if they are “safe and effective.” In the case of aducanumab, it is arguable whether the drug is truly safe and effective, but surely it would be reasonable and necessary for Medicare to restrict the use of aducanumab to early disease (the only group in whom it was tested) and to require an elaborate informed consent process. While Medicare, by established custom, does not reject coverage based on cost-benefit analysis, it could set the price at a level comparable to those of the existing, only marginally beneficial drug treatments for Alzheimer’s disease, drugs such as rivastigmine (brand name Exelon, which has a yearly retail cost, when given as the brand name drug, of $823) and donepezil (brand name Aricept, which has a yearly retail cost, when given as the brand name drug, of $5380).
Given the controversy swirling around Biogen’s new Alzheimer’s drug, the case for Medicare initiating the National Coverage Decision process is strong. The only reason for failing to do so is external pressure, whether by the manufacturer, by members of Congress under the influence of the pharmaceutical industry, or by the public. If CMS opts against this path or convenes MEDCAC only to reject its advice,* as the FDA did with its advisory committee, that would be a compelling reason to make CMS an independent agency, along the lines of the National Science Foundation, overseen by a bipartisan board and with a director independent of the President.
*Between when this essay was drafted on July 9 and edited for publication today, CMS has in fact decided to proceed with a National Coverage Decision.
June 07, 2021
The big news in geriatrics this week was the FDA approval granted to a drug against Alzheimer's disease, the first new drug in 20 years. It reminded me of the day in 1986 when the initial report about what would be the very first FDA-approved drug against this disease appeared.
It was November 13, 1986 and I had been a practicing geriatrician for four years. My weekly copy of the New England Journal of Medicine had arrived right on time, as it did every Thursday. I scanned the table of contents and one article immediately jumped out at me. It had the suitably serious, scientific-sounding title “Oral tetrahydroaminoacridine in long-term treatment of senile dementia, Alzheimer type.”
During four years of clinical practice—plus a year of geriatrics fellowship and three years doing an internal medicine residency—I had encountered patients with what we then called “SDAT” (senile dementia of the Alzheimer’s type) and what we now simply call Alzheimer’s disease. The cognitive impairments of dementia were to me, to family members of the afflicted, and often to the affected themselves, among the saddest of the many disorders that develop among older individuals. Death, while also very sad, was part of the natural order of things, especially when it came after a long and rich life. But dementia in general and Alzheimer’s disease in particular were devastating because they attacked personality; some would say they assaulted personhood itself. While I would go on to spend much of my career thinking about how best to enable people with dementia (as well as those with physical frailty) to live meaningful lives despite their limitations, I recognized then and continue to believe today that the condition is a scourge that we should strive to prevent, eradicate, or at least ameliorate.
The medication described in the 1986 NEJM article would ultimately be approved by the FDA under the name of Tacrine for treatment of mild to moderate Alzheimer’s disease; it would be supplanted by its first cousin, the drug donepezil, brand name Aricept; and Aricept would top $2 billion in US sales by the time it came off patent in 2013. The story of the drug’s development speaks volumes about Americans’ desperation for a medical fix to Alzheimer’s, about big business, and about our regulatory system. Both the similarities and the differences between the Tacrine story and the tale of the new drug approved by the FDA for the treatment of Alzheimer’s are illuminating.
That 1986 study ostensibly showing that Tacrine led to improvements in cognition as well as to overall functioning was based on a whopping 17 patients, only 14 of whom actually completed the study. Its lead author was Dr. William Summers, a psychiatrist at UCLA medical center who had never before published anything of importance and had done very little research altogether. The scientific community immediately began questioning not only Summers’ credibility but also his methodology. Did the 17 patients actually have compelling evidence of an Alzheimer’s diagnosis? Did the “global assessment rating” used to measure outcomes translate into meaningful improvement?
After Summers filed for FDA approval for his drug, the FDA investigated Summers and his lab. The agency issued a rare “interim report” in 1991 in which it criticized Summers for the absence of documentation that the study was actually performed as claimed in the NEJM paper. It questioned the randomization process and whether the physicians were, as asserted by Summers, blinded to what drug the patient was receiving. The best the FDA could say was that there was “no clear evidence of purposeful misrepresentation.”
In response, the public vilified the FDA, claiming the agency was “heartlessly impeding the relief of suffering.” David Kessler, the FDA commissioner, received hate mail.
Approval of the drug came, but only after the release in 1992 of a larger, more carefully conducted study by the “Tacrine Collaborative.” The trial lasted for 12 weeks and was carried out at 23 centers involving 468 patients. The results, published in the Journal of the American Medical Association, showed a statistically significant improvement in cognition and in overall function, whether measured by physicians or caregivers. And so, the first drug was approved for treatment of Alzheimer’s disease. Sales soared. But questions continued to plague use of the drug—a subsequent study, for example, testing the effectiveness of a higher dose, found that the higher dose was more effective than lower doses, but more than two-thirds of the patients dropped out of the study. Tacrine was soon effectively replaced by donepezil (Aricept), another cholinesterase inhibitor that differed from Tacrine only in being taken once a day rather than twice and having fewer gastrointestinal side effects. Patients and families demanded these drugs, which were soon followed by chemically slightly different but no more effective agents such as Exelon; the drug companies advertised them widely and made a small fortune on their sales; but clinicians remained skeptical. I, for one, believe that the cholinesterase inhibitors are next to useless. None of the numerous studies of the drugs carried out since 1986 have persuaded me otherwise.
Fast forward to 2021 and the approval by the FDA of aducanumab under the brand name of Aduhelm. This is a completely different kind of treatment. Tacrine and Aricept are cholinesterase inhibitors: they work by increasing the level of the neurotransmitter acetylcholine, which is dramatically reduced in Alzheimer’s. They are given orally and were never terribly expensive, although Aricept got cheaper when the generic version became available. Side effects, particularly in the case of Aricept, are modest and consist of nausea and very rarely of liver enzyme abnormalities. The new drug, by contrast, is a monoclonal antibody that works to mop up deposits of beta-amyloid from the brain, the protein widely thought to cause Alzheimer’s disease. It must be given intravenously once a month. The average yearly cost will be set at $56,000. Two years ago, its manufacturer, Biogen, stopped an ongoing clinical trial of the drug after interim analysis failed to demonstrate efficacy. The drug company then re-analyzed the data and claimed the drug was effective after all, but an FDA Advisory Panel, convened in November 2020, unanimously concluded there was insufficient evidence of significant benefit to proceed.
Multiple other monoclonal antibodies targeting beta amyloid have also failed, leading some to suggest that by the time these drugs are given to patients, it’s already too late. Or maybe beta amyloid is a marker for the disease and not the cause of the disease. It is in this setting that the FDA approval of aducanumab is something of a surprise.
What is particularly striking about today’s FDA approval is that it is not based on the clinical effectiveness of the drug. Rather, it is based on its ability to clear the brain of amyloid deposits. There is a long tradition of requiring that drugs cause improvement in clinically meaningful outcomes, not just in “surrogate markers.” Cholesterol-lowering drugs were approved based on their ability to prevent heart attacks, not just on their ability to lower blood cholesterol levels. Antihypertensive drugs were approved based on their effectiveness in reducing strokes, not just on their capacity to lower blood pressure. To be sure, the FDA is holding off on full, unconditional approval until it sees the results of a yet-to-be-performed large clinical trial demonstrating long-term benefit of amyloid plaque reduction. In the meantime, anxious patients and their families will submit to monthly infusions of a drug that has been shown to cause symptomatic brain swelling, manifested by nausea, vomiting, visual problems, headaches, and sometimes small strokes, in 40 percent of cases.
We are a long way from the trials and tribulations of tacrine, but in the end, is the tale of aducanumab any less disturbing? In both cases, there was enormous pressure by the public to approve a drug that offered hope, some hope, however slim, of ameliorating this terrible disease. In both cases, the FDA, after initially appearing to act on scientific considerations alone, succumbed to external pressures. And in both cases, the pharmaceutical industry stood to gain enormously by release of the drug. The FDA currently has an acting commissioner: President Biden has yet to name a new, permanent head. Maybe it's time for the FDA commissioner to be a career civil servant, chosen from within the agency, rather than a political appointee. At the very least, the new head of the agency, when chosen, should take a long hard look at decision-making within the organization.
May 21, 2021
On January 12, 2006, I launched this blog, which I first called “Perspectives on Aging” and, a number of years later, reincarnated as “Life in the End Zone.” Initially, the frequency of my posts was erratic but then, a year and a half after the blog’s inception, NY Times columnist Paula Span gave “Perspectives” a boost, recommending it along with just one other blog on aging in her weekly column, “The New Old Age.” She sent me an email at the time, telling me that my readers would expect predictability and that I was now obligated to post weekly and on a fixed schedule.
For years, I faithfully followed Paula Span’s advice, but more recently I have been writing only sporadically. Every few years I contemplated retiring the blog, but then I would get an email out of the blue from a reader telling me how useful she found a particular post. Two months before his death in 2019, the distinguished ethicist Dan Callahan, whom I was privileged to call a friend, commented in his last email to me that he “particularly appreciated” my piece on “dignity and the insensitive nurses”—a post I wrote about an episode at an area nursing home in which nurses and nursing assistants were cruel and callous to a resident they disliked. How could I stop writing when I received this kind of feedback?
After I published the 400th post, I thought surely this was a good time to stop. But then came the pandemic, which disproportionately affected older people. There was a great deal to say, so on March 2, 2020, I began writing more regularly. I wrote about the devastating Covid outbreaks in nursing homes; I wrote about the role of telemedicine; I wrote about vaccines. And then, gradually, as the pandemic began to recede, as vaccination rates in older people soared, and as Covid disappeared from nursing homes, I again found I had less to say. So, when I saw Jane Brody’s column in the NY Times this week, “A Birthday Milestone: Turning 80,” I was inspired. I would write about my own birthday milestone—last week I turned 70.
What Jane Brody says, in a nutshell, is that “the secret of a happy, vibrant old age” is to “strive to do what you love for as long as you can do it.” But she says more about what it takes to live a long and fulfilling life. First, exercise. Without regular exercise, she opines, “you can expect to experience a loss of muscle strength and endurance, coordination and balance, flexibility and mobility, bone strength and cardiovascular and respiratory function.” Translated into geriatric lingo, what she is saying is that to preserve function--the ability to walk, to do errands, even to dress and bathe without help--regular exercise is important. Next in importance, she says, is “quality fuel,” or a good diet. Here Brody is vague, but stresses avoiding “ultra-processed foods” and eating plenty of fruit and vegetables. Finally, there are “attitude, motivation, and perspective,” about which she does not further elaborate.
What Brody is talking about is “successful aging.” For years I have wanted to write about successful aging, as it was called by Rowe and Kahn in their landmark 1987 work of the same title. The idea of successful aging has been the subject of both intense criticism and passionate enthusiasm. One problem is that we all want to lead a “good life,” but we may have very different ideas of what that looks like. Sometimes, what we think we need for a good life turns out not to be what we need at all: people who have a life-altering medical condition, whether Parkinson’s or osteoarthritis or chronic obstructive pulmonary disease may wish they hadn’t developed that disorder but find that they are nonetheless able to lead rich, enjoyable lives.
Since Rowe and Kahn’s original work appeared, the gerontologic literature has discussed “active aging” (to avoid the invidious comparison between success and its opposite, presumably failure). It has talked about “productive aging,” “healthy aging,” “aging well,” or “a good old age.” But these alternative formulations all stigmatize in much the same way as does “successful aging.” The opposite of active is inactive, of productive is unproductive; the opposite of healthy aging is sick aging and the opposite of aging well is aging poorly. The opposite of a good old age is a bad old age. It seems to me that another way of looking at all this is to distinguish between the steps you should take when you are young and healthy to maximize the likelihood that you will retain certain capacities in old age, on the one hand, and the way you should deal with old age once it has arrived, on the other.
What people may aspire to, in addition to simply living longer, includes the ability to care for themselves (physical function), the ability to think and reason (cognitive function), and emotional well-being (emotional function). But as they begin to become old, whether denoted by reaching eligibility for Medicare or suffering physical or cognitive decline or becoming afflicted with chronic diseases, they need to figure out how to make the best of their existing condition. Whether they become short of breath on exertion due to years of cigarette smoking or due to environmental exposures or due to idiopathic pulmonary fibrosis (idiopathic implying that nobody has a clue what causes this progressive, debilitating condition), they have to make decisions about how best to live their lives, given their limitations. And those decisions reflect their personal preferences (what matters most to the individual), their circumstances (their financial, physical, and social situation), as well as what they aspire to for whatever time they have left.
In light of these distinct considerations—1) planning for the distant future, 2) planning for the near future, or 3) making the most of the current reality—I will offer my personal thoughts on turning 70. These are not prescriptions for other people; they are a description of my thought process, which may serve as an illustration of the kind of process others may wish to go through.
I start with the current reality. I am blessed with good health, which I attribute at least as much to genes and luck as to virtuous past behavior in the realms of exercise and nutrition. I am also fortunate to be financially comfortable enough that I do not need to work. At age 70, I find that I for the most part accept myself as I am, which doesn’t mean I cannot change (either for better or for worse) but rather that I feel I can focus on what I derive satisfaction from doing, not on what I feel I ought to do. That means spending time with my husband, who after 49 years of marriage remains my best friend. It means spending time with my 95-year-old mother and with my three sons, who have become fine and interesting adults. It involves trying to make sense of the world, which I often try to do by reading broadly about health and medicine, incorporating what I learn from the realms of history, politics, science, sociology, and other disciplines to shed light on current problems. While I will engage in activities that I find meaningful, I will avoid activities that are stressful or create conflict. That has meant giving up seeing patients, which used to make me feel useful and even important, but which increasingly became burdensome as medicine became bureaucratic, patients became litigious, and disease remained as intractable as ever. I also want to devote more time to arguably purely selfish activities such as exploring the worlds of novels and of nature.
When I plan for the near future, say the next five to ten years, maybe longer if I’m lucky, I think of this as investing. Not in the stock market or the bond market, though ensuring there will be sufficient retirement money to live comfortably is certainly important, but rather in my physical health and physical functioning. This is where exercise comes in, both aerobic exercise to guard against cardiovascular disease and strength training to remain nimble. Strength will be essential to enable me to continue to climb stairs and lift my new granddaughter and any other grandchildren who may come along. I also need to invest in building and deepening friendships, since I am persuaded that over the long run, the best bulwark against depression will be a strong social network. Finally, I want to continue to find ways to be engaged with the world, not just through friends and family. For me, that means remaining intellectually engaged.
As to planning for the distant future—it’s too late for that. Truly long-range planning involves decisions about diet and exercise when you’re in your 20s and 30s; it entails deciding early on not to smoke; it means getting an education (education decreases but by no means eliminates the chance of developing dementia in later life).
As I enter a new phase of life—which feels more like a new stage because I recently became a grandmother, not because I had a birthday—I am going to make a conscious effort to develop new interests and new activities. Unlike Jane Brody, who advocates doing whatever you are passionate about for as long as you can (in large measure, I suspect, because she herself continues to be passionate about the same things she has always loved), I find that my enthusiasm for clinical medicine has waned, as has my excitement about other aspects of geriatrics. I want to move more in the direction of reading, thinking, and ultimately writing about the history of medicine, and how it can help shape contemporary health policy. Recognizing that interests change over the life course, I gave up the practice of medicine. I’m not quite ready to let go of this blog, but I will write only when there is a topic relevant to “life in the end zone” about which I feel strongly. I’m no longer going to peruse the New England Journal of Medicine and JAMA weekly for new developments to write about, as what is published in medical journals no longer excites me the way it once did. I will still read Health Affairs, and I’m expanding my horizons to include the Bulletin of the History of Medicine. I expect that my eagerness to blog will wax and wane. I hope you will bear with me as I begin to think about the end zone in a new and very personal way.
April 05, 2021
Now that just under half of older people have been fully vaccinated against Covid-19 and only about a quarter have not received any vaccinations at all, the burning question is, what can vaccinated people do safely?
The answer comes in two parts: what can vaccinated people do that does not jeopardize their own health and what can they do without risking harming others? The CDC has weighed in on this, focusing principally on the first issue, safety of the individual. Their guidance includes the recommendations that those who are fully immunized (who are at least two weeks out from their second shot) can visit other fully immunized people indoors without masking or social distancing and that they can travel without self-quarantining upon arriving at or returning from their destination.
To answer the second question, the public health concern, we need to know whether a vaccinated individual can be infected with Covid-19, remain asymptomatic, and transmit the disease to an unvaccinated person. Physicians have been concerned that while the antibody response to vaccination is highly effective in squelching the virus in the lungs, it is not clear whether it is also effective in killing the SARS-CoV-2 virus in the nasal passages. If not, vaccinated individuals could indeed be surreptitious sources of disease, like the notorious Mary Mallon, who was an asymptomatic carrier of Salmonella typhi, the bacterium that causes typhoid fever. Could asymptomatic Covid carriers act like “Typhoid Mary,” perhaps earning the nickname Covid Cathy? At last, we have very reassuring data addressing this issue.
The current issue of MMWR, the weekly journal published by the CDC, reports on the experience of just under 4000 people between mid-December 2020 and mid-March 2021: 2479 of them received two shots of either the Pfizer or Moderna vaccine, while 989 controls remained unvaccinated. The investigators also report on 477 people who received one dose, but for simplicity, I will ignore these partially immunized individuals. The investigators leading this small study took one crucial step that has previously been largely lacking: they tested all the participants weekly using the gold standard polymerase chain reaction (PCR) test for the SARS-CoV-2 virus—whether or not they had been vaccinated and whether or not they reported symptoms. What did they find?
The outcomes are reported based on “person-days” since the group who were vaccinated got their shots at varying times and therefore differed in the number of days they could have become infected. They found that among the fully immunized, the number of new positive tests/1000-person-days was 0.04 whereas among the unvaccinated, it was 1.38. The bottom line: once you are fully immunized, you are far less likely to test positive for Covid-19 than if you have received no vaccinations.
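For readers who like to check the arithmetic, the bottom line follows directly from the two rates quoted above. Here is a minimal sketch; note that it yields a crude, unadjusted effectiveness figure, not the study's own adjusted estimate:

```python
# Crude vaccine effectiveness from the incidence rates quoted above:
# 0.04 vs. 1.38 new positive tests per 1,000 person-days.
# Unadjusted illustration only, not the study's adjusted estimate.

rate_vaccinated = 0.04    # fully immunized, per 1,000 person-days
rate_unvaccinated = 1.38  # unvaccinated, per 1,000 person-days

rate_ratio = rate_vaccinated / rate_unvaccinated
crude_effectiveness = 1 - rate_ratio

print(f"rate ratio: {rate_ratio:.3f}")                    # 0.029
print(f"crude effectiveness: {crude_effectiveness:.1%}")  # 97.1%
```

In other words, at the level of raw rates, a fully immunized person tested positive roughly one-thirtieth as often as an unvaccinated one.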
The study also found that only 23 percent of the people who did develop an infection got sick enough to see a physician, only two people were hospitalized, and no one died.
Until this admittedly small but carefully conducted study appeared, it seemed to me that while vaccinated people could feel personally quite safe, they had to exercise caution in the interest of public health. It wasn’t really true that vaccinated people could socialize indoors with other vaccinated people—until the issue of Covid Cathy was resolved, they had to worry about any unvaccinated household contacts their friends might have, lest an asymptomatic carrier inadvertently transmit the virus to a friend who, while also asymptomatic, passes the virus on to an unvaccinated household member. Now it increasingly looks as though this theoretical concern is not, in practice, of great consequence.
Just because fully vaccinated individuals are reasonably safe today doesn’t mean they will necessarily remain safe tomorrow. Vaccine effectiveness is calculated based on how much less likely a vaccinated person is to get the disease than an unvaccinated one. But if the disease is running rampant in the surrounding community, that is, if it is quite common among the unvaccinated, then while the relative risk of the vaccinated will be unchanged, the absolute risk will go up. And if new variants appear against which the vaccines offer only limited protection, then the relative risk itself will be affected.
So, keep an ear to the ground—monitor how common the virus is in the community where you live and pay attention to the type and pervasiveness of viral variants. If the situation is stable, enjoy your freedom.
March 04, 2021
My 95-year-old mother has been using a computer for email since our then teenage son arranged to gift her his old computer so he could get a new one. That was 25 years ago. But like most people in her age cohort, she has never been comfortable with the technology and has trouble learning anything new related to the computer. The difficulty has gotten worse over time along with her memory. But when Covid hit and visits to the independent living complex where she lives were restricted and then eliminated, the limitations of a landline telephone became all too evident. If my mother could make and/or receive video calls, she could communicate with me, with her three grandsons in California, and with friends. But using the video technology proved to be an endless source of frustration. We tried FaceTime, we tried Skype, we tried Zoom. Nothing worked.
Now, after months of trial and error and refining the approach, I’m pleased to report that my mother can receive—and sometimes initiate—FaceTime calls. I’m so pleased that I’m going to use this blog post to describe in as much detail as I can recall every step necessary to accomplish this feat, suspecting as I do that others may find themselves in a similar predicament.
Step 1: Choose an appropriate device. I purchased my mother a new 10.2 inch, 32GB iPad. It’s portable, so she can use it while sitting in her favorite recliner. The screen is big enough so that people’s faces appear almost life-size and photographs are easy to see. In principle, Apple products are user-friendly, though as it turned out, my mother is a genius at outwitting the human-computer interface gurus at Apple by coming up with ways to make the system fail. Nonetheless, I think the iPad was probably as good a choice as any and better than some. The rest of the steps below apply primarily to an iPad.
Step 2: Obtain a cover that automatically turns the device off when it is closed and turns the device on when it is opened. Turning the iPad on manually was an unnecessary obstacle.
Step 3: Disable password protection for turning the device on. This may be a bit risky, but my mother was having trouble remembering her password. I “enrolled” her in touch ID, but she usually managed to put her finger in not quite the right place, so it did not work reliably. Nothing is more frustrating than being unable to even turn the thing on.
Step 4: Go to Settings, Accessibility, Assistive Touch. This setting allows my mother to use the iPad even though she has poor fine motor control and touches the screen erratically.
Step 5: Label the home button. I stuck an arrow on either side of the home button to help my mother find it. The device is designed with a very slight indentation signaling the home button, so slight that it’s hard for 95-year-old eyes to see. ➡️ ⏺ ⬅️
Step 6: Make sure Siri is disabled. I initially thought it would be easiest if my mother used Siri to make calls, simply saying “hey Siri, call Muriel Facetime video.” Wrong. She would leave out “FaceTime” or leave out “video” or forget to start with “hey Siri.” She felt compelled to speak in grammatically correct sentences, as though Siri would understand her better that way. When I left Siri enabled, just in case things changed, I found that my mother would sometimes hold the home button down too long and inadvertently invoke Siri, who would helpfully inquire “may I help you?” Having her device suddenly speak really rattled my mother.
Step 7: Put only the most essential icons in the dock. For my mother, these are the icons for her email, Safari, and FaceTime video. I’ve recently added the photos icon.
Step 8: Declutter the screen by putting as many as possible of the obligatory icons, the ones you can’t get rid of, on the second screen rather than the screen that appears when the device turns on.
Step 9: Put the handful of phone numbers (with associated names) that are most likely to be used in the FaceTime contacts screen. This way, when my mother taps on the video icon, she will see 4 or 5 names and can choose which one she wants to call. Sometimes she taps on the wrong spot and calls the wrong person, but at least she’s not accidentally going to call Social Security or the Boston Globe, just a different family member from the person she intended.
Step 10: Practice! When visiting my mother, I would get her settled in her recliner with the iPad and call her from another room. For a while she had trouble with the command “slide to answer.” I finally figured out that she was carefully sliding her finger along the words “slide to answer” and assiduously avoiding the green virtual button to the left of the words. Unless she accidentally touched the button, she failed to answer the call. Now I regularly remind her that she needs to slide the button and it works like a charm. Another aspect of practicing is using the system regularly. At one point, my mother was doing great and then we didn’t make any video calls for a few days, by which time she had forgotten about sliding the button rather than the words. Making or receiving a call once a day is probably a good idea.
Sounds simple, doesn’t it? Since it literally took me months to figure this out, I thought I’d pass along what I learned, in case these steps can help someone else.
February 26, 2021
This week brought a medical article worth discussing: the New England Journal of Medicine published the results of a study of the efficacy of the Pfizer SARS-CoV-2 vaccine in the real world. The article provides compelling evidence that the vaccine works extremely well.
The data come from Israel, which has been doing a superior job of vaccinating its citizens. As of a week ago, two-thirds of the currently eligible population in Israel had gotten both of the recommended doses (individuals under age 16 and those who have had Covid-19 are not eligible). In Israel, health insurance is mandatory for all permanent residents; they must join one of four healthcare organizations called “funds.” The new study reports data from Israel’s largest health organization (Clalit Health Services) and includes information on a stunning 1.2 million people.
The study’s authors, led by Dr. Noa Dagan, used Clalit's integrated electronic medical record to capture health data for 596,618 people who received both doses of the Pfizer vaccine between December 20, 2020 and February 1, 2021. They then matched them, based on demographic and clinical characteristics, with another group of identical size who had not received any vaccine. Next, they looked at five different outcomes: documented Covid-19 infection; symptomatic Covid-19 infection, Covid-related hospitalization, Covid-related severe illness, and Covid-related deaths. Because the sample was so large, they were also able to collect extensive information about a number of interesting subgroups defined by age or specific co-existing health conditions such as cancer or diabetes.
The article includes an enormous amount of intriguing data. The most exciting results, from my perspective, address outcomes a week or more after receiving the second dose of the vaccine. At that point, the vaccine conferred 94 percent protection against symptomatic Covid-19 (95 percent confidence interval 87-98), 87 percent protection against Covid-related hospitalization (CI 55-100), and 92 percent protection against severe Covid (CI 70-100). The efficacy in people over 70 was identical to that in younger individuals, and protection in people with a chronic health condition such as diabetes was only slightly lower than in people without the condition.
These numbers strongly resemble the results that Pfizer and BioNTech reported to the FDA in their application for emergency approval. But, as the study authors point out, Pfizer drew its conclusions based on 44,000 people; the Israeli study is based on 1.2 million people. As a result, when Pfizer calculated the efficacy against severe Covid-19, they drew on a total of 10 cases (one of whom had been vaccinated and 9 of whom had not been); when the Israelis calculated the risk of severe Covid, their estimate was based on 229 cases, vastly increasing the credibility and certainty of the calculation. Moreover, Pfizer’s data was based on the somewhat artificial conditions of a clinical trial: for example, the subjects were all highly motivated to optimize their health and may have regularly worn masks and practiced social distancing; the Israeli study drew on real life experience, in which participants’ behavior reflected community norms.
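To see why the number of cases matters so much, note that with two trial arms of roughly equal size, efficacy can be approximated as one minus the ratio of cases in the vaccinated arm to cases in the unvaccinated arm. A sketch using Pfizer's ten severe cases (the equal-arms simplification is mine, for illustration):

```python
# Efficacy against severe Covid-19 estimated from Pfizer's 10 cases,
# assuming (for illustration) equal-sized trial arms.
cases_vaccinated, cases_unvaccinated = 1, 9
efficacy = 1 - cases_vaccinated / cases_unvaccinated
print(f"{efficacy:.1%}")    # 88.9%

# Shift a single case from one arm to the other and the
# estimate swings dramatically:
print(f"{1 - 2 / 8:.1%}")   # 75.0%
```

With 229 cases, as in the Israeli study, one misclassified case barely moves the estimate; with ten, it changes the answer by double digits.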
The new study, like all studies, has its limitations. It excluded people living in nursing homes and medical personnel working on Covid units in the country’s hospitals, on the grounds that the rate of the disease in their particular communities, i.e., the nursing home or the Covid ward, was highly atypical. The study was performed during a period when the South African variant was very rare in Israel, so we cannot draw conclusions about the efficacy of the vaccine against this strain. The information on the ability of the vaccine to prevent Covid-related deaths is limited because of the short follow-up period: there were nine Covid deaths in the fully vaccinated group and 32 deaths in the unvaccinated group, but these numbers might change as more time elapses. The data on deaths may also be difficult to generalize because Israel has an unusually low case fatality ratio: according to “Our World in Data,” it is currently 0.7 percent in Israel, whereas in the U.S. it is 1.8 percent.
Some of the study’s greatest strengths are also potential weaknesses: the “real world” nature of the investigation means it is an observational study rather than a randomized controlled trial, raising the possibility that the differences in outcomes between the vaccinated and the unvaccinated were related to some factor other than their vaccination status. Despite these limitations, the study provides very encouraging information.
The fact that the Israelis could carry out their study sends another message over and beyond the efficacy of the Pfizer-BioNTech vaccine. The study could only be conducted because Israel did a good job acquiring and distributing vaccine. Early on, the country developed mass vaccination sites. Since everyone is enrolled in a health plan and the plans all have electronic records, there was no need to waste as much time on bureaucracy as we do in the U.S., where more time is spent filling out forms than administering the shot. The study could only be conducted because of Israel’s electronic health records, which ensured that information about who got what dose when, along with the age, sex, and chronic medical conditions of each individual, was digitally recorded. Finally, the entire rollout was centrally coordinated, assuring efficiency and consistency: from the outset of the pandemic, the Israeli Ministry of Health collected Covid-related data from all four health plans, negotiated to purchase vaccine from Pfizer, and organized distribution. The good news reported in the NEJM article is a result both of the biological properties of an mRNA vaccine that was designed in record time to deal with an international health crisis of enormous proportions, and of the characteristics of a health care delivery system that can actually deliver.
February 21, 2021
As new cases of Covid-19 fall throughout the world but the US approaches 500,000 deaths from Covid-19 and the world nears 2.5 million deaths, it is time to start planning for the next pandemic.
We have known since the 1918 influenza pandemic, which killed upwards of 50 million people world-wide, that it’s not a question of if, but rather of when. Moreover, recent decades have seen the emergence of several new and terrifying diseases. These diseases have principally been caused by viruses, viruses that jumped species. They moved from their usual host, say a civet or a bat, into people for one of several reasons: climate change may have destroyed their hosts’ usual habitat, forcing them to find a new home where they came into greater contact with humans; alternatively, humans encroached on the hosts’ habitat by clearing forest or planting a crop that deprived the host of its usual food source, again leading the host to relocate; or humans may have developed a taste for certain types of wild animal, bringing the two species into unaccustomed contact and thus facilitating viral transmission.
As a result of these factors, we have had Zika, SARS, MERS, avian influenza and now Covid-19 in the twenty-first century, and Ebola, Marburg hemorrhagic fever, and HIV in the last part of the twentieth century. These are only the best-known of “zoonoses.” Today, 75 percent of new infectious diseases are zoonotic in origin and their numbers have been rising steadily.
The good news is that we know a great deal about how to go about preventing outbreaks, detecting them early, and responding if they nonetheless occur. The bad news is that the world in general and the U.S. in particular have a poor track record of implementing the necessary strategies. Allocating scarce resources now to help alleviate problems that will develop at some unspecified time in the future has proved to be a hard sell.
The irony is that we in the U.S., as in many other countries, spend an enormous amount of money on our military. We have accepted the need to devote a large fraction of our budget to the armed forces and to equipment including both “conventional” and nuclear weapons. We have not yet acknowledged that the far greater threat to our national security and our well-being is from lowly viruses, strange biological entities that are not strictly speaking alive since they cannot survive outside a host organism, not from invading armies.
The current US budget includes just under $3 trillion of “mandatory spending,” which covers Social Security, Medicare, and Medicaid, and another nearly $1.5 trillion of “discretionary spending,” over half of which goes to the military, including the VA and Homeland Security as well as the armed forces. The base budget for the Department of Defense is $636 billion.
By comparison, the CDC (Centers for Disease Control and Prevention), the home of most U.S. epidemic preparedness activities, has a total budget of $6.6 billion, of which $509 million is allocated to “Emerging and Zoonotic Infectious Diseases.” Other disaster preparedness activities are financed through various departments, including Homeland Security, which is counted above as military spending. But as a very rough approximation, it is not far-fetched to compare the core budget for potential epidemics, $509 million, with the core budget for the military, $636 billion: the former is just 0.08 percent of the latter. This comparison reveals an enormous imbalance between spending on the military and on epidemic preparedness, with too much devoted to fighting armed invasions and not nearly enough to combating microbial enemies.
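As a rough check on the arithmetic above, using the figures as quoted ($509 million for epidemic preparedness against the $636 billion defense base budget):

```python
# Core epidemic-preparedness budget as a share of the DoD base budget,
# using the figures quoted in the text.
epidemic_budget = 509e6  # CDC "Emerging and Zoonotic Infectious Diseases"
defense_budget = 636e9   # Department of Defense base budget

share = epidemic_budget / defense_budget
print(f"{share:.2%}")  # 0.08%
```

Put differently, for every dollar in the defense base budget, less than a tenth of a cent goes to the core emerging-disease budget.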
If we are to spend more on epidemics—and, arguably, less on bombs and fighter planes—what should we spend it on? A basic framework was outlined at a symposium called “Building Interdisciplinary Bridges to Health in a Globalized World,” organized by the Wildlife Conservation Society in 2004. The symposium called for an international, interdisciplinary approach to preventing disease, or “One World, One Health.” It articulated its views in a document called the Manhattan Principles, which laid the foundation for what would become an international movement. The Manhattan Principles are built on the recognition that modern epidemics stem from the inter-connections between humans, domestic animals, and wildlife, and that these interactions arise either directly from human behavior (e.g., agricultural practices, clear-cutting forests, and eating wildlife) or indirectly, mediated by climate change that is in turn due to human behavior. Since the problem is fundamentally multidisciplinary, its solution must likewise be multidisciplinary. And since the modern world is interconnected, the solution must be international, involving shared information.
An implementation framework was drawn up in 2008 by a group consisting of representatives from UNICEF, WHO, and the World Bank, among others. Entitled “A Strategic Framework for Reducing Risks of Infectious Diseases at the Animal-Human-Ecosystems Interface,” it argued for the development of an international system of disease surveillance drawing on multidisciplinary expertise (including veterinarians, physicians, wildlife specialists, and ecologists). In addition, it sought to help build robust public health systems across the globe and to promote good communication among those systems. Finally, it advocated support for strategic research, to be shared internationally.
The CDC adopted the One Health approach in 2009, housing it within its National Center for Emerging and Zoonotic Infectious Diseases. The approach was formally endorsed by the UN, the World Bank, and the EU in 2010. More recently, the World Bank came up with a revised operational framework for fighting emerging infectious diseases (EIDs) as a means of fulfilling its mission to promote prosperity and decrease poverty.
Our response to future epidemics, when they occur, will hinge on more than international and multidisciplinary collaboration. Scientific developments are likely to have a major impact when future EIDs arise. The new technique of vaccine design using mRNA is vastly accelerating the development of effective vaccines, the most powerful preventive tool available. Work on anti-viral medications is ongoing and could revolutionize the treatment of viral diseases much as antibiotics revolutionized the treatment of bacterial diseases. Currently, the only virus for which there is effective treatment is HIV, and that treatment (which took years to develop) involves a multi-drug regimen that converts HIV into a chronic disease but rarely eradicates the infection.
We also need to strengthen the public health infrastructure in the U.S. Poor coordination, insufficient manpower, and inadequate communication with the public have afflicted domestic public health departments for years. WHO and the World Bank have focused on shoring up public health in much of the world but have assumed that the richest countries would serve as models of success.
The One World framework could itself be extended to address climate and the environment more comprehensively, but the basic formulation is sound. As Andrew Cunningham, Peter Daszak, and James Wood argue in their 2017 article, “One Health: Emerging Infectious Diseases and Wildlife: Two Decades of Progress?” little has been done at the policy level to address what remain major threats to health and well-being, as Covid-19 attests. It’s time to adjust our national priorities and focus on what counts.
January 11, 2021
The US also has the dubious distinction of being number one in the world in terms of cumulative mortality from Covid-19.
January 01, 2021
Once the 1918-1919 influenza pandemic finally came to an end—after killing somewhere between 50 and 100 million people worldwide—Americans did their best to forget about it. Later tragedies such as AIDS and 9/11 figured prominently in much American fiction, but influenza was seemingly forgotten by American writers: Katherine Anne Porter’s short story, “Pale Horse, Pale Rider” and William Maxwell’s novella, “They Came Like Swallows,” are rare exceptions. Historians and journalists writing about the 1918 flu have hypothesized that the pain and suffering inflicted by the flu paled by comparison with that attributable to World War I, which came to an end at the same time, even though ten times more Americans died of the flu than died in combat. Or perhaps Americans were so optimistic about scientific medicine, which was just coming into its own in the twentieth century, that they chose to ignore medicine’s great failure, its inability to diagnose, treat, prevent, or cure influenza. Maybe Americans simply repressed this traumatic episode that killed people in the prime of life, leaving families without a means of support and children without a mother or father. Will the Covid-19 pandemic similarly be forgotten, or will it have a profound and enduring effect on us as individuals and on us as a society?
The pundits are already speculating about the long-lasting effects of the pandemic on the real estate market and on the workplace, on professional conferences and the movie industry. But what I would like to address is the life lessons we should take away from this devastating and unexpected year. The first is that our lives are tenuous. We in the developed world have come to expect a long healthy life, especially if we are white and middle class. Life expectancy at birth in the US is just under 79 years; if you make it to age 65, you can expect to live another 20 years. Covid-19 showed us that we should not take those years for granted: while 80 percent of the Covid deaths have been in people aged 65 or older, that means that 20 percent have been in people under 65. As of the end of December 2020, 346,000 Americans had died from the disease, which translates to roughly 69,000 deaths among people under 65. There’s nothing like awareness of our own mortality to concentrate the mind and encourage us to live life well and to the fullest. This is the first lesson and the one we are perhaps most likely to forget.
The second lesson is that what matters most to us as human beings is our relationships with other people. That’s what made “social distancing” so painful; it’s why eliminating family visits to nursing homes was so devastating; it’s why Zoom, FaceTime, and other video chat programs have been such a lifesaver. We need to cultivate our friendships, to nourish them, to work to improve them. The pandemic made us believe that other people are the enemy, which runs counter to our essence as social creatures.
The third lesson that I want to emphasize is of a different sort: it is that to make decisions about most anything important and certainly to make medical decisions, we need to understand something about risk. How to behave during the epidemic was all about how to evaluate risk, how to think about risk. Just because most people who don’t wear masks and who go to group gatherings won’t get sick doesn’t mean that these are safe activities. It means that you markedly increase the chance that you will contract the virus if you go around without a mask or attend a group meeting. And understanding risk is more complicated still: how much you increase your risk depends on how widespread the virus is in the surrounding community. If very few people in the vicinity of where you live are sick, then your likelihood of getting the disease is low, even if you fail to take precautions. But as the virus begins to circulate more widely, then precisely the same behavior pattern that was only slightly unsafe before will become far more dangerous.
Understanding risk is tricky because the epidemiological measures designed to protect individuals, whether wearing a mask, practicing social distancing, or getting vaccinated, are not perfectly effective. Some people who wear a mask will nonetheless contract the virus; ditto for people who stay six feet away from others. Individuals who received either the Pfizer or Moderna vaccination in the clinical trials were one-twentieth as likely to get sick as those who received a placebo. But that means that just how safe you can feel if you are vaccinated (even if the effectiveness holds up in a much larger population than was tested in the trials) also depends on how widespread the virus is: while vaccination lowers your relative risk of getting sick, if the number of infectious people in the community suddenly increases, say by a factor of ten, your chance of getting the disease also goes up by a factor of ten, even if you've been vaccinated. Grasping the concept of risk is essential—not just to dealing with an infectious disease, but also to deciding whether to undergo screening for prostate cancer, whether to take medication for borderline high blood pressure, and whether to invest in the stock market.
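The interplay of relative and absolute risk described above can be made concrete with a few lines of arithmetic. The numbers below are purely illustrative: only the "one-twentieth" relative risk comes from the trials; the baseline risk is invented for the example.

```python
# Absolute risk = baseline community risk x relative risk.
# The baseline is an invented illustration; the 1/20 relative risk
# reflects the roughly 95 percent efficacy seen in the trials.
baseline_risk = 0.01    # unvaccinated person's chance of infection
relative_risk = 1 / 20  # vaccinated: one-twentieth as likely

print(f"{baseline_risk * relative_risk:.2%}")       # 0.05%

# If community spread rises tenfold, the vaccinated person's
# absolute risk rises tenfold too:
print(f"{10 * baseline_risk * relative_risk:.2%}")  # 0.50%
```

The relative risk never changes in this sketch; only the prevalence term does, and the vaccinated person's absolute risk scales with it.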
Americans, along with people across most of the globe, have lost much from our encounter with the coronavirus. We have also gained something: an appreciation for life’s fragility, a recognition of the importance of relationships, and a deeper understanding of risk. It is up to us to remember, both those we have lost and what we have learned.