March 20, 2017

What We Pay

The Princeton health economist Uwe Reinhardt first said it in 2004. The McKinsey Global Institute, a private think tank, persuasively demonstrated it was true in 2008. But maybe now that the Wall Street Journal is saying the same thing, policy makers will listen. The elephant in the room, the main factor accounting for the high cost of health care in the US, is prices.

The spending gap between the US and other developed countries remains huge. We spend 17 percent of GDP on health care (that’s all spending, public and private combined); our closest competitor, Switzerland, manages to spend 11 percent. Other OECD countries, such as New Zealand and Norway, spend closer to 9 percent. And despite all the excess spending, we don’t have better outcomes across a broad range of measures, from infant mortality to life expectancy.

The main culprit, the WSJ reports, is higher prices in the US. The average price of most prescription drugs is higher here—by a lot. Avastin (an expensive medication, though not the most expensive there is) costs $4000 for a 400 mg vial in America and less than $2000 in western Europe. Ditto for procedures: coronary artery bypass surgery costs $80,000 in the US, compared to half that in other OECD countries. And so on, down the line. Elsewhere in the world, the WSJ explains, state-run health systems set limits on prices or refuse to pay a supplier if the cost is regarded as excessive. Our free market system, far from keeping costs down, drives them up.

The McKinsey Report, though a few years old now, makes further adjustments based on a country’s wealth. It argues that richer countries may want to spend a larger proportion of their income on health care. But even adjusting for greater GDP per capita, the US spent $650 billion more than “expected” in 2006. The fastest growing part of the excess, the study showed, was due to outpatient care, both office visits and ambulatory surgery. And what was driving up costs in these domains wasn’t the frequency of visits—Europeans tend to go to the doctor at least as often as their American counterparts—it was the cost per visit. Other major contributors to the high cost of American health care are drug pricing (McKinsey found, as did the WSJ, that we pay more in the US for a given drug than we would in other OECD countries) and the cost of health administration (all the spending on marketing and administration of multiple private health plans boosts costs way over what they would be with a single payer).

I think it’s fair to conclude that the high cost of American medicine isn’t solely—or even mainly—due to waste. Targeting the use of less-than-optimal therapies in outpatient practice, as the Choosing Wisely campaign does, won’t solve the cost problem. Nor will targeting expensive, burdensome, and unwanted treatment near the end of life. These are important efforts to improve quality of care. But if we want to do something about cost, we need to have an impact on prices. That means cutting payments made by insurers (both Medicare and private insurance companies) to pricey specialists. It means allowing the biggest and most influential insurer of all, Medicare, to negotiate with drug companies about price. It means allowing insurers such as Medicare to pay for devices based on their cost-effectiveness, not based on what the manufacturer charges.

Introducing single-payer health insurance would help, too. It happens to be the only other way to cover all Americans, make health insurance affordable, and eliminate pre-existing condition exclusions without the “mandate” that Republicans find so very unpalatable. But that’s a topic for another day.


March 13, 2017

You Don't Get What You Pay For

The enormous interest in getting good “value” for every dollar spent on health care—whether by individuals, insurers, government, or anyone else—neglects certain basic realities: medical care isn’t a consumer good like a toaster, it’s a very sophisticated service provided by highly trained professionals, and health insurance by its very nature makes the operation of a free market impossible. Still another obstacle is even more often neglected: the widespread belief that “you get what you pay for.” On this view, if you pay less for one treatment than another, the cheaper one is necessarily inferior. Any claim that the two are of equal quality is suspect. And the claim that the cheaper one is higher quality is, on its face, deemed outlandish.

Translated into practice, this means that patients and doctors alike tend to assume that more is better: more x-rays (or, as plain radiographs, CT scans, MRIs, and PET scans are collectively known, “imaging studies”), more medications, and more doctors must add up to superior care and better outcomes. As a result, I’m not at all surprised that changing physician behavior and patient expectations has proved difficult, even when professional guidelines assert that less is more. And unfortunately (unfortunate since, from a geriatric perspective, less often is more), a new study purporting to show that greater spending per hospitalized patient fails to improve outcomes is hardly convincing.

Previous retrospective studies, especially those comprising the Dartmouth Atlas of Health Care, have shown that expenditures on apparently similar patients differ by geographic region, by hospital, and within regions—without any measurable difference in outcomes. But the Dartmouth Atlas has been criticized on several grounds: for working backwards from death even though death could not have been predicted in advance, for failing to adequately consider differences between the patient populations in different locales, and for not acknowledging that patient preference might account for some of the observed differences in health care utilization and, as a result, in cost. The new study asks whether physicians working in the same hospital nonetheless exhibit differences in their pattern of test- and treatment-ordering and whether that variation results in different outcomes for their patients. Looking at over 1.3 million hospitalizations occurring at over 3000 hospitals and involving 72,000 physicians, the authors found large variability in expenditures and no difference in outcomes—just like the Dartmouth Atlas findings.

The authors were careful to look at Medicare Part B spending because it involves services that are at the discretion of physicians (Part A spending is determined largely by the DRG, the reason for admission, and is set by Medicare) and is a “proxy” for the intensity of resource use by physicians. They were careful to confine their analysis to Medicare fee-for-service beneficiaries who were age 65 or older and hospitalized for an acute medical condition. And they examined separately the behavior of general internists and hospitalists. They made some adjustments to account for differences among patients, including age (in 5-year increments), sex, race/ethnicity, median income, and existing comorbidities, and other adjustments to account for differences among physicians, including age (also in 5-year increments), sex, and site of medical school education. They found that the variation in spending across physicians within a hospital was greater than the variation across hospitals. Among hospitalists, adjusted spending was more than 40 percent higher among doctors in the highest spending quartile compared with the lowest quartile. And higher expenditures had no effect on either the 30-day readmission rate or mortality, the two measures of quality used to examine outcomes.
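
To make the analytic design concrete, here is a toy sketch, in Python, of the core comparison. It is emphatically not the authors’ code: the data are synthetic, the column names are invented, and the real study used risk adjustment far beyond a simple grouping. The idea is just that physicians are ranked into spending quartiles within their own hospital, and outcome rates are then compared across quartiles.

    # Toy illustration only: synthetic data and invented column names,
    # not the study's actual dataset or methods.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    n = 10_000
    df = pd.DataFrame({
        "hospital": rng.integers(0, 50, n),          # admitting hospital
        "physician": rng.integers(0, 1000, n),       # treating physician
        "partB_spending": rng.gamma(2.0, 500.0, n),  # discretionary Part B dollars
        "died_30d": rng.binomial(1, 0.10, n),
        "readmit_30d": rng.binomial(1, 0.15, n),
    })

    # Average spending and crude outcome rates per physician
    phys = (df.groupby(["hospital", "physician"])
              .agg(spend=("partB_spending", "mean"),
                   mort=("died_30d", "mean"),
                   readmit=("readmit_30d", "mean"))
              .reset_index())

    # Spending quartiles formed *within* each hospital, as in the study
    phys["quartile"] = phys.groupby("hospital")["spend"].transform(
        lambda s: pd.qcut(s, 4, labels=False, duplicates="drop"))

    # Variation across physicians within hospitals vs. across hospitals
    print(phys.groupby("hospital")["spend"].var().mean())   # within-hospital
    print(phys.groupby("hospital")["spend"].mean().var())   # across hospitals

    # Outcomes by spending quartile (the study found essentially no gradient)
    print(phys.groupby("quartile")[["mort", "readmit"]].mean())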

Regrettably, this study has a number of glaring weaknesses. First, there are the odd omissions: the authors report on the gap between the highest and lowest quartiles of hospitalists but not the corresponding figure for general internists, even though nearly twice as many patients were cared for by internists as by hospitalists. Next, it’s not clear that the two outcomes examined—mortality and readmission rate—are good indicators of quality. Or rather, even if the two groups were indistinguishable based on these two measures, perhaps one group fared far better than the other on some other measure that wasn’t looked at, say, quality of life. Finally, the study wasn’t randomized and it wasn’t prospective, allowing for the possibility that there were important differences between the patients on whom much money was spent and those on whom less was spent. In fact, maybe the patients on whom more resources were expended were sicker. If they were sicker but had the same mortality rate and readmission rate as those on whom fewer resources were spent, then arguably they fared better than their counterparts!

So where do we go from here? Contrary to the prevailing wisdom, the answer may not lie with “big data.” Too many things are going on at once with these patients to be able to reliably conclude that, ceteris paribus (all things being equal), overall expenditure on tests and treatments had no bearing on outcomes. I think it would make sense to look at a small number of detailed case examples—20 or 30 patients of the same age with the same admitting diagnosis, matched for severity of illness, co-morbidities, race, ethnicity, and socioeconomic class, some of whom are cared for by prolific test-orderers and some of whom are not—and to follow them prospectively over time to see what happens to them. Such a study would try to ascertain why various choices were made, perhaps by interviewing the patients and/or their doctors, perhaps by gleaning the answer from free text in medical records, and what the patients’ outcomes turned out to be.

March 06, 2017

A Piece of My Mind

Prescription medications cost more in the US than anywhere else in the world, and costs have been climbing year after year. To a growing extent, the burden of the high cost falls directly on consumers: because their health plan has a tiered system for medications (charging ever larger co-pays for some drugs), because their health plan pays only a percentage of the charge for various drugs (and if the consumer has to pay 20 percent and it’s 20 percent of a very large number, that’s a major outlay), or because they have a high-deductible health plan and the insurer doesn’t pay anything until they have spent $3000—or $5000 or $10,000—on health care.
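
To see how differently these three designs treat the same prescription, consider a minimal sketch. The dollar figures are hypothetical, and I am ignoring the fact that real plans typically apply co-pays or coinsurance even after the deductible is met.

    # Hypothetical illustration of the three cost-sharing designs above.
    LIST_PRICE = 1200.00   # assumed insurer-negotiated price, 3-month supply

    def tiered_copay(tier: int) -> float:
        """Flat co-pay that rises with the drug's tier."""
        return {1: 10.0, 2: 40.0, 3: 80.0}[tier]

    def coinsurance(price: float, rate: float = 0.20) -> float:
        """Consumer pays a fixed percentage of the full price."""
        return price * rate

    def under_deductible(price: float, spent_so_far: float,
                         deductible: float = 3000.0) -> float:
        """Consumer pays the full price until the deductible is exhausted."""
        remaining = max(deductible - spent_so_far, 0.0)
        return min(price, remaining)

    print(tiered_copay(3))                    # 80.0
    print(coinsurance(LIST_PRICE))            # 240.0 (20% of a large number)
    print(under_deductible(LIST_PRICE, 0.0))  # 1200.0 early in the year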

By and large, pharmaceutical companies have been blamed for the high cost of medicines, with insurers shouldering some of the blame, thanks to complicated and ungenerous policies. Pharma has tried to justify its sometimes astronomical charges as necessary to support its research efforts, with the most recent industry-endorsed estimate for the cost of developing a new drug and bringing it to market now topping $2.6 billion. Other analyses attack the methodology used in this report to measure costs, arguing, for example, that it fails to take into account that much of the research that goes into discovering a new drug is funded by the NIH, not by the pharmaceutical industry. The result is a dramatic over-estimate of the cost borne by industry. Concerns about the role of drug companies and, to some extent, health insurers are entirely legitimate. But little attention has been paid to the role of drug stores in contributing to the high cost of medicines.

I did a little bit of investigating today. I looked at what two commonly used medications would cost a family like mine that had exceeded the $4000 deductible for its health plan, what they cost today (given that it’s only early March and most people haven’t had the opportunity to spend $4000 on medical care this year), and what they would cost if they were obtained from a Canadian mail-order pharmacy. Here’s what I found for one of the medicines, the widely used nonsteroidal anti-inflammatory drug celecoxib.

Celecoxib is used as a treatment for arthritis in people with certain gastrointestinal conditions because it’s a little less prone to exacerbate these problems than other anti-inflammatory drugs. It is available generically. The non-profit insurance company Harvard Pilgrim Health Care classifies generic celecoxib as a “tier 1” drug. That means that the cost of a 3-month mail order supply of the medication (100 mg taken once a day) from Walgreen’s, the pharmacy with which Harvard Pilgrim has a contract, would be $10. But until the deductible is met, Harvard Pilgrim doesn’t pay for medications, so the cost would be a whopping $156.51 for 90 pills (which, incidentally, isn’t even quite a 3-month supply since last I looked, a year has 365 days, not 360 days). So I contacted a pharmacy in Canada, identifying one approved by CIPA, the Canadian association of licensed retail pharmacies. I found a drug store that will supply 120 pills for $25.99 (plus a small shipping charge). That comes out to $1.74 per pill at Walgreen’s compared to 22¢ at the Canadian competitor: Walgreen’s costs eight times as much as the Canadian pharmacy. And the medication isn’t manufactured in some shady country with questionable oversight. It’s made in the UK.

How can this be? Is celecoxib a fluke? So I looked at another commonly prescribed medication, this time a drug classified as tier 3. I chose Vagifem, a vaginal estrogen tablet used to treat post-menopausal atrophic vaginitis. The cost of a 3-month supply through the health plan—after using the entire $4000 deductible? $80. The mail order cost from Walgreen’s today, assuming the deductible hasn’t been spent? $360. The cost from the Canadian pharmacy? $55. Made in the UK. Walgreen’s is six and a half times as expensive.
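
For anyone who wants to check the arithmetic, here it is in miniature, using only the prices quoted above (the Vagifem comparison is per 3-month supply rather than per pill, since I didn’t track its pill count):

    # Price comparisons using the figures quoted in the text.
    celecoxib_walgreens = 156.51 / 90   # dollars per pill, deductible not yet met
    celecoxib_canada = 25.99 / 120
    print(round(celecoxib_walgreens, 2),                     # 1.74
          round(celecoxib_canada, 2),                        # 0.22
          round(celecoxib_walgreens / celecoxib_canada, 1))  # 8.0x

    vagifem_ratio = 360 / 55            # same 3-month supply, two prices
    print(round(vagifem_ratio, 1))      # 6.5x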

What’s going on here? I’ll leave that to the policy wonks, but maybe they should look at the behavior of pharmacies as well as drug companies and health insurers. After all, this isn’t just a case of generic drugs costing almost as much as their brand name equivalents, which is still another problem for consumers. Meanwhile, importing drugs from Canada is a valuable option. It’s not legal to re-import medication for sale or to import restricted drugs such as opioids, but the law on importing medicines for personal use is a bit fuzzy, or at least its enforcement is. With consumers shouldering an ever-increasing proportion of health care costs, and no prospect for relief in sight, there’s a strong incentive to look north.

February 25, 2017

Stiff Upper Lip?

The British, I’ve argued previously, are ahead of us in health care for older people. They have more robust geriatric and palliative care programs than we do. They screen for frailty in older people and have a strategy for addressing the needs of those found to be frail. They devote a larger fraction of their resources to primary care (as opposed to specialty care) than we do, which benefits the aging. And data from the Commonwealth Fund consistently show that even though the UK spends a smaller percentage of its GDP and much less per capita on health care than does the US, health outcomes are typically at least as good and often better. In the fund’s most recent report, for example, the US does well in cancer care but has higher mortality from ischemic heart disease and higher rates of diabetic complications than the UK: the death rate from IHD was 128/100,000 in the US compared to 98/100,000 in the UK, and amputations in diabetics occurred at a rate of 17.1/100,000 in the US compared to 5.1/100,000 in the UK. So the report published this month called “Health and Care of Older People in England 2017” was of great interest.

The basic demographic reality in England is the same as in the United States: the population is aging, and the oldest old, those over age 85, are the fastest growing subset of the older cohort. And the economic reality in England may well foreshadow its American counterpart: over the last several years, the UK has been in the grip of belt-tightening fever, as government spending on both medical care and social services has been cut or its rate of growth slowed. The net effect is that gains in life expectancy leveled off by 2011 and, more alarmingly, disability-free life expectancy at age 65 has been falling since 2011. Between 2005 and 2011, older women gained a full half year of good health and men gained 0.3 years. Since then, most of those gains have been lost. Another result is that over a single year, there has been an 18 percent increase in the number of people who do not get the basic help they need with their activities of daily living.

The authors of the study conclude that the “massive reduction in publicly funded social care has had a severe impact on older people, families, and carers.” Five years of cutbacks have led to a 26 percent increase in the number of older people with unmet needs for care and support. And this is in a country where there is a lower rate of obesity and fewer chronic diseases per person than in the US.

What’s particularly interesting is that the UK has for years devoted far more resources to social support for older people than has the US. The possibility that the mediocre or downright poor health outcomes for Americans (despite a per capita medical expenditure more than double that of other developed countries) are attributable to lack of spending on social services was first raised by Elizabeth Bradley at Yale. She found intriguing evidence that the added dollars lavished on physician care, hospital care, and diagnostic tests, among other outlays, were not nearly as valuable as money spent on supporting caregivers and home care. And a recent RAND study, “Are Better Health Outcomes Related to Social Expenditures?”, which was commissioned to challenge Bradley’s findings, instead confirmed them. Moreover, this analysis concluded that public social expenditures (as opposed to the private ones that are favored in the US) have a particularly strong relationship with health outcomes. It also found that certain social expenditures, such as spending on old age care, translate into better health outcomes throughout the life cycle (i.e., support middle-aged caregivers and both they and their children are healthier). Finally, the study concluded that the role of social expenditures is magnified in countries with a high degree of income inequality—such as the US.

The US is on the brink of rolling back government programs. Presumably, what little support is currently provided to older people and their families is a candidate for the chopping block. The British experience shows us what sort of improvements in health and well-being are achievable for older people--and also what happens when social programs are cut. Caveat emptor!

February 21, 2017

An Ounce of Prevention

Kaiser Health News, one of the best sources of reporting about issues affecting older people, ran a story last week about the re-emergence of “death panel” agitation. Most of us thought this non-issue was dead, but apparently Representative Steve King of Iowa has decided that the decision by CMS to reimburse physicians for advance care planning discussions should be euthanized by Congress. Accordingly, he has introduced a bill called “Protecting Life Until Natural Death” with the explicit goal of instructing CMS to stop paying for conversations about the end of life. Which is too bad, since CMS just reported that in the first six months of 2016 alone, close to 14,000 clinicians billed for such discussions with 223,000 patients.

The irony is that the very idea of discussions by patients and their families about how they wish the end of life to unfold was spurred by a concern that patients aren’t being allowed to die a “natural death.” Instead, they have been forced to endure a technological death, death on a ventilator, in an ICU, while iatrogenesis-inducing medication is pumped in. In fact, as Representative King may or may not be aware, some physicians and ethicists advocate substituting the phrase “allow natural death” for the still oft-misinterpreted “do not resuscitate.”

There’s another reason that the proposed legislation is misguided. While advance care planning conversations are often advocated as a means of avoiding unwanted medical intervention near the end of life, they are better characterized as preventive medicine. Enabling people to talk about what matters to them and how they wish to be treated if they are very ill, approaching the end of life, and unable to speak for themselves, has the potential to ensure that patients are neither over-treated nor under-treated. It gives them the opportunity to state clearly and unambiguously that they would want to be put on a ventilator if they develop respiratory failure in the setting of advanced emphysema, however small the likelihood that they will be able to be weaned from the machine. It gives them the chance to say explicitly that they would want to be maintained with a feeding tube if they are in a persistent vegetative state, even if there is no chance of ever emerging from that condition.

What advance care planning does is enhance patient choice. It doesn’t give government—or physicians, or health care surrogates, or families—the right to decide what treatment a patient will receive when he or she is dying. It assures that patients will make their own decisions about what kind and how much medical treatment they want. Surely that’s what Representative King wants for himself.

February 12, 2017

The Price of Tom Price

The Senate confirmed Tom Price (R-Georgia) this week by a 52-47 vote as the new Secretary of the Department of Health and Human Services. Much of the debate focused on Price’s ethically and legally dubious stock purchases. He bought stock in a medical device company--and then promptly authored a bill to increase Medicare reimbursements for that company’s products. Attention to Price’s many apparent conflicts of interest is important, but the matter should be taken up by the SEC as part of an investigation of insider trading. Unfortunately, with all the attention paid to financial shenanigans, there was correspondingly less attention paid to what Tom Price would try to do to Medicare and Medicaid.

In fact, there’s a great deal of speculation about what Tom Price believes or would do, and less reliable information about what he actually wants to do. What we do know is that he is an orthopedic surgeon (one of the medical device companies he invested in, and which stands to benefit from legislation he favors, is Zimmer, a leading manufacturer of artificial hips and knees) who strenuously dislikes the recently introduced “bundling” of payments for joint replacement surgery under Medicare. Under this plan, which so far seems to be lowering costs without adversely affecting quality, Medicare pays a single amount for all care involved in replacing a hip or knee: hospital care, the surgery itself, and post-surgical care for 90 days. Providers whose care costs less than the target amount stand to be paid a bonus and those whose care exceeds the target amount are hit with a penalty. Programs such as this one are piloted by the Center for Medicare and Medicaid Innovation, an agency authorized by the Affordable Care Act--and Price has specifically tried to de-fund the CMMI.
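
In skeleton form, the bundled-payment reconciliation works like this (a stylized sketch with hypothetical dollar figures; the actual program adds quality requirements and caps on gains and losses):

    def reconciliation(target: float, actual: float) -> float:
        """One bundled target covers hospital care, the surgery itself, and
        90 days of post-surgical care; positive = bonus, negative = penalty."""
        return target - actual

    print(reconciliation(26_000.0, 24_500.0))   # +1500.0: provider earns a bonus
    print(reconciliation(26_000.0, 27_800.0))   # -1800.0: provider owes a penalty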

We know that Price was one of the authors of “A Better Way,” the House Republican outline for replacing the ACA. This document strongly favors “premium support,” a voucher program that would give patients a fixed amount of money with which to purchase a (private) health insurance plan. While this might simply be what Medicare already does with respect to Medicare Advantage programs, the current alternative to fee-for-service Medicare, it raises the questions of whether the vouchers could be used to buy a conventional Medicare plan and of how much control CMS would have over what must be included in eligible plans.

We know that Price favors repeal of the ACA, which provides for free coverage of preventive services such as colon and breast cancer screening, and which has reined in Medicare costs by reducing payments to hospitals, skilled nursing facilities, and Medicare Advantage plans. Undoing the ACA has the potential to reverse all these trends. 

Finally, we know that Price is in favor of converting Medicaid to a block grant program—essentially turning it over to the states. Medicaid already demonstrates enormous state-to-state variation, with the contribution and standards of the federal government standing between a robust insurance plan and a total farce in states such as Alabama and Mississippi. Right now, 9 million of the 46 million Medicare enrollees are dually eligible—they receive both Medicaid and Medicare.

A far larger proportion of older eligible voters goes to the polls on election day than of any other age group. In 2016, voter turnout among the 65+ set was close to 60 percent; among those 18-29, it was under 20 percent.


Older people count in the eyes of our elected officials, if for no other reason than that they vote. Maybe those enrolled in Medicare didn’t realize what a Trump administration would mean for them. But with the appointment of Tom Price, we know a little more. It’s time for older people to speak up for Medicare.

February 06, 2017

The Last Stop

The United Kingdom is, in many respects, ahead of the United States in its approach to both geriatrics and palliative care. Cicely Saunders established the first modern hospice in London in 1967; the US did not open its first hospice until 1974, after Florence Wald spent a year at the St. Christopher’s Hospice in England to study under Saunders. 

While the US boasts that Dr. Ignatz Nascher—himself an immigrant from Austria—coined the term “geriatrics” in 1911, Nascher is not exactly a stellar role model. He wrote in his textbook, “Geriatrics: Diseases of Old Age and their Treatment,” that “We realize that for all practical purposes the lives of the aged are useless, that they are often a burden to themselves, their family and the community at large. Their appearance is generally unesthetic, their actions objectionable, their very existence often an incubus to those who in a spirit of humanity or duty take upon themselves the care of the aged.” Far more attractive a founding figure is Britain’s Marjory Warren, who created the first geriatric units in English hospitals in the 1940s and whose work led the National Health Service to recognize geriatrics as a specialty in 1947. The US medical establishment only came to see geriatrics as worthy of recognition four decades later—and instead of awarding the field specialty status, chose, starting in 1988, to allow physicians to receive a “Certificate of Added Qualifications in Geriatrics,” something less than full-fledged accreditation. The gap between the UK and the US remains to this day. So when the British report a study of the factors associated with whether people die in hospital or at home, it’s worth heeding their findings.

In both England and the US, most people who are asked where they would prefer to die say they want to be at home. Where people actually die is quite different. In England, 58 percent of people die in hospital and 18 percent at home. In the US in 2007, 24 percent of people over 65 died at home, up from 15 percent in 1989. The main change in the last decade, however, has been an increase in deaths in the nursing home: hospital deaths went from 38 percent to 35 percent, but nursing home deaths from 5 to 28 percent.

But England tried to do something about the discrepancy: it adopted the “End of Life Care Strategy” in 2008 to improve care in the final year of life and to prioritize home over hospital care. The new study examines what happened to patients dying of respiratory disease between 2001 and 2014. Among the 334,520 people who died of chronic obstructive pulmonary disease and the 45,712 who died of interstitial lung disease, hospital deaths fell by 6 and 3 percent, respectively, after the introduction of the End of Life Care Strategy. In the several years before the strategy was initiated, the proportion of pulmonary deaths occurring in the hospital had remained constant. But the improvements were wiped out for people who had multiple co-morbid conditions. And living in a city (especially London), lower socioeconomic status, and being married also increased the likelihood of dying in the hospital.

Another study, this one from Belgium, may shed some light on why it was so hard to keep people with chronic respiratory conditions, assorted co-morbidities, and limited resources out of the hospital. This study of family physicians, nurses, and family caregivers used focus groups and semi-structured interviews to figure out the pluses and minuses of hospital care. The participants identified the usual weaknesses of the hospital: inadequate expertise in symptom management, an excessive focus on curative care or on life-prolongation, and poor communication. But they also revealed that for many people, the acute hospital is a safe haven. It is a place that offers hope even to people who acknowledge that they are terminally ill. It provides continuous support and peace of mind. And it is a place of last resort for people whose families are having difficulty caring for them at home.

As my colleague Jim Sabin and I argued a few years ago in our paper “No Place Like the Hospital,” what people say they want (i.e., to die at home) when they are perfectly healthy may be quite different from what they actually want when they are seriously ill and imminently dying. It’s not surprising that the more complicated their medical problems and the more constrained their financial and familial resources, the more attractive the hospital seems. But with the growth of inpatient palliative care consultative services—67 percent of American hospitals now boast such a program—in-hospital care is improving. The findings of the Belgian study—inadequate pain management, poor communication, and excessive attention to life-prolonging therapy—are no longer universally applicable.

To improve care at the very end of life, we need to do a better job in both the home and the hospital setting. In both cases, what is needed is a potent injection of palliative care expertise. If care is in the hospital, the family physicians, specialists, and nurses providing treatment should be advised by palliative care specialists. If care is in the home, family caregivers should have the support and resources of a sophisticated palliative care team. The issue is not so much moving care from one site to another as optimizing care in each location.

January 30, 2017

Luck and Genes

My mother’s friend Lixie died last month. Eight months ago, my mother’s husband—my father—died. And just about exactly a year ago, my mother’s friend Walter died.

The three of them were all in their 90s: Lixie died 6 weeks after turning 92; my father also died 6 weeks after turning 92; Walter died 6 weeks before he would have been 92. My mother, who still lives independently though she is not as vigorous as she was a few years ago, reached age 91 in December.

They had something else in common: all three were born in Germany or Austria in the 1920s and left thanks to the efforts of a group of Belgian Jewish women who sought to rescue Jewish children from an uncertain fate. The group of 93 children stayed in Brussels until the Germans invaded Belgium. They then made their way to unoccupied France, where they found refuge until 1942, when France no longer provided a safe haven for them. My parents escaped individually to Switzerland and eventually, well after the end of the war, made their way to the US. Lixie remained in hiding in France until the end of the war. Walter was one of the few teenagers to manage to immigrate to the US during the war. The story of the “Children of La Hille” is told by Walter in a book published shortly before his death; I tell parts of the story in my memoir about my parents, Once They Had a Country.

Of the 93 children in the original group that made their way to Brussels, 82 survived the war.  And of those 82, many are living into their nineties. In addition to the four I mentioned above—my mother and the three who died within the past year—I know of another three who are alive and over ninety. There may be more. Surely this is more than one would expect in a cohort of people born in Europe in the mid-1920s.

Curious, I looked at what is known about the longevity of Jews who survived the trauma of 1939-1945 in Europe. And what I found was very interesting indeed. An article called “Against All Odds” found that survivors of “genocidal trauma” during World War II were likely to live longer than a comparable group not exposed to the same trauma.

The study looked at Israelis born in Poland who were between 4 and 20 years of age in 1939. The investigators compared those who came to Israel before 1939 with those who arrived between 1945 and 1950, defining as “Holocaust survivors” anyone who spent the war years in Europe, regardless of whether they were in a concentration camp, hiding in a convent, or on the run. The justification for this broad definition is that in all cases, their lives were in extreme jeopardy.

The authors of the study examined the experience of 41,454 Holocaust survivors and 13,766 controls. What they found was that Holocaust survivors were on average likely to live 6.5 months longer than those who were not in Europe during World War II. This despite ample prior evidence that Jews who spent some or all of the war years in Europe had a high rate of post-traumatic stress disorder in later life.

What does this mean? It’s impossible to be certain, but one possibility is that whatever factors led this high-risk group to survive under adversity also led them to survive into old age. And since there’s no reason to believe that just because you were lucky once, you’ll be lucky again, I suspect that a key factor is genes. Those Jewish children who managed to survive the war, including the Children of La Hille (who, because of the assistance they received, faced better odds than their counterparts who were not part of this group), were better equipped to endure. That capacity continued to help them for the remainder of their lives.

This explanation is, of course, entirely speculative. It’s conceivable that the longevity of the Children of La Hille is simply due to chance. But I am telling this story because it is a reminder that the experience of aging is shaped to a large extent by factors beyond our control—by luck and genes.

This doesn’t mean we shouldn’t try to improve our chances of survival by preventing whatever part of illness and disability is preventable. It doesn’t mean we shouldn’t do what we can by exercising and eating a good diet, by avoiding drugs and alcohol, and by controlling conditions such as high blood pressure. But let’s have the humility to remember that we have only a modest ability to determine our fate. All those who, unlike the Children of La Hille, don’t have good luck and good genes should nonetheless have access to the medical care, housing, and social services that allow them to have as good a quality of life as possible, however many years they live.


January 22, 2017

To the Barricades!

On this weekend of the Women’s Marches—175,000 of us marched in Boston alone—it’s fitting to remember that aging is predominantly a women’s issue. Robert Butler, the founding father of contemporary geriatrics, made the point powerfully and persuasively in a short article in the New England Journal of Medicine in 1996, "On Behalf of Older Women—Another Reason to Protect Medicare and Medicaid." Sadly, the observations and concerns he raised 20 years ago are exactly the ones we face today as President Trump nominates Tom Price, foe of Medicare and Medicaid, to serve as head of the Department of Health and Human Services, and Paul Ryan, Speaker of the House, hopes to finally succeed in carrying out his long-standing goal of privatizing Medicare.

Butler begins by saying that old age is “a territory populated largely by women.” Updating the data he presents: life expectancy at age 65 is 17.9 years for men and 20.5 years for women, which means that women who reach 65 typically outlive men by at least 2.5 years. Since death rates are higher for men throughout much of the lifecycle, there are currently 25.1 million older women in the US, compared to only 19.6 million older men. The ratio of men to women falls with age: in the 65-74 year old bracket, there are 86.9 men for every 100 women; among those over age 85, there are only 48.3 men for every 100 women.

Butler continues: “Proposals to curtail Medicare and Medicaid, if enacted, could leave beneficiaries, the majority of whom are women, paying more out of pocket for what may be less medical care.” He reminds us that the concern about Medicare and Medicaid has arisen “because political leaders want to balance the federal budget…while giving some Americans a tax cut,” not because of concern about quality of care. His words could have been written today instead of 20 years ago. And alas, older women are apt to live in poverty today, just as was the case when Butler wrote: the median income of older people in 2013 was $29,327 for men—but only $16,301 for women. Put differently, 6.6 percent of older men live below the poverty line, compared to 11 percent of older women.

The theme of aging as a women’s issue was picked up by acerbic social commentator Susan Jacoby in her 2011 book, Never Say Die: the Myth and Marketing of Old Age. She points out that the household income of women is cut in half when their husbands die. Unequal pay for equal work has a cumulative effect: pensions are lower for women. Women who take time out of work to raise a family are rarely able to compensate for the loss of wages, seniority, and missed promotions. Because women typically live longer than men, they are more likely to become frail, to develop dementia, and to be widowed. As a result, fully two-thirds of nursing home residents are female. And the issues that affect older women in general affect older women of color in spades.

As we pressure the government to preserve reproductive health rights, to institute equal pay for equal work, and to enforce laws that prohibit discrimination based on sexual orientation, we should also pressure government to maintain and improve health care for older women. That means protecting Medicare and Medicaid, subsidizing supportive housing, and assisting family caregivers--at last report, there were 34.2 million Americans providing unpaid care for an adult over age 50 and two-thirds of these caregivers were themselves over 65. So to the barricades!
