One of the hardest things to get a handle on is the difference between the lives we lead and those of our parents and grandparents, until, by accident, you see it in a movie. Take “Seabiscuit,” for example.
Director Gary Ross intersperses his otherwise-quite-riveting tale of a come-from-behind horse in the Great Depression with newsreel film of men and women in the first third of the 20th century. You can see in their gaunt faces the realism with which they calculate life’s odds.
What’s the difference between the actor Tobey Maguire and the jockey whom he plays in the film?
About thirty years of life expectancy at birth. (That’s why the film’s juxtaposition of the actors and the documentary footage is so jarring.)
What’s likely to be the difference between the present-day Maguire and the next generation? That is the really interesting question.
You won’t be surprised to hear that — in some very complicated way — it will depend on what it costs.
The contribution of improved health care to living standards in the 20th century may be widely understood in visceral terms like those of the movie, but it is little reflected in the national income accounts.
Instead, what is most visible in everyday statistics is the growth of health-care spending — from around 5 percent of GDP in 1960 to around 14 percent today and rising.
That growth is often attributed to something called “cost disease,” so named ever since economist William Baumol diagnosed it in a famous 1967 paper (“Macroeconomics of Unbalanced Growth: The Anatomy of Urban Crisis”).
The idea is that doctors and nurses aren’t much more productive than they were forty years ago; nor are teachers, entertainers, policemen, auto repairmen or fiddlers in string quartets. Some activities just can’t be made more productive. We are doomed to pay some workers more and more for the same amount of work.
But “cost disease” is mostly bunk, because it relies on measures of input prices instead of output prices. The cost of playing a Mozart quartet may not have changed much since the piece was written. The cost of hearing it played has plummeted to nearly zero, thanks to recording and telecommunications technology. Much the same is true of health care: great as the resources we put into it are, the value of what we take away is much, much greater.
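A toy calculation makes the input-price versus output-price distinction concrete. This is purely illustrative; every figure below (wages, recording cost, play counts) is an invented assumption, not data.

```python
# Input cost of a performance vs. output cost of a listening.
# Every number here is an invented assumption for illustration.

musicians = 4
hours_per_performance = 2      # rehearsal time ignored for simplicity
wage_then = 0.10               # stylized wage per musician-hour, Mozart's day
wage_now = 50.00               # stylized wage per musician-hour, today

# The input cost of one live performance, the quantity "cost disease" tracks:
cost_then = musicians * hours_per_performance * wage_then
cost_now = musicians * hours_per_performance * wage_now
print(f"Input cost per performance: then ${cost_then:.2f}, now ${cost_now:.2f}")

# The output cost per listening: one recording session amortized over
# every subsequent play.
recording_cost = 10_000        # assumed one-time production cost
plays = 5_000_000              # assumed lifetime plays of the recording
print(f"Output cost per listening: ${recording_cost / plays:.4f}")
```

The input cost rises with wages, as Baumol said; the output cost collapses, and the output is what listeners actually pay for.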
A few years ago University of Chicago economists Robert Topel and Kevin M. Murphy organized a conference on the economics of improving health. (The papers have just appeared as a book.) Yale economist William Nordhaus made the star contribution.
Using survey-established values for an additional year of life (how much consumption would you give up to live another year? $2,600 or so is a common answer), Nordhaus calculated that the unmeasured gains in life expectancy were pretty much as great as all the measured 20th-century growth in non-health goods and services. In other words, we were twice as well off as we thought we were.
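The flavor of that accounting fits in a few lines. Here is a minimal sketch, assuming round placeholder numbers; the life-expectancy figures, consumption levels, and the doubling of consumption are illustrative assumptions, not Nordhaus’s actual inputs.

```python
# A toy version of the Nordhaus-style comparison. All inputs are
# placeholder assumptions, not figures from his paper.

value_of_life_year = 2_600     # survey answer cited above: consumption one
                               # would give up to live another year
le_1900, le_2000 = 47.0, 77.0  # approximate US life expectancy at birth
years_gained = le_2000 - le_1900

# Lifetime value of the longevity gain, for one person:
longevity_value = years_gained * value_of_life_year

# Lifetime value of measured growth in annual consumption (assumed to
# have doubled over the century, purely for illustration):
consumption_1900 = 1_300
consumption_2000 = 2_600
consumption_gain = (consumption_2000 - consumption_1900) * le_2000

print(f"Longevity gain per person:   ${longevity_value:,.0f}")
print(f"Consumption gain per person: ${consumption_gain:,.0f}")
```

With these made-up inputs the two gains come out in the same ballpark, which is the Nordhaus point: counting longevity roughly doubles the century’s measured improvement.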
His finding gave extra oomph to Murphy and Topel’s own conclusion at the conference that, if anything, the US was spending too little on medical research, even though it was spending five times as much as Europe ($18.4 billion in 2000, compared with $3.7 billion in all of Europe).
Given that a 10 percent reduction in deaths from cancer and heart disease alone would add $10 trillion to national wealth, the equivalent of a year’s GDP, they argued that even high-priced drugs were probably repaying their development costs many times over.
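The order of magnitude is easy to check with a hedged back-of-the-envelope calculation; the death counts, the value of a statistical life, and the discount rate below are round assumptions, not Murphy and Topel’s own inputs.

```python
# Back-of-the-envelope check on the $10 trillion figure.
# All inputs are round assumptions for illustration.

deaths_per_year = 1_250_000    # rough annual US deaths, cancer + heart disease
reduction = 0.10               # the hypothesized 10 percent reduction
value_of_statistical_life = 5_000_000   # a commonly assumed VSL, in dollars
discount_rate = 0.06           # used to capitalize the perpetual flow

annual_benefit = deaths_per_year * reduction * value_of_statistical_life
capitalized = annual_benefit / discount_rate
print(f"Annual benefit: ${annual_benefit / 1e9:.0f} billion")
print(f"Capitalized:    ${capitalized / 1e12:.1f} trillion")
```

Roughly $625 billion a year, or on the order of $10 trillion capitalized, so the headline number is at least arithmetically plausible.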
It’s not that there isn’t plenty of waste in the health care system — everybody knows there is. But as long as you accept that increasing longevity is an economic good worth paying for, then the medical-industrial establishment is paying its way, and then some. It’s technological advance — and public health, and diet, and nutrition — that’s driving both the longevity and the increasing cost.
Still, there are big differentials between the experiences of the US and other industrial nations. For example, the US and the United Kingdom both saw big gains in life expectancy between 1960 and 1997, about 7 years in each case. But the share of spending devoted to health care rose by 8 percentage points in the US and only about 3 percentage points in the UK. What was that all about?
Charles Jones of the University of California at Berkeley has devised a model that provides considerable insight. Citing a careful survey by Harvard’s Joseph Newhouse that attributed the bulk of the rise in health expenditures to the increased capabilities of medicine, Jones argues that health expenditures were low by default until the new technologies were discovered — MRIs, arthroscopic surgery, antibiotics, angioplasty and various psychotropic drugs.
Then the Medicare transfer program in the United States came along at just the right moment to ensure that the elderly could live as long as technically feasible. Thus Jones built a stripped-down model in which the tax rate is determined by technological progress: the faster the progress, the larger the share of GDP devoted to health care.
It turns out that, when all the data are fit and snug, what drives the model is the health expenditure share at the end of life. Patients in their last few years are fairly heavily subsidized; if that subsidy weren’t present, many people would die sooner rather than later. Some 30 percent of all Medicare spending goes to patients in their last year of life.
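To see how a technology parameter alone can pull the health share up, here is a toy numerical sketch. It is emphatically not Jones’s actual specification: society splits income between consumption and health spending, health spending buys life-years with an effectiveness governed by a parameter A, and the best split is found by brute-force search.

```python
# A toy Jones-flavored tradeoff, not his actual model: pick the health
# share s of income y that maximizes life-years times utility of
# consumption, where medical technology A governs how effectively
# health spending converts into longevity.
import math

def optimal_health_share(A, y=1.0, grid=10_000):
    best_s, best_u = 0.0, -math.inf
    for i in range(1, grid):
        s = i / grid
        c = (1 - s) * y                   # consumption
        life = 50 + A * (s * y) ** 0.5    # life-years rise with health spending
        u = life * math.log(1 + c)        # lifetime utility
        if u > best_u:
            best_s, best_u = s, u
    return best_s

for A in (1, 10, 50, 200):                # steadily improving technology
    print(f"A = {A:>3}: optimal health share = {optimal_health_share(A):.2f}")
```

With a primitive technology the chosen share sits near zero, the pre-MRI default; as A rises, the share climbs steadily, with no change in tastes at all.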
Jones examines a range of possible values for the parameters of his model, including the forecast of an expert panel advising the Medicare Trustees that health expenditures might climb to 25 percent of GDP by 2050 and 38 percent by 2075.
He concludes that society’s willingness to transfer more and more resources to persons near the end of life is the crucial factor. “If society decides to cap the transfer rate,” he writes, “those forecasts could be far from the mark.”
Indeed, underlying attitudes towards death itself may account for the differences in aggregate spending between the US and the UK experiences, Jones speculates. The US may have allowed technological considerations to determine its Medicare spending to this point, while perhaps the UK has not.
In any event, as much as 4 percent of US GDP today may be spent on patients in their last year of life. Yet large differences in aggregate spending bring only small gains in life expectancy, at least in the model. Once a cap on transfers to the high-cost elderly is reached, however, the share of health expenditure may cease to grow and may even shrink somewhat, as a “dilution effect” sets in, associated with rising life expectancy among low-cost healthy people.
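The dilution effect itself is simple arithmetic. A sketch with made-up numbers: if per-person spending on the small high-cost group is capped, adding healthy, low-cost person-years raises income faster than spending, and the aggregate share drifts down.

```python
# Toy arithmetic for the "dilution effect." All figures are invented
# assumptions: populations in millions, dollar amounts per person.

def health_share(healthy_pop, end_of_life_pop,
                 healthy_cost=3_000, end_of_life_cost=60_000,
                 income_per_person=40_000):
    spending = healthy_pop * healthy_cost + end_of_life_pop * end_of_life_cost
    income = (healthy_pop + end_of_life_pop) * income_per_person
    return spending / income

# Rising life expectancy adds mostly healthy, low-cost person-years,
# while the capped end-of-life group stays the same size:
for healthy in (200, 240, 280):
    print(f"healthy pop {healthy}m: health share = "
          f"{health_share(healthy, end_of_life_pop=2.5):.1%}")
```

The share slips from about 9.3 percent toward 8.8 percent as the healthy population grows, even though nobody’s care was cut.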
Clearly there will be much tension over health-care spending in the future. But there is considerable consolation in knowing that we’re getting something for our money in the form of better health and longer lives.
Remember, a century ago around 40 percent of the American workforce was on farms. Today the figure is around 2 percent. So who’s to say with any confidence whether the right figure for health care is 20 percent or 25 percent or even 30 percent? These things do move around.