
A Prescription For the Health Care Crisis

With all the shouting going on about America's health care crisis, many are probably finding it difficult to concentrate, much less understand the causes of the problems confronting us. I find myself dismayed at the tone of the discussion (though I understand it; people are scared), as well as bemused that anyone would presume themselves qualified to know how best to improve our health care system simply because they've encountered it, when people who've spent entire careers studying it (and I don't mean politicians) aren't sure what to do themselves.


Albert Einstein is reputed to have said that if he had an hour to save the world, he'd spend 55 minutes defining the problem and only 5 minutes solving it. Unfortunately, our health care system is far more complex than most who are offering solutions admit or recognize. Unless we focus most of our efforts on defining its problems and thoroughly understanding their causes, any changes we make are just as likely to make them worse as better.

Though I've worked in the American health care system as a physician since 1992 and have seven years' worth of experience as an administrative director of primary care, I don't consider myself qualified to thoroughly evaluate the viability of most of the suggestions I've heard for improving our health care system. I do think, however, that I can contribute to the discussion by describing some of its troubles, taking reasonable guesses at their causes, and outlining some general principles that should be applied in attempting to solve them.

THE PROBLEM OF COST

No one disputes that health care spending in the U.S. has been rising dramatically. According to the Centers for Medicare and Medicaid Services (CMS), health care spending is projected to reach $8,160 per person per year by the end of 2009, compared with $356 per person per year in 1970. Moreover, health care spending has grown roughly 2.4% faster per year than GDP over the same period. Though GDP varies from year to year and is therefore an imperfect yardstick for comparing the rise in health care costs to other expenditures, we can still conclude from this data that over the last 40 years, the percentage of our national income (personal, business, and governmental) we've spent on health care has been rising.

Despite what most assume, this may or may not be bad. It all depends on two things: the reasons why spending on health care has been increasing relative to our GDP and how much value we’ve been getting for each dollar we spend.

WHY HAS HEALTH CARE BECOME SO COSTLY?

This is a harder question to answer than many would believe. First, the rise in the cost of health care (on average 8.1% per year from 1970 to 2009, calculated from the data above) has exceeded the rise in inflation (4.4% on average over that same period), so we can’t attribute the increased cost to inflation alone. Health care expenditures are closely associated with a country’s GDP (the wealthier the nation, the more it spends on health care), yet even in this, the United States remains an outlier (figure 3).
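As a back-of-the-envelope check on those growth figures, here's a minimal sketch of the compound annual growth rate implied by the CMS numbers cited above. The only inputs are the $356, $8,160, and 4.4% figures already quoted:

```python
# Back-of-the-envelope check of the growth figures cited above,
# using only the numbers quoted in the text.

spend_1970 = 356      # per-person health care spending, 1970 ($)
spend_2009 = 8_160    # projected per-person spending, 2009 ($)
inflation = 0.044     # average annual inflation over the same period

# Compound annual growth rate over an n-year span: (end/start)^(1/n) - 1.
# Treating 1970-2009 as a round 40-year span reproduces the 8.1% figure.
years = 40
cagr = (spend_2009 / spend_1970) ** (1 / years) - 1

print(f"Average annual growth in spending: {cagr:.1%}")       # ~8.1%
print(f"Average annual inflation:          {inflation:.1%}")  # 4.4%
print(f"Real (inflation-adjusted) growth:  {(1 + cagr) / (1 + inflation) - 1:.1%}")
```

In other words, even after subtracting inflation, per-person spending has grown roughly 3.6% per year in real terms, which is why the increase can't be attributed to inflation alone.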

Is it because of spending on health care for people over the age of 75 (five times what we spend on people between 25 and 34)? In a word, no. Studies show this demographic trend explains only a small percentage of health expenditure growth.

Is it because of monstrous profits the health insurance companies are raking in? Probably not. It's admittedly difficult to know for certain, as not all insurance companies are publicly traded, and therefore not all have financial statements available for public review. But Aetna, one of the largest publicly traded health insurance companies in North America, reported a 2009 second-quarter profit of $346.7 million, which, projected out, predicts a yearly profit of around $1.3 billion from the approximately 19 million people it insures. If we assume its profit margin is average for the industry (even if untrue, it's unlikely to be off by orders of magnitude), the total profit for all private health insurance companies in America, which insured 202 million people (2nd bullet point) in 2007, would come to approximately $13 billion per year. Total health care expenditures in 2007 were $2.2 trillion (see Table 1, page 3), which yields a private health insurance industry profit of approximately 0.6% of total health care costs (though this analysis mixes data from different years, it can perhaps be permitted, as the numbers aren't likely to differ by an order of magnitude).
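To make the extrapolation explicit, here's a minimal sketch of the same arithmetic. All inputs are the figures just cited; the assumption that Aetna's margin is typical of the industry is, as noted, a guess, and the small differences from the ~$13 billion and ~0.6% above come only from where you round:

```python
# Sketch of the insurer-profit extrapolation described above.
# Scaling Aetna's profit to the whole industry assumes its margin is
# roughly typical of private insurers (a stated guess, not hard data).

aetna_q2_2009_profit = 346.7e6   # Aetna's Q2 2009 profit ($)
aetna_insured        = 19e6      # people Aetna insures
privately_insured    = 202e6     # people with private insurance, 2007
total_spending_2007  = 2.2e12    # total U.S. health care spending, 2007 ($)

aetna_annual_profit = aetna_q2_2009_profit * 4   # ~$1.39 billion/year
industry_profit = aetna_annual_profit * (privately_insured / aetna_insured)

print(f"Projected industry profit: ${industry_profit / 1e9:.1f} billion/year")
print(f"Share of total spending:   {industry_profit / total_spending_2007:.1%}")
# Prints ~$14.7 billion and ~0.7%; rounding the annual profit down to
# $1.3 billion first, as the text does, lands near the ~$13 billion and
# ~0.6% cited. Either way, well under 1% of total spending.
```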


Is it because of health care fraud? Estimates of losses due to fraud range as high as 10% of all health care expenditures, but it's difficult to find hard data to back this up. Though some percentage of fraud almost certainly goes undetected, perhaps the best way to estimate how much money is lost to fraud is by looking at how much the government actually recovers. In 2006, this was $2.2 billion, only 0.1% of the $2.1 trillion (see Table 1, page 3) in total health care expenditures for that year.

Is it due to pharmaceutical costs? In 2006, total expenditures on prescription drugs were approximately $216 billion (see Table 2, page 4). Though this amounted to 10% of the $2.1 trillion (see Table 1, page 3) in total health care expenditures for that year and must therefore be considered significant, it remains only a small percentage of total health care costs.

Is it from administrative costs? In 1999, total administrative costs were estimated to be $294 billion, a full 25% of the $1.2 trillion (Table 1) in total health care expenditures that year. This was a striking percentage in 1999, and it's hard to imagine it has shrunk to any significant degree since then.
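Pulling the last three paragraphs together, here's a quick recomputation of each component's share of that year's total spending, using only the figures quoted above:

```python
# Shares of total health care spending for the cost components discussed
# above, each paired with total spending for the matching year.

components = {
    # name: (component cost $, total health spending that year $)
    "Fraud recovered (2006)":    (2.2e9, 2.1e12),
    "Prescription drugs (2006)": (216e9, 2.1e12),
    "Administration (1999)":     (294e9, 1.2e12),
}

for name, (cost, total) in components.items():
    print(f"{name:<26} {cost / total:5.1%} of that year's spending")

# Fraud recovered (2006)      0.1% of that year's spending
# Prescription drugs (2006)  10.3% of that year's spending
# Administration (1999)      24.5% of that year's spending
```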

In the end, though, two things have probably contributed most to the increase in health care spending in the U.S.:

1. Technological innovation.

2. Overutilization of health care resources by both patients and health care providers themselves.

Technological innovation. Data proving that increasing health care costs are due mostly to technological innovation is surprisingly difficult to obtain, but estimates of technology's contribution to the rise in health care costs range anywhere from 40% to 65% (Table 2, page 8). Though we mostly have only indirect evidence for this, several examples illustrate the principle. Heart attacks used to be treated with aspirin and prayer. Now they're treated with drugs to control shock, pulmonary edema, and arrhythmias, as well as thrombolytic therapy, cardiac catheterization with angioplasty or stenting, and coronary artery bypass grafting. You don't have to be an economist to figure out which scenario ends up being more expensive. We may learn to perform these same procedures more cheaply over time (the same way we've figured out how to make computers cheaper), but as the cost per procedure decreases, the total amount spent on the procedure often goes up, because the number of procedures performed goes up. For example, a laparoscopic cholecystectomy costs 25% less than an open cholecystectomy, but the rate at which the procedure is performed has increased by 60%. As technological advances become more widely available, they become more widely used, and one thing we're great at doing in the United States is making technology available.
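The arithmetic behind that example deserves to be spelled out: a 25% lower price per procedure combined with a 60% higher procedure rate still increases total spending by 20%. A minimal sketch (the baseline values are normalized illustrations, not real prices):

```python
# Why cheaper procedures can still raise total spending: unit price falls
# 25%, but volume rises 60%, so the product (total spending) goes up.

baseline_price, baseline_volume = 1.0, 1.0   # normalized baseline
new_price  = baseline_price * (1 - 0.25)     # 25% cheaper per procedure
new_volume = baseline_volume * (1 + 0.60)    # 60% more procedures

change = (new_price * new_volume) / (baseline_price * baseline_volume) - 1
print(f"Change in total spending: {change:+.0%}")  # +20%
```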

Overutilization of health care resources by both patients and health care providers themselves. We can easily define overutilization as the unnecessary consumption of health care resources. What's not so easy is recognizing it. Every year from October through February, the majority of patients who come into the Urgent Care Clinic at my hospital are, in my view, doing so unnecessarily. What are they coming in for? Colds. I can offer support, reassurance that nothing is seriously wrong, and advice about over-the-counter remedies, but none of these things will make them better faster (though I can often reduce their level of concern). Further, patients have a hard time believing the key to arriving at a correct diagnosis lies in history gathering and careful physical examination rather than in technology-based testing (not that the latter isn't important, just less so than most patients believe). Just how much patient-driven overutilization costs the health care system is hard to pin down, as we have mostly anecdotal evidence like the above.

Further, doctors often disagree among themselves about what constitutes unnecessary health care consumption. In his excellent article, “The Cost Conundrum,” Atul Gawande argues that regional variation in the overutilization of health care resources by doctors best accounts for the regional variation in Medicare spending per person. He argues that if doctors could be motivated to rein in their overutilization in high-cost areas of the country, it would save Medicare enough money to keep it solvent for 50 years.

This seems a reasonable approach. To get it to happen, however, we need to understand why doctors overutilize health care resources in the first place:

1. Judgment varies in cases where the medical literature is vague or unhelpful. When faced with diagnostic dilemmas or diseases for which standard treatments haven't been established, variation in practice invariably occurs. For example, if a primary care doctor suspects her patient has an ulcer, does she treat empirically herself or refer the patient to a gastroenterologist for an endoscopy? If certain "red flag" symptoms are present, most doctors will refer. If not, some will and some won't, depending on their training and the intangible exercise of judgment.

2. Inexperience or poor judgment. More experienced physicians tend to rely on histories and physicals more than less experienced physicians do, and consequently order fewer and less expensive tests. Studies suggest primary care physicians spend less money on tests and procedures than their sub-specialty colleagues but obtain similar and sometimes even better outcomes.

3. Fear of being sued. This is especially common in Emergency Room settings but extends to almost every area of medicine.

4. Patients tend to demand more testing rather than less, as noted above, and physicians often have difficulty refusing their requests for many reasons (e.g., wanting to please them, fear of missing a diagnosis and being sued, etc.).

5. In many settings, overutilization makes doctors more money. As a result, doctors have no reliable incentive to limit their spending unless their pay is capitated or they're receiving a straight salary.

Gawande’s article implies there exists some level of utilization of health care resources that’s optimal: use too little, and you get mistakes and missed diagnoses; use too much, and excess money gets spent without improving outcomes, paradoxically sometimes resulting in outcomes that are actually worse (likely as a result of complications from all the extra testing and treatments).

How can we get doctors to employ uniformly good judgment, ordering the right number of tests and treatments for each patient (the "sweet spot") to yield the best outcomes with the lowest risk of complications? Not easily. There is, fortunately or unfortunately, an art to good health care resource utilization. Some doctors are more gifted at it than others. Some are more diligent about keeping current. Some care more about their patients. An explosion of studies of medical tests and treatments has occurred in the last several decades to help guide doctors in choosing the most effective, safest, and even cheapest ways to practice medicine, but the diffusion of this evidence-based medicine is a tricky business. That beta-blockers have been shown to improve survival after heart attacks, for example, doesn't mean every physician knows it or prescribes them. Data clearly show many don't. How information spreads from the medical literature into medical practice is a subject worthy of an entire post unto itself. Unfortunately, getting it to happen uniformly has proven extremely difficult.