

What's Wrong with U.S. Health Care, and How We Can Make Things Right

We must change the chaotic methods by which Americans pay for health care.

by Neil A. (Tony) Holtzman, M.D., M.P.H.
12 May 2009
The United States spends a greater share of its wealth on health care than any other nation. Yet 22 developed countries have longer life expectancy at birth and 25 have lower infant mortality rates. We also have among the highest death rates due to conditions that could be prevented by timely and effective health care. In these and other measures we have fallen further behind over the last ten years. Our total expenditure for health per capita is almost 1.5 times that of Norway, the next most expensive nation. Per capita, the money we pay out privately exceeds the total expenditures of most other countries, where most health care is publicly funded.

Why are we paying so much and getting so little? Most discussions dwell on the payment side, and I too will begin by examining how our health care is paid for. What is missing is a consideration of two additional key elements: how charges for medical care are established, and how care is provided. Attending to these issues will be crucial to the success of any new initiative to both improve care and reduce costs.

Medicare, the only health care to which any Americans are "entitled," pays a large portion of care for those who are age 65 and over. For two other programs to which state and federal governments contribute, Medicaid and S-CHIP, people and families must demonstrate they are poor enough to qualify. They contribute to the costs on a sliding scale based on income. There are no eligibility requirements for government-supported Community Health Centers, but users contribute on a sliding scale as well. More centers are needed, as both Presidents Bush and Obama have recognized.

Employment-based insurance covers three-fifths of the population under 65. Employers cannot refuse to hire someone because of disability and, once hired, workers cannot be denied the health benefit offered by their employer. Increasingly, however, workers face higher deductibles as well as co-payments. Private insurance companies often administer employer-based group plans. Employers of a small number of workers (usually fewer than twenty) generally cannot afford to provide health benefits, so their workers must seek individual insurance coverage elsewhere or remain uninsured. At least three-quarters of the uninsured have jobs but are not offered health insurance by their employers. Individual insurance policies are available to them and to self-employed people. About 15% of the population, or 46 million people, did not have health insurance in 2007. The number is growing.

Why are health care costs so high? Medicare and the other government programs, as well as private insurers, have to pay at least part of the prices charged by hospitals, health care providers and, if there is a drug benefit, by pharmaceutical companies.

In the majority of employer plans, providers bill the plan per service performed, whether it is a visit to a physician or a hospital stay. As the employer has put up a fixed, equal amount for each employee, the administrators must first decide whether the services for which the plan is billed are covered by the plan. When they are, they must estimate whether paying the entire amount billed for all services per year will exceed the funds the employer has budgeted for health care. If payments are likely to exceed the budget, employees must pay the difference out-of-pocket. (Of all health care spending in the US, 13 percent comes from out-of-pocket payments. Out-of-pocket expenses are paid from taxable income, whereas the employer's health benefit is not taxed for either the employee or the employer.) With plans differing in their benefits packages, considerable administrative costs are incurred in deciding whether a bill for service is justified and, if so, how much to pay for it. Private insurers must also decide, based on past experience, how much an employer should be billed for the coming year. The process is much simpler, and consequently less costly, under Medicare.

A minority of employers prepay health maintenance organizations (HMOs) to provide services for each employee. To keep within its budget, the HMO may pay its physicians a salary or an annual fixed amount (capitation) for each patient they care for. Administrative costs are lower.

Private individual insurance is more expensive than large employer plans because the pool of insureds is smaller, so the cost of those who become sick must be covered by higher premiums. To limit that cost, the insurance company hires doctors to examine applicants and underwriters to determine whether coverage should be denied or surcharged. Although this screening is itself expensive, insurance companies go to this extreme to keep premiums down for healthy buyers and their profits up.

When Medicare became law, establishing the largest possible pool of older Americans, no private insurer could compete. A public plan that offered all people not entitled to Medicare a comprehensive package of health benefits at a uniform premium and with free choice of a physician would be less costly than either employer-based or individual private health insurance. About one-third of the uninsured are healthy young adults who would enlarge the pool and lower the premium. As they and other healthy people joined the public plan, the premium could be reduced further and coverage could become universal, especially as the reforms discussed below were adopted, further lowering costs.

Eliminate Pecuniary Interest in Health Care

One hundred years ago George Bernard Shaw wrote: “That any sane nation...should go on to give a surgeon a pecuniary interest in cutting off your leg, is enough to make one despair of political humanity.” We are still despairing a century later. The pecuniary interests of doctors and clinics amounted to 24% of the $2.2 trillion spent on health care services in the United States in 2007. An additional 31% went to hospitals, and 10% to prescription drugs. The remaining 35% was divided among nursing home, dental, administrative, research and other costs.

Most physicians are paid on a fee-for-service basis, just as in Shaw’s time. Today, you might think third-party payers have a say in setting prices for providers’ services. Nominally they do. However, Medicare and the private third-party payers who follow its lead rely heavily on the recommendations of a permanent committee of the American Medical Association (AMA); a majority of committee members are appointed by specialty societies. Remarkably, 90 percent of the committee’s recommendations on fee-for-service reimbursement become Medicare policy. If third-party payers don’t reimburse providers for the full amount allowed for a given service, the difference is passed on to patients in the form of co-payments.

In Shaw’s day, doctors contrived operations without a shred of evidence of their effectiveness. (Such surgery provides the backdrop of "The Doctor’s Dilemma," Shaw’s play whose preface I quoted above.) That is less likely today, but in the interim we have witnessed a proliferation of expensive technologies whose use extends beyond situations for which they are proven safe and effective. Several medical societies protested when Medicare, citing lack of proven clinical utility, proposed to reduce the number of reimbursable uses of CT angiography. The pressure from the medical societies was so intense that Medicare backed down, boosting the income of those who perform this procedure. (In this case, private insurers were not always so lenient, denying coverage for some uses allowed by Medicare.)

Reimbursing doctors only for services that have the potential to contribute to improved health would reduce costs under a fee-for-service scheme. Comparative effectiveness studies proposed by President Obama, for which funds have been allocated in the American Recovery and Reinvestment Act, are needed to establish what works and what doesn’t.

Paying physicians salaries or giving them an annual lump sum for every patient they see on a regular basis (capitation) are alternatives to fee-for-service. The AMA castigated these prepayment methods, used after World War II by Group Health of Puget Sound and Kaiser Permanente on the west coast and Health Insurance Plan and Group Health Association on the east coast. Some physicians who participated in prepayment plans were expelled by their local medical societies. Some were denied admitting privileges at local hospitals. Nevertheless, such plans have grown in the form of HMOs.

Turning to hospitals: Some reduction in payments to hospitals resulted when the amount a hospital could bill for inpatients under Medicare was based on each patient's particular diagnosis rather than on fees for services. Lengths of hospital stay were shortened, but hospitals expanded the use of outpatient services and same-day surgery. I know a Medicare patient whose bill for an uncomplicated one-night stay to have a stent placed came to $75,000. Another, who had a fractured arm braced, got a bill of $26,000 for an overnight hospitalization. People under 65 without insurance are forced into bankruptcy by bills like these.

Prominent hospitals, often university-affiliated, bill insurance companies higher rates for a given procedure than other hospitals. They get away with this because an insurance company that did not pay these higher rates would lose business to other companies that did. Yet the quality of care for routine problems, such as deliveries, hernias and pneumonia, is often no better at these high-priced institutions.

Turning to drugs: Prescription drugs in the United States cost, on average, about 20% more than in other countries, many of which have price controls. During the Bush years, Congress forbade Medicare from negotiating lower drug prices. Physicians continue to prescribe brand-name drugs even after their patents expire and lower-priced generics become available.

Along with prices, the use of prescription drugs continues to increase, but it is not clear that all of the increase is justified. Although the Food and Drug Administration (FDA) must decide that a drug is safe and effective for its intended use before it can be marketed, many drugs are prescribed for purposes that the FDA has not examined, so-called off-label uses. If the drugs are not safe and effective for these other purposes, which is often the case, spending on prescription drugs increases unnecessarily and people can be harmed. Congress should strengthen the FDA’s ability to evaluate new drugs and devices, and its surveillance of them after marketing. Off-label use should be limited to clinical problems for which there is no FDA-approved alternative, and only when patients are informed that the drug’s safety and effectiveness for that use have not been established.

Should a public plan be enacted and should policies be adopted that lower drug, hospital, and physician prices, care in the United States would still be more expensive and less effective than in many other countries.

Primary Care Should Be Our Primary Health Care Goal

The more a country’s health care system is oriented toward primary care, the lower its death rates, years of potential life lost, and health care expenditures. In a primary care system, people can freely choose the physician who is their first contact for all health problems and who provides continuing care. That physician consequently knows the patient better than specialists, who by definition are trained to deal with only a limited range of problems. The primary care provider, in conjunction with the patient, decides when specialty referrals are needed. Knowing the “whole” patient permits the primary care provider to tailor preventive medicine to the individual.

The United States is moving in the opposite direction. From 1997 to 2005, the number of US medical school graduates entering family medicine dropped by 50%. The percentage of internal medicine residents who planned careers in primary care rather than in subspecialties dropped from 54% to 25% from 1998 to 2004. Two factors account for these trends.

First, specialists dominate the training of physicians in most American medical schools, attracting students to the specialties. This has its roots early in the 20th century, when internships and residencies became a fixture of training to assist specialists in the care of hospitalized patients. Further expansion occurred after World War II. These trainees swelled the specialists’ ranks. The large investment the federal government has made in medical research at American medical schools since World War II also contributes unintentionally to physician specialization, as the fruits of much of that research have fallen into the hands of specialists. Of course, specialists are necessary, but we use them excessively, at greatly increased cost.

Second, primary care physicians earn much less than specialists, making it more difficult for them to pay off the debts they incurred to go to medical school. The gap between the earnings of specialists and primary care physicians, which has widened recently, is due in part to the fee-for-service reimbursement scheme used by Medicare. With strong input from an AMA committee dominated by specialists, Medicare’s “relative value scale,” often used by other third-party payers as well, pays more for procedures, which are usually performed by specialists, than for an equivalent time spent talking with and examining patients. Because primary care providers must ferret out problems and keep track of them, they must sort through a much larger list of potential causes for a patient’s complaints than most specialists, whose diagnostic acumen is limited to a subset of problems. A patient who self-refers to a specialist without the involvement of a primary care provider may have a symptom or complaint that sometimes falls under the specialist’s expertise, but sometimes doesn’t. When it doesn’t, the patient’s time is wasted at best and money is spent needlessly. At worst, the patient is wrongly treated and may be harmed.

Specialists are sued more often for malpractice than primary care providers. To protect themselves, some specialists make excessive use of tests and procedures in hope of reducing their chances of being sued. This “defensive medicine” increases expenditures for medical care, and sometimes causes harm. In addition to lowering the frequency of medical errors—the third leading cause of death in the United States (higher than in other countries)—greater emphasis on primary care would reduce the frequency with which unnecessary tests and procedures are performed. It would be more effective than putting a cap on malpractice settlements.

Unlike several other countries, the federal government does little to influence the types of future physicians our medical schools train. You can be sure the tradition-laden medical schools and specialist societies will oppose any change in priorities or in reimbursement policies that favors primary care providers. Yet such moves would improve the health status of Americans and reduce the costs of care. Until we have an adequate supply of primary care physicians, nurse practitioners and physician assistants trained in primary care should be given greater responsibility in geographical areas where primary care physicians are in short supply. A uniform and portable electronic health record that facilitates communication among primary care providers and specialists and reduces unnecessary duplication of procedures would make care more efficient.

If health care in the United States continues in the same direction, costs will continue to increase and fewer people under 65 will be covered by insurance. This will be exacerbated by the current economic crisis with growing unemployment and loss of health benefits. (Extension of benefits through COBRA lasts only 18 months after job loss, and ex-workers have to pay the entire premium.) Without insurance, and also with higher deductibles and co-payments among the insured, people delay seeking care. Left untreated, more illnesses will progress to the point where expensive treatment and hospitalizations are needed and outcomes are poorer. We will fall further behind countries where everyone is insured. In addition, disparities in health between the richest of our citizens, almost all of whom have health insurance, and those poor who do not qualify for Medicaid or S-CHIP and are uninsured, will increase.

Only by changing the chaotic methods by which Americans pay for care—chaos characterized by unnecessary spending—and orienting toward primary care will the United States succeed in lowering costs while improving the health of its people.

This article was originally published in three installments in the Las Cruces Sun-News on April 20, April 27, and May 7, 2009, and is republished in the Baltimore Chronicle with permission of the author, a Hopkins physician who now resides primarily in New Mexico.

Copyright © 2008 The Baltimore News Network. All rights reserved.


Baltimore News Network, Inc., sponsor of this web site, is a nonprofit organization and does not make political endorsements. The opinions expressed in stories posted on this web site are the authors' own.
