One really doesn’t have to read dense academic studies to know that something is going on with private health insurance, but it’s a comfort to know that the problem is real and measurable. Two recent reports—the first, the Commonwealth Fund’s Biennial Health Insurance Survey,1 and the second, by the Kaiser Family Foundation from its continuing analysis of health reform implementation2—point to the ongoing degradation of health insurance in the United States.
This raises 2 questions: Why is this happening? And what are the potential consequences of the steady decline in Americans’ insurance coverage?
Commonwealth reported in its study that in 2003, 12% of survey respondents with full-year coverage were underinsured because they spent 10% or more of household income on out-of-pocket expenses, excluding premiums. (For low-income adults, the underinsurance threshold was 5% or more of household income.) By 2012, the share considered underinsured had risen to 20%, where it remained through 2014. In 2014, 59% of underinsured adults had employer coverage; the rest were covered by individual policies (purchased either inside or outside the Affordable Care Act’s health insurance Marketplace) or through public insurance. Commonwealth found that rising deductibles were “increasingly a factor in underinsurance,” since for so many people, deductibles have become high relative to income. By 2014, the proportion of private health plans without a deductible had fallen to 25%, and 1 in 10 adults faced an annual deductible of $3,000, up from 1 in 100 in 2003. In 2014, 24 million adults met the definition of underinsured on the basis of deductibles alone, with deductibles exceeding 10% of income (or 5% in the case of low-income people).
The financial exposure created by high deductibles is only the starting point for underinsurance, as other forms of cost-shifting pile on: tiered copayment and coinsurance obligations that fall most heavily on the sick, such as the placement of all drugs for serious and costly conditions like cancer or HIV on the highest-cost specialty tier (see http://avalere.com/expertise/life-sciences/insights/avalere-analysis-exchange-benefit-designs-increasingly-place-all-medication); uncovered but medically necessary care, such as vision exams and eyeglasses; necessary health care that is denied or excluded altogether, such as extra (or any) physical therapy sessions for children with cerebral palsy; and out-of-network treatments sought because necessary and timely in-network care is simply unavailable (a particular problem for children and adults with mental health conditions). In short, the health insurance industry has many arrows in its quiver for cutting back on coverage.
The Kaiser study, which examined insured people in the nongroup market as well as those insured through employer-sponsored plans, focused on whether people believed that their insurance coverage offered real protection against high medical costs. Kaiser found that nearly 2 in 5 (38%) of nongroup enrollees and more than 1 in 4 (28%) of people with employer-sponsored coverage felt vulnerable to high medical bills.
The picture that emerges from these studies and the experiences of millions of Americans can be summed up as an ongoing decline in insurance coverage and greater direct financial exposure to the cost of health care.
Is the Affordable Care Act to blame for this misery? Probably not, although it may be hastening the phenomenon, at least indirectly, through the confluence of 3 fundamental policy choices embodied in the legislation. The first is the law’s failure to take on underlying health care costs directly. The second is the decision to significantly limit the size and scope of premium subsidies and cost-sharing reduction assistance available for health plans purchased in the Marketplace, even though the buyers would be low- and moderate-income families; this choice in turn led insurers to design plans that rely heavily on deductibles and cost-sharing in order to hold down premiums. The third is the so-called Cadillac tax, which imposes a 40% excise tax on employer plans that exceed a certain threshold and has triggered a land rush among employers to reduce the actuarial value of their plans (most easily done through high deductibles) in order to avoid the tax.
Most serious is the underlying cost of care. As Gerard Anderson and his colleagues noted more than a decade ago, “it’s the prices, stupid,” not how much health care Americans actually consume, that account for its extraordinarily high cost.3 When the article by Anderson and his colleagues appeared in 2003, the United States devoted 13% of GDP to health care; by 2013, health care accounted for 17.4% of GDP, an increase of nearly one-third. With outlandish health care costs, as reported by Steven Brill in America’s Bitter Pill, come inexorably rising insurance premiums.4 To be sure, insurers help drive these costs, but their misbehavior probably explains relatively little. For employers, the choice is stark: their employees need take-home pay and cannot, in a stagnant wage economy, absorb skyrocketing premiums. As employers try to shift premium costs to employees, especially for family coverage, their workers can bear only so much. So insurers and employer plans turn to other strategies: shrinking coverage, hiking deductibles, tightening provider networks, increasing cost-sharing for the costliest covered services, imposing limits, and denying treatments. As a result, health insurance ceases to be what we all hope for — protection against exposure to high medical costs.
Is this strategy going to work? Those who advocate for “health care consumerism” argue that more “skin in the game” will yield affordable, higher-quality care. As an example of this dynamic, consider flat-screen televisions. In 2002, an early 32-inch flat-screen TV with a bulky external CPU cost $4,000. Today, a 48-inch top-of-the-line model is available for $500. We might point out the obvious: it took a full 12 years for the price of a flat-screen television to fall to a reasonable level. That delay would not be so terrible if one were buying only a television set, but it is an unconscionable amount of time if the “product” to be purchased is necessary health care. And as the Nobel Prize winner Kenneth Arrow pointed out more than 50 years ago, health care really bears no resemblance to televisions,5 and patients are seldom “educated consumers,” given the complexity of medical treatment.
As health insurance coverage continues to be degraded, the real question is, how much longer will many policymakers remain besotted with market rhetoric? How long will the “free marketers” continue to assert that having “skin in the game” in the form of high deductibles and arbitrary coverage limits will somehow solve the underlying health care cost crisis?
Author: Sara Rosenbaum
Volume 93, Issue 3 (pages 463–466). DOI: 10.1111/1468-0009.12131. Published in 2015.
Sara Rosenbaum is the Harold and Jane Hirsh Professor of Health Law and Policy and founding chair of the Department of Health Policy at the George Washington University School of Public Health and Health Services. She also holds professorships in the Schools of Law and Medicine and Health Sciences. A graduate of Wesleyan University and Boston University Law School, Rosenbaum has devoted her career to issues of health justice for populations who are medically underserved as a result of race, poverty, disability, or cultural exclusion. Between 1993 and 1994, Rosenbaum worked for President Clinton, directing the drafting of the Health Security Act and designing the Vaccines for Children program, which today provides near-universal immunization coverage to low-income and medically underserved children. Rosenbaum is the lead author of Law and the American Health Care System (Foundation Press, 2012) and has received many national awards for her work in public health policy. She is past chair of AcademyHealth and a member of the Institute of Medicine. Rosenbaum has also served on the CDC Director’s Advisory Committee and as a Commissioner on the congressional Medicaid and CHIP Payment and Access Commission (MACPAC), which she chaired from January 2016 through the expiration of her term in April 2017.