As Mitt Romney flirted with the idea of a third presidential run in January, the former Massachusetts governor called for a new war on poverty in America. Romney’s remarks, which briefly got both parties talking about the issue, were surprising not only because he had drawn flak during his 2012 campaign for claiming that he was “not concerned about the very poor,” but also because American political discourse has always focused more on the frustrations of the middle class than the struggles of the least fortunate.
One reason politicians target their appeals to people in the middle of the socioeconomic scale is pragmatic: They are more likely to vote than those at the bottom. But it’s also because poverty is a particularly intractable and confounding problem. As a culture, we’re not sure how to explain who ends up in poverty—whether they’re disadvantaged by the system, lazy, or just unlucky. In fact, we can’t even agree on what poverty means.
The official poverty line has long been criticized as too simplistic, and in 2011, the US Census Bureau came out with a new way to measure poverty, which takes into account certain expenses and government aid. But far from settling the matter, it has thrown fuel on a surprisingly active debate about how we should define poverty. Many economists see the new measure as an improvement, but it has also proven controversial because it reveals a different face of poverty—one that is older and whiter. Meanwhile, some critics argue that we shouldn’t measure poverty with income at all. Limited consumption, they say, is a better sign that someone is poor. The whole thing may sound overly technical, but although the debate is taking place in white papers and academic journals, people on both sides say it has crucial, real-world implications.
“The poverty measure is how, in a rich nation, we know who is being left behind,” says Arloc Sherman, a senior researcher with the Center on Budget and Policy Priorities, a Washington think tank.
By global standards, of course, this whole country is well off. Still, within it, inequality is hugely apparent, and most Americans would agree that some of us need a helping hand. So how should we draw the line? Where we set it has implications for how we allocate resources, how we judge whether government programs are working, and how people up and down the economic ladder think about their position in society. And how we answer that question, year to year, says a lot about our social priorities—and what we believe is necessary for a decent life.
In 2013, the official cutoff for poverty in the United States was an annual income below about $11,888 for a single person or $23,834 for a family of four. That year, the most recent for which data are available, the Census Bureau used these thresholds to calculate that 45 million Americans, or 14.5 percent of us, were poor.
The official poverty line, which is used to set eligibility for certain aid programs and to tell the government how the least fortunate are faring, was developed in 1963 by a Social Security Administration economist named Mollie Orshansky as a statistical tool to compare different demographic groups. She had not set out to create an all-purpose measure of deprivation, but her work coincided with the beginning of President Lyndon Johnson’s “war on poverty,” and the administration quickly seized upon her thresholds as the best working definition for the enemy.
Despite being updated annually to account for inflation, the poverty line is a relic of its era. Orshansky, who warned that her system would be useless over time, set income thresholds at about three times the cost of a very simple diet. That made sense in the 1960s, when the typical household spent a third of its budget on food. But today food makes up only about 6 percent of our expenses; we spend much more on housing and transportation. (A fifth of Massachusetts working households spend more than half their income on housing, a figure that is close to the US average, according to the Washington-based Center for Housing Policy.) Meanwhile, other things, like cellphones and health insurance, have been added to the average person’s list of bare necessities.
“It’s not a lot of money,” says Timothy Smeeding, an economist and former director of the University of Wisconsin’s Institute for Research on Poverty. “If you think of what it costs to give a kid a fair chance, you’ve got to do better than the poverty line.”
“There are more two-earner families, people driving to work, the cost of commuting has gone up, the cost of health care has gone up,” says Kathleen Short, the Census Bureau economist who led the development of the new measure, explaining that it takes into account how life has changed both above and below the poverty line. “It’s just more realistic and up to date.”
Compared with the official measure, the 2013 supplemental poverty rate was a percentage point higher, at 15.5 percent, meaning that 3.4 million more people were counted as poor. And they were demographically different people. Under the supplemental measure, fewer children are counted as poor: child poverty comes in about 20 percent lower than the official rate. And while children are still more likely to be poor than their elders under both measures, more elderly people are poor by the supplemental one; for people over 65, the rate was more than 50 percent higher than the official figure. The supplemental measure showed higher poverty for whites, Hispanics, Asians, and people living in the suburbs, and lower poverty for African-Americans, the disabled, and people living in rural areas. There was slightly more poverty in the Northeast and West than under the official measure, and slightly less in the South and Midwest. The supplemental measure is not intended to replace the official one, but if it did, it might change who we think of as poor in this country.
Smeeding says the supplemental measure reveals something else, and that is the sometimes overwhelming cost of health care, especially for the elderly. Despite Medicare and Medicaid, older people are confronted with high out-of-pocket medical expenses that could limit their ability to pay for essentials: food, shelter, clothing, and utilities.
“Of course, they will pay for health care out of savings,” Smeeding says. “But you should subtract it from income, because you can’t eat wheelchairs.”
The supplemental and official measures also differ in how they track poverty over time. Conservatives like to point out that the official rate barely budged in the face of the war on poverty, despite the more than $22 trillion spent on antipoverty programs since the Johnson administration, a figure the Heritage Foundation calculated in a recent report. The new supplemental measure, on the other hand, offers good news for those who support welfare spending: Projecting it backward shows that poverty has fallen, in large part thanks to programs like Social Security and the Earned Income Tax Credit.
“The poverty problem has gotten harder, but our programs have gotten more effective,” Smeeding says. “You learn this program works, that program works. It took us a long time to get the [Supplemental Poverty Measure] to where it is today. I think it’s the best measure—although you can argue about whether it’s the best measure until you’re blue in the face.”
University of Chicago economist Bruce D. Meyer is happy to argue. In a 2012 paper in the Journal of Economic Perspectives, he and James X. Sullivan, an economist at Notre Dame, fault the new supplemental measure for adding to the poverty rolls too many people who, census data suggest, are relatively well-off, with, for example, larger homes stocked with the full complement of appliances. “Think about someone who’s retired and they’re living off their savings, they own their house,” Meyer says. “They don’t have any income, but it’s people’s standards of living that we care about. The cars people drive, the apartments they live in—that’s really the bottom line.”
Meyer and Sullivan are among a handful of prominent economists arguing that we need to look at how much people spend, rather than how much they earn, in deciding whether they are poor. This would knock many older people off the poverty rolls while adding children, and, if used as a guide for aid, could mean more money for families. But it would be politically risky, because it might take a bite out of programs that help the elderly, who are famous for turning out to vote.
“It depends on who you’re talking to, whether it’s the Children’s Defense Fund or the AARP,” Meyer says. “Poverty measures are very political. It’s not just a question of how do you come up with the best yardstick. People are concerned about what its implications will be for various programs and how people think about poverty in general.”
Though Meyer and Sullivan’s proposal might be a hard sell politically, it would be a step closer to an idea that is becoming better established scientifically and in other countries. As many have observed, poverty is not simply a shortage of resources, but a multidimensional state that can affect many aspects of life, from health to aspirations for children.
“There’s more to poverty than money,” says Rod Hick, a social policy researcher at Cardiff University in the United Kingdom, where there is a long tradition of tracking poverty with an index of deprivation indicators, like whether a family can afford to go on vacation, or whether its members’ shoes are in good condition.
“I would argue that having a greater number of metrics tells you different things about the problem of poverty, partly because some of the measures sometimes move in surprising directions,” Hick says. He notes, for example, that a threshold tied to the median income could show poverty falling during a recession as unemployment rises, while deprivation indicators may reveal that people are actually doing worse.
This approach might be more in line with how the poor themselves think about poverty. For a new study in the journal Children & Society, Oxford University research fellow Rys Farthing talked to teens in down-and-out urban neighborhoods about what it means to be poor. The kids were less concerned about having enough money than about unsafe neighborhoods, a shortage of opportunities, and the nagging feeling that they were shut out of the broader community.
“Working with people directly, the shame and stigma is as bad as the deprivation,” Farthing says. “The poverty line doesn’t capture everything that comes with being poor. I think if we could start from scratch looking at poverty, it’s a no-brainer. Of course we’d use a multidimensional measure.”
Still, in the real world, Farthing sees value in the simplicity of existing income-based poverty lines. “Any change is so politicized,” she says. “Once the conversation starts, you find out fairly quickly that this is not a scientific question. If we set a poverty threshold at a level that we say provides a decent life, you’ll have people saying, ‘That’s not enough for a decent life.’ And you’ll have other people saying, ‘Actually, that’s living high.’”
There’s another reason why change might be slow to come: If defining poverty is a political question, it’s a moral one too. Here in one of the wealthiest countries in the world, where we value self-sufficiency and believe all hard workers should be able to get ahead, how low are we willing to let our neighbors’ standard of living fall before we determine that they need help? An answer that everyone can agree on might be beyond even the most sophisticated measure of poverty.
“It can be appealing to think that there is one measure which neatly divides the poor from the nonpoor, and that once we have this perfect measure, it’s all we need,” Hick says. “Actually, the crux of poverty analysis is about trying to understand some of the gray.”
Amy Crawford is a writer in Massachusetts. Follow her on Twitter @amymcrawf.