It has become an increasingly common story: A dollar store opens up in an economically depressed area with scarce healthy and affordable food options, sometimes with the help of local tax incentives. It advertises hard-to-beat low prices, but it offers little in terms of fresh produce and nutritious items—further trapping residents in a cycle of poverty and ill-health.
A recent research brief by the Institute for Local Self-Reliance (ILSR), a nonprofit supporting local economies, sheds light on the massive growth of this budget enterprise. Since 2001, outlets of Dollar General and Dollar Tree (which bought Family Dollar in 2015) have grown from 20,000 to 30,000 in number. Though these “small-box” retailers carry only a limited stock of prepared foods, they’re now feeding more people than grocery chains like Whole Foods, which has roughly 400 outlets in the country. In fact, the number of dollar-store outlets nationwide exceeds that of Walmart and McDonald’s put together—and they’re still growing at a breakneck pace. That, ILSR says, is bad news.
“While dollar stores sometimes fill a need in cash-strapped communities, growing evidence suggests these stores are not merely a byproduct of economic distress,” the authors of the brief write. “They’re a cause of it.”
Dollar stores have succeeded in part by capitalizing on a series of powerful economic and social forces—white flight, the recent recession, the so-called “retail apocalypse”—all of which have opened up gaping holes in food access. But while dollar stores might not be causing these inequalities per se, they appear to be perpetuating them. The savings they claim to offer shoppers may come at a price, leaving the communities they move into, in some ways, a little poorer.
Using code made public by Jerry Shannon, a geographer at the University of Georgia, CityLab made a map showing the spread of dollar stores since the recession.
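Shannon’s actual code and data aren’t reproduced here, but the core of such a map is a simple aggregation step: counting how many stores exist in each year before plotting them. As a rough, hypothetical illustration (the record format and numbers below are made up for the example), a Python sketch of that step might look like this:

```python
# Hypothetical sketch: count cumulative dollar-store openings per year,
# the kind of aggregation a mapping script would run before plotting.
# The (year, lat, lon) record schema and sample data are invented here.

def cumulative_openings(records, start_year, end_year):
    """Return {year: total stores open through that year} for the given range."""
    counts = {year: 0 for year in range(start_year, end_year + 1)}
    for year, _lat, _lon in records:
        # An opening counts toward its own year and every later year.
        for y in range(max(year, start_year), end_year + 1):
            counts[y] += 1
    return counts

# Invented sample openings (year, latitude, longitude)
stores = [
    (2008, 39.3, -76.6),
    (2010, 36.2, -86.8),
    (2010, 33.7, -84.4),
    (2013, 35.5, -97.5),
]

print(cumulative_openings(stores, 2008, 2013))
# {2008: 1, 2009: 1, 2010: 3, 2011: 3, 2012: 3, 2013: 4}
```

A real version would feed the per-year totals (and coordinates) into a mapping library to draw the animated spread; the aggregation logic itself stays this simple.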
As Lawrence Brown, a community health expert at Baltimore’s Morgan State University, tweeted in response to the ILSR report, dollar stores function as “subprime groceries.” And recently some local governments have started pushing back on these retailers, rejecting development at the neighborhood level or devising ordinances that seek to limit their spread in certain areas.
Such moves can be divisive—detractors point to the dire need such stores are meeting in retail-starved areas. But the rise of dollar stores represents a deeper problem, one rooted in the history of housing segregation. Addressing that issue requires questioning the host of complicated assumptions that have led to the present conditions—and the myriad ways residents in so-called food deserts have responded to them.
The “food desert” paradox
Ashanté Reese, an assistant professor at Spelman College, lives on Atlanta’s Westside, within two miles of a pair of dollar stores. Her zip code was particularly hard hit in the recession, suffering a 50 percent foreclosure rate. Those demographics are now changing, but the residents for a long time included elderly folks and people on fixed incomes—the exact kind of shoppers dollar-store executives have said they are targeting.
There’s also a traditional supermarket, a Kroger, which is where Reese shops. But the one near her house isn’t as nice as the one 15 minutes away, she says. The one in a whiter, more affluent neighborhood regularly advertises grains, nuts, seafood, olives, and wine.
“There are these tropes that are perpetuated in the shopping experience,” said Reese, who is also the author of a forthcoming book called Black Food Geographies: Race, Self-Reliance, and Food Access in Washington, D.C.
While her neighborhood may have some alternatives, in neighborhoods that don’t, the presence of dollar stores creates a Catch-22. On one hand, these chains are serving communities that others have neglected or abandoned—a phenomenon researchers have termed “supermarket redlining.” And when a segregated neighborhood loses a supermarket, the consequences for residents in the immediate area go beyond physical and mental health: the loss also erodes the community’s sense of self-worth. Having an affordable option for buying food in the vicinity—even if it’s not ideal—may be seen by residents as better than nothing. “As someone on a fixed income, I see [dollar stores] as saving the poor,” one Twitter user said, responding to the ILSR brief. “I can stock up on staples there a whole lot cheaper than at regular grocery stores.”
On the other hand, the absence of traditional grocers, and the presence of dollar stores, is deeply entwined with the history of spatial and structural inequality in America. “Supermarkets follow the patterns of racial and residential segregation—we can map this in any of the cities that have a solid black population,” said Reese.
In her research, she traces the decline of the supermarket in communities of color—specifically black communities—to the late 1960s, when unrest broke out in several major cities following Martin Luther King Jr.’s assassination. As white flight to the suburbs accelerated, urban supermarkets closed, citing security and financial reasons.
“Whether intentional or not, they were following white people out of the city,” she said. In Washington, D.C., where Reese did her field research, she counted 91 supermarkets in 1968; by 1995, just 33 remained. “We don’t see a reverse of that until now,” said Reese. Today, economically booming D.C. has many supermarkets, but they’re not evenly distributed across the city. In Wards 7 and 8, for example, only three grocery stores serve about 150,000 residents. (Recently, the ride-hailing company Lyft and local nonprofit Martha’s Table have partnered up to provide supermarket rides to residents of these neighborhoods.)
Enter the dollar stores.
Like Walmart before them, these retailers present themselves as creators of jobs and sources of low-cost goods and food in “left-behind” areas—both urban and rural. The 2008 recession bolstered their numbers, simultaneously restricting the resurgence of traditional grocery stores and swelling the potential customer base. Middle-class shoppers started frequenting these stores. In 2009, the New York Times picked up on the trend: “Those once-dowdy chains that lured shoppers by selling some or all of their merchandise for $1 are suddenly hot.”
As the retail meltdown continues, with many higher-end retailers in malls and shopping centers shuttering or consolidating, compact, low-budget dollar stores have easily slipped into the vacant spaces left behind.
“[Dollar stores] have thrived in the last decade because of growing inequality that’s a byproduct of power being concentrated in the hands of a small number of wealthy entities,” said Marie Donahue, the author of the ILSR brief.
Today, dollar stores are thriving both in the poorest of small rural towns, where environmental changes or globalization have wiped out economic activity, and larger cities like Baltimore, where decades of disinvestment in largely African American communities have left vast tracts barren of retail options. In a recent blog post tracking their rise in low-income parts of Baltimore, planner and architect Klaus Philipsen observes that dollar stores are now “flourishing in many poorer neighborhoods like a parasite.”
As CityLab’s Richard Florida has previously noted, there’s a particularly dense “dollar store belt” running from Ohio and Indiana in the north, via Kentucky and Tennessee, to the Gulf Coast. Here’s CityLab’s map showing how they mushroomed across the South, again, using Shannon’s code:
Shannon also created a county map of dollar stores per capita since 2008, which he shared on Twitter (— Jerry Shannon, @jerry_shannon, December 20, 2018).
The problem is not just the stores themselves. According to the ILSR, they tend to create fewer jobs on average than independent groceries—9 versus 14—and the low-wage jobs they do create aren’t of great quality. And it’s not entirely clear if their offerings are that much more affordable, either. When economists compared the price of goods like flour and raisins of the same weight, they found that dollar-store products cost more than those at a nearby Walmart or Costco.
Then there’s their negative effect on other stores nearby. When a dollar store opened up in Haven, Kansas—subsidized through tax breaks by the local government—sales at the nearby Foodliner grocery store dropped by 30 percent, The Guardian reported earlier this year. While the ILSR doesn’t have quantitative data supporting this effect on supermarkets in the vicinity, anecdotally, they surmise that “the difference in margins is just enough that the local stores are not able to stay in business when there are so few options and there is an undercutting of prices,” Donahue said.
The pushback against dollar store clusters
In Chester, Vermont, for example, residents argued in 2012 that allowing dollar stores to come to town “will be the beginning of the end for what might best be described as Chester’s Vermontiness,” per the New York Times—a statement that itself perhaps signals the class and race associations dollar stores have come to embody. In Buhler, Kansas, the mayor saw what happened to surrounding grocery stores in neighboring Haven and rejected the dollar store chain, also citing a threat to the town’s character.
“It was about retaining the soul of the community,” he told The Guardian. “It was about, what kind of town do we want?”
More recent efforts have used zoning tweaks to limit dollar stores, whose small footprint usually lets them breeze past restrictions big-box stores cannot. In Mendocino County, California, dollar store foes passed legislation restricting chain store development writ large. And in April, the Tulsa City Council passed an ordinance that requires dollar stores to be built at least one mile away from each other in North Tulsa. It also adds incentives for grocers and supermarkets providing healthy food to locate in that area. “I don’t think it’s an accident they proliferate in low socio-economic and African American communities,” Vanessa Hall-Harper, a city councilor who grew up in North Tulsa and shepherded the ordinance, told ILSR. Since then, Mesquite, Texas, has followed suit with a similar move.
“Communities are standing up and raising red flags—saying it may be good in the short term, but perhaps not in the long term,” Donahue said. “[They’re] saying, ‘Hey, is this the only option, or can we think more creatively?’”
To find those better options, Reese has some advice for policymakers and city leaders looking to balance economic development and constituents’ needs: Try to really understand the neighborhood and the people who live in it.
“What if we went to these neighborhoods and didn’t assume that poor people or communities of color do not want to eat healthy?” she said. “I think it opens up a whole world of possibility—for us as researchers but also as advocates.”