5 THE POLARIZATION OF THE AMERICAN NEIGHBORHOOD, 1990–2020

As with any far-reaching economic or demographic change, no one can pin down exactly when and where it began, but at some point during the 1990s, something important started to change in American cities. Despite having been all but given up for dead since the 1960s and 1970s, cities came back to life, a couple at first, a few more after the arrival of the new millennium, and still more after the end of the Great Recession. Without understanding that change, it would be impossible to understand the state of neighborhoods in America today.1 While we will drill down into many of these subjects in later chapters, this chapter provides an overview and context for much of the rest of this book.

By the 1990s, the United States looked very different from the country it had been in the 1950s. Twenty-five years after the immigration reforms of 1965, the wave of immigration that followed was making the United States an increasingly diverse country, with large and growing Asian and Latinx populations. New immigrant neighborhoods of Bangladeshis in Detroit, Salvadorans in Los Angeles, and Cambodians in Philadelphia had begun to emerge.

Birth rates had declined, and families were smaller. In the 1950s America was a land of young, married, child-raising couples. Over the following decades, as the share of families raising children dropped, the number of single people, both young and old, grew. There were fewer children and more older people. The nation’s economic distribution shifted, leading to what has been called the hollowing out of the middle class. The young married couples of the 1950s were part of a large middle class that encompassed not only a growing professional and managerial class but also most of the large unionized blue-collar industrial workforce. By the 1990s that middle class was shrinking, a decline that has continued to this day. Both of these shifts have had powerful implications for American neighborhoods, driving revival in some but undermining long-standing vitality in others.

Although racism remained a powerful barrier, African Americans were far more dispersed across cities and metropolitan areas than before. While disproportionate numbers still lived in segregated areas of concentrated poverty, those depopulated areas were a far cry from the once-vibrant ghettos such as Chicago’s Bronzeville and Detroit’s Black Bottom. More African Americans lived in the suburbs, and racially mixed neighborhoods were no longer the rarities they had been in the 1960s and 1970s.

The economic underpinnings of America’s cities were radically different from those of the 1950s. While Americans still made things, they made fewer and fewer things in the nation’s older cities. The industrial behemoths that had dominated cities such as Pittsburgh and Cleveland, driving their economies and employing most of their residents, were gone, leaving vacant hulks to remind people of what once had been. By the end of the century a new economy, variously described as a knowledge economy, a creative economy, or a service economy, rooted in the growth of universities and medical centers, the so-called eds and meds sector, had begun to rise in its place. By the mid-2010s, Pittsburgh boasted more jobs in education and health care than it had in factory jobs fifty years earlier. The well-paying working-class jobs lost when the factories closed, though, never came back.

The growth of the new urban economy was paralleled by another shift that has been dubbed the “march of the millennials.”2 In a trend that began in the 1990s but accelerated after the start of the millennium, young people began to flock to the nation’s older cities, not all young people but specifically members of the one-third of their generation with four-year college degrees. While during the 1990s they moved mainly to coastal cities such as Boston, New York, San Francisco, and Washington, D.C., after 2000 many started to move to formerly industrial cities of the East and the Midwest such as Baltimore, Pittsburgh, and St. Louis, homes to major universities and research-oriented medical centers. In Philadelphia, the number of people ages twenty-five to thirty-four with a BA or higher degree more than doubled between 2000 and 2016, reversing that city’s long-term population decline. By 2019, nearly one out of four Washington, D.C., adult residents were young people between the ages of twenty-five and thirty-four with a BA or higher degree.

While much of the millennial migration was driven by people seeking the diverse mixed-use environments that were emerging in downtowns and in gentrifying neighborhoods, two other factors were critical. First, the growth of the eds and meds economy and its spin-offs meant that growing numbers of suitable jobs for college graduates were available in the same cities that offered the settings they sought; and second, the decline in urban crime and violence that began in the 1990s increased the appeal of urban areas as a place to live and even as a place to raise a family for a growing number of households. Washington, D.C., went from being the so-called murder capital of the United States in the early 1990s to a glittering playground for millennials in barely two decades.

While the much-celebrated “back to the city” movement was real, it was limited. Many young well-educated people were moving to the cities, but most others were not. Outside a small number of magnet cities such as San Francisco and Washington, D.C., the effects of the new in-migration tended to be felt only in small parts of the city, mostly in downtowns and a few adjacent neighborhoods, along with neighborhoods bordering amenity-rich areas such as Baltimore’s Inner Harbor or near the city’s major universities and medical centers. Urban downtowns, where few if any people had lived since the late nineteenth century, were becoming upscale residential neighborhoods, epicenters of the youthful migration to the cities. From downtown Detroit to Chicago’s Loop, developers were converting empty 1920s office buildings into apartments and adding new restaurants, fitness centers, and other facilities catering to the affluent young.

Not only were many other parts of the same cities seeing little or no revival, but many urban neighborhoods were actually declining, often more so than during preceding decades. The cities were becoming increasingly polarized places, and the onetime working-class and middle-class neighborhoods of earlier years, many of which had survived decades of urban decline, were increasingly endangered. African American communities were most at risk, as many once-solid Black working- and middle-class neighborhoods stumbled, undermined by the aging of the generation that created those neighborhoods in the 1960s and 1970s, the wave of subprime mortgages and foreclosures, and the decision of growing numbers of young Black families to seek suburban alternatives.

The urban revival was often exaggerated and overhyped. Contrary to some urbanists’ predictions, the suburbs showed no signs of withering away. The fallacy of those predictions was to see the relationship between urban revival and suburban growth as a zero-sum proposition rather than to recognize that two distinct processes were taking place side by side. Even as the cities drew new affluent residents and billions in investment, suburbs continued to grow, sprawling farther and farther away from the center.

Although almost every age group other than the millennial generation was still showing net outward movement from central cities, suburban sprawl was furthered less by direct urban out-migration and more by outward movement from within the suburban ring. Cleveland State’s Tom Bier, in his research on home buyers in northeastern Ohio in the 1980s and 1990s, was struck by “the extent of movement up in price to a newer, larger house, and the extent of movement outward by suburban residents, and not just outward but from the suburb where they had been living.”3

The upward and outward dynamic of intersuburban migration led to shifts that paralleled those taking place in the central cities: a growing divergence of rich and poor places and a hollowing out of the middle. Many suburbs prospered, including many of the original nineteenth-century railroad suburbs, with their commuter trains to the central city and their compact, walkable downtowns. New Jersey railroad suburbs of New York City such as Montclair and Morristown drew affluent in-migrants as upscale restaurants and specialty stores opened in their downtowns and modest one- and two-story structures were replaced by upscale apartment buildings.

Other suburbs, however, were showing strain, particularly many of those built in the first wave of suburbanization after World War II. By the mid-1990s, young families who had settled in inner-ring suburbs with such exuberance in the 1940s and 1950s had either moved on to bigger houses elsewhere or begun to succumb to the challenges of age. Their homes were showing their age too and were far less appealing to new middle-class families than they had once been to their parents or grandparents.

Many inner-ring suburbs of America’s reviving cities, such as north St. Louis County outside the city of St. Louis and southern Cook County outside Chicago, began to turn into what might be called a secondary poverty belt, a subject we discuss in further detail in chapter 13. Park Forest, a Cook County suburb profiled in William H. Whyte’s classic The Organization Man as the archetypal dynamic neighborhood-focused middle-class community of the 1950s, is struggling today.4 As the town’s mayor observed in 2014, “few would have envisioned the day in Park Forest when one of its churches would operate a food pantry that feeds approximately 350 local families per week[,] … and who would have believed that one day it would be necessary for Habitat for Humanity to renovate Park Forest homes that had gone into foreclosure?”5

The middle has not disappeared. America’s suburbs still contain many good yet unpretentious neighborhoods occupied by people with middling incomes, but they are fewer, and many are vulnerable to a future recession, just as many proved vulnerable during the foreclosure crisis and recession of 2007–2009. We will discuss those events, which were a watershed for hundreds of urban and suburban neighborhoods, later in this chapter.

The early suburbs contained fairly coherent neighborhoods. Indeed, a 1978 study found that urban and suburban residents held remarkably consistent perceptions of their neighborhoods as existing within defined spatial bounds.6 One wonders, however, whether the same can be said for many of the sprawling exurban McMansion suburbs that have proliferated in the last few decades. This is not a trivial question. As we discussed earlier, areas in which the population density is so low as to all but preclude any significant informal interaction between neighbors might not be considered “neighborhoods.” But given the pervasive nature of neighboring behavior, it is certainly possible that alternative forms of what one might call “quasi-neighboring” or less spatially defined behavior, perhaps organized around schools or local government issues or even through social media, may play an important social role in these communities. This would be a worthy study for some enterprising future ethnologist.

The partial reurbanization of America and the simultaneous revival of the cities and expansion of the suburbs triggered levels of neighborhood volatility not seen since the explosive “urban crisis” years of the 1960s and 1970s. Across urban America, neighborhoods were changing, often seemingly overnight. That process came to be symbolized by the term “gentrification,” which since 2000 has become a widely used and equally widely contested shorthand for neighborhood change. Ironically, as we will discuss later, in many, perhaps most, American cities gentrification remains a relatively rare phenomenon compared to neighborhood decline and the relative immobility of concentrated poverty neighborhoods. Although gentrification is perhaps seen as almost a synonym for neighborhood change, in reality it is but one of many forms that such change takes.

Patterns of Variation

The rise of the gentrification discourse raises an important issue. In the years preceding the 1930s Great Depression, nearly all cities were growing. From the 1950s through the 1980s, conversely, nearly all cities outside the Sunbelt were declining. Since the 1990s, though, different cities and their neighborhoods have followed wildly different trajectories. Since neighborhood trajectories are strongly driven by larger citywide and regional forces, it is worth spending some time talking about the divergent trends driving American cities.

In the 1960s and 1970s the “urban crisis” was an all but universal reality in the nation’s older cities. When Jeanne Lowe wrote Cities in a Race for Time in 1967, she looked at five cities as paradigms of distress: New York, Philadelphia, New Haven, Pittsburgh, and Washington, D.C.7 Today, Washington and New York are appropriately seen as belonging to a separate class of thriving magnet cities, while Pittsburgh has recently been widely hailed as a model of urban revival. Boston and Washington, two cities that lost much of their population during the late twentieth century and were seen at the time as deeply distressed, are now growing at a rapid pace while drawing thousands of affluent newcomers. Similarly, it was in Seattle—today the archetype of the booming tech city—where the notorious “will the last person …” billboard went up in 1971 (figure 5.1). Things have changed since then. Since 2010, Seattle’s population has been growing by about thirteen thousand people per year (figure 5.2).

Pittsburgh has come close to stabilizing its population but has yet to begin growing back, while the 2020 census found that Cleveland and St. Louis are still seeing sustained population loss. Simple statistics about population growth or decline, however, miss an important area of divergence among American cities.

A different lens through which the divergent trends of American cities and their neighborhoods can be seen is that of the “influx of wealth,” or the net in- or out-migration of more versus less affluent households. When we look at America’s cities through this lens, we find that most fall into four distinct types, as shown in figure 5.3. For purposes of this exercise, we have defined “wealth” and “upper income” broadly as households earning at least 125 percent of the national median income, roughly $72,000 per year in 2016. Key features of the four city types can be summarized as follows (a schematic sketch of the classification appears after the list):

  • Sunbelt cities, such as Phoenix, Dallas, Fresno, and Las Vegas, are seeing net in-migration across all income levels, but most in-migrants are lower- and middle-income households, reflecting both the generally lower wage levels of these cities’ job base and the relative affordability of their housing stock. Between 2000 and 2016, Dallas added 55,000 low- and middle-income households but fewer than 6,000 upper-income households.
  • Magnet cities, such as Seattle, San Francisco, Boston, and Washington, D.C., are seeing large-scale in-migration of upper-income households coupled with smaller net out-migration of low- and middle-income households, reflecting the rapid growth of well-paying jobs in those cities as well as the high cost of housing, making it increasingly difficult for lower-income households to remain in the city. From 2000 to 2016, Washington, D.C., added 47,500 affluent households and Seattle nearly 64,000.
  • Reviving legacy cities, such as Baltimore, Pittsburgh, and Philadelphia, are seeing small net in-migration of upper-income households but equal or greater out-migration of low- and middle-income households. While the growth in affluent households reflects the strength of the knowledge economy in these cities along with a few areas that offer the amenities attractive to young professionals, the outflow of less affluent households is largely a response to continued neighborhood decline rather than price pressures or gentrification and reflects the availability of affordable suburban alternatives.
  • Struggling legacy cities, including Detroit, Cleveland, and smaller cities such as Akron and Dayton, have a net outflow of households at all income levels, reflecting the continued economic weakness of these cities as well as the many affordable suburban alternatives. Between 2000 and 2016 Detroit lost 48,500 affluent households, over half of all affluent households in the city in 2000. The rate of decline in Detroit’s upper-income population has slowed more recently, suggesting that a reversal of that decline might be in the offing.
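For readers who want the typology stated precisely, it reduces to a decision rule on two numbers: a city’s net change in upper-income households and its net change in low- and middle-income households. The Python sketch below is a minimal illustration of that rule, not our actual methodology; the 20,000-household cutoff separating magnet from reviving legacy cities, and the low- and middle-income figures for the Seattle-, Pittsburgh-, and Detroit-like examples, are hypothetical stand-ins of our own.

    # Minimal sketch of the four-way city typology described above.
    # "Upper income" follows the text's definition: households earning at
    # least 125 percent of the national median income (~$72,000 in 2016).

    def classify_city(net_upper: int, net_low_mid: int) -> str:
        """Classify a city by net household change across two income groups."""
        if net_upper >= 0 and net_low_mid >= 0:
            return "Sunbelt city"               # net gains at all income levels
        if net_upper >= 0:                      # gaining wealth, losing others
            # The magnet / reviving-legacy split is one of scale; this
            # 20,000-household cutoff is a hypothetical stand-in.
            return "magnet city" if net_upper > 20_000 else "reviving legacy city"
        if net_low_mid < 0:
            return "struggling legacy city"     # net outflow at all income levels
        return "outside the four types"         # losing wealth, gaining others

    # Upper-income figures for the Dallas-, Seattle-, and Detroit-like
    # profiles echo the text; the remaining numbers are invented.
    examples = [("Dallas-like", 6_000, 55_000),
                ("Seattle-like", 64_000, -15_000),
                ("Pittsburgh-like", 5_000, -10_000),
                ("Detroit-like", -48_500, -60_000)]
    for name, upper, low_mid in examples:
        print(f"{name}: {classify_city(upper, low_mid)}")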
FIGURE 5.1.    Seattle then and now: the billboard erected in Seattle in 1971 with the legend “Will the last person leaving Seattle—turn out the lights.”

(Photograph courtesy of Greg Gilbert/The Seattle Times)

FIGURE 5.2.    Seattle today: construction cranes and glossy high-rise towers, a typical scene in downtown Seattle in the 2010s.

(Photograph by Google Earth © Google 2022)

FIGURE 5.3.    Who’s moving in, who’s leaving: net household change by income in American cities, 2000 to 2016, comparing in- and out-migration of upper-income and low- and middle-income households across the four city types.

(Authors’ work based on decennial census and American Community Survey data)

These trends powerfully affect neighborhood trajectories. While gentrification in its most common meaning of more affluent households moving into lower-income neighborhoods is a limited phenomenon in many Sunbelt cities, the steady influx of middle-income families fosters greater neighborhood stability than in many cities elsewhere in the United States. In contrast to older cities, though, where neighborhood identities were often fostered by resident identity or traditional boundaries, Sunbelt neighborhoods are often the product of developers’ decisions and seemingly arbitrary subdivision boundaries but for all that are no less real. Where subdivisions are bordered by open-space areas, nonresidential areas such as shopping malls or industrial parks, or arterial roads, those features often create clearly defined neighborhood boundaries that tend to promote neighborhood identification, as can be seen in figure 5.4 from Garland, Texas, a suburb of Dallas. The City of Garland actively promotes neighborhood identity and engagement through many programs and initiatives.8

In magnet cities such as Seattle and Washington, D.C., however, gentrification is a powerful driver of neighborhood change. The sheer number of affluent households seeking to live in these cities and the inability of developers to keep up with demand mean that large parts of the city are under intense demand pressure. Rising house prices, which spill over even to distressed areas where few affluent in-migrants are moving, put increasing pressure on lower-income households throughout the city.

By contrast, gentrification in cities such as Baltimore and St. Louis affects only a handful of the city’s neighborhoods. In Baltimore those neighborhoods cluster in the shadow of downtown, around Johns Hopkins University, or near the city’s Inner Harbor. Meanwhile, the decline of once-vital working-class neighborhoods and the stagnation of areas of concentrated poverty affect far more of the city and in their separate ways are far more destructive of their residents’ hopes of living in a good neighborhood than gentrification, which arguably provides some benefits to some if not all long-term residents.

The net out-migration of affluent households from Detroit and Cleveland does not mean that no gentrification is taking place in these cities; net out-migration, after all, does not mean that there is no in-migration, only that those leaving exceed those moving in. Detroit’s downtown is changing as 1920s office towers are repurposed and restored to become home to an affluent young population, as is Midtown around the Detroit Medical Center and Wayne State University, while more modest change is visible in a handful of small residential enclaves such as Corktown. But the influx has been modest, covering at most 5 percent of the city’s land area, compared to the continued outflow from the city’s other residential neighborhoods.

It is a safe bet that if one asked a typical educated layperson today what the major form of neighborhood change in American cities has been over the past couple of decades, the answer would almost certainly be “gentrification.” But that answer would be wrong. The reality was portrayed by a 2019 study of neighborhood change from 2000 to 2016 by a University of Minnesota research team. Some of their findings are worth highlighting:

  • The most common form of American neighborhood change, by far, is poverty concentration. About 36.5 million residents live in a tract that has undergone low-income concentration since 2000.
  • Low-income residents are invariably exposed to neighborhood decline more than gentrification. As of 2016, there was no metropolitan region in the nation where a low-income person was more likely to live in an economically expanding neighborhood than an economically declining neighborhood.
  • On net, far fewer low-income residents are affected by displacement than concentration. Since 2000, the low-income population of economically expanding areas has fallen by 464,000, while the low-income population of economically declining areas has grown by 5,369,000.
  • Nonwhite residents are far more likely to live in economically declining areas. In 2016, nearly 35 percent of Black residents lived in economically declining areas, while 9 percent lived in economically expanding areas.9
FIGURE 5.4.    Neighborhood borders in Garland, Texas: an aerial view of a neighborhood whose borders are defined by a park, a light industrial area, and an arterial road.

(Overlay by Bill Nelson on Google Earth base map © 2022 Google)

Joe Cortright of the City Observatory, who looked at the long-term trend from 1970 to 2010, wrote that “while media attention often focuses on those few places that are witnessing a transformation, there are two more potent and less mentioned storylines. The first is the persistence of chronic poverty.… The second is the spread of concentrated poverty: three times as many urban neighborhoods have poverty rates exceeding 30 percent as was true in 1970 and the number of poor people living in these neighborhoods has doubled.”10

What is going on? As Cortright suggests, two separate things. First, poor neighborhoods—those areas where poverty has been highly concentrated since the 1970s or 1980s—are stuck in place. They rarely move out of poverty, whether through gentrification, self-help, or any other mechanism. And the poorer they are, the less likely they are to see any change. Second, many neighborhoods that were not poor a few decades ago or even ten years ago have become poor neighborhoods.

St. Louis fits this picture closely. In 1970, only 16 out of 106 census tracts in St. Louis were concentrated poverty areas where 30 percent or more of their residents had incomes below the national poverty level. By 2017 there were 44 out of 106, almost three times as many. Some of this change took place in the 1970s and 1980s, years of devastating urban decline. But among low-poverty (under 20 percent poverty rate) neighborhoods in St. Louis in 1970 that became high-poverty (over 30 percent) neighborhoods by 2017, almost half (9 of 20) did not become high-poverty neighborhoods until after 2000, after the urban revival was well under way.

Even more distressing, urban middle-income neighborhoods that have become high-poverty neighborhoods since 2000 are disproportionately African American. Chicago’s South Shore is a case in point. While to the casual observer it still looks like a pleasant collection of classic Chicago bungalows and medium-size apartment buildings, by 2017 over one-third of the population lived below the poverty level, and 20 percent of the housing units were vacant. That story was repeated not only across Chicago but also in Detroit, Cleveland, and many of the other cities where African American families had moved into middle-class neighborhoods vacated by their white residents in the 1960s and 1970s, sustaining them over the following few decades.

Meanwhile, the few neighborhoods in these cities that gentrified were most often white or racially mixed urban working-class neighborhoods. Indeed, in a reality at odds with the popular narrative, the neighborhoods most likely to gentrify, particularly in cities where the influx of wealth is still modest, are neither Black nor severely distressed but instead are predominantly white or racially mixed, still intact working-class neighborhoods. We looked at the income trajectories of Chicago’s middle-income neighborhoods between 2000 and 2017 by their racial configuration (figure 5.5).11 While upward movement into a higher income range is not necessarily evidence of gentrification, it is a precondition of gentrification. The data is telling. Most predominantly white neighborhoods stayed the same or moved up, while most predominantly Black neighborhoods moved down. We will explore this further in chapter 10.

FIGURE 5.5.    Trajectories of middle-income neighborhoods in Chicago by racial configuration, 2000 to 2017: predominantly white neighborhoods tended to be economically stable or improving, while most predominantly Black neighborhoods declined.

(Authors’ work based on decennial census and American Community Survey data)

The decline of so many urban neighborhoods since the 1990s reflects many different factors. Demographic trends were working against them. The houses in these neighborhoods were aging, as were many of their owners. The continuing fiscal problems of older cities were leading to either higher taxes, declining public services, or both. For younger would-be home buyers the suburbs beckoned, with their promise of newer houses, better schools, safer streets, and lower taxes.

But in addition to what may be called systemic problems, many urban neighborhoods were hit hard by the foreclosure crisis of 2006 and 2007 and the Great Recession that followed in its wake. We discuss that crisis and its effects below, along with some thoughts on the COVID-19 pandemic that was raging as this was being written, and whether it may have similarly destabilizing neighborhood effects.

The Foreclosure Crisis, the Great Recession, and the COVID-19 Pandemic

During the first years of the twenty-first century it appeared to many observers that housing markets in older cities had turned a critical corner. Millennials were moving into reviving neighborhoods and downtowns, more and more people were buying homes, and house prices, even in distressed neighborhoods, were rising to unprecedented levels. The trends in three New Jersey cities are shown in table 5.1. It was hard at first to understand what was going on. All three cities were and still are struggling. As racially and ethnically diverse, mostly working-class and lower-income cities, none were gentrifying by any reasonable definition at the time or since. Although all three were close to New York City, none was showing economic growth that might support house price increases of 15 percent to 20 percent per year. Observers could not understand what mysterious phenomenon had triggered this amazing change.

It turned out that it was not particularly mysterious. It was a classic real estate bubble, driven above all by the subprime lending boom that began in the late 1990s and blew up in the new millennium. Although the number of books that have been written on the subject would fill a large bookshelf, it is useful to summarize the story here.12 The problem began with what many in the lending industry saw as a reasonable, even creative idea. Mortgage lending had always been based on a simple “either you qualify or you don’t” model. There was one set of criteria, and if you met them you received what was sometimes called a “prime” mortgage. If you failed, you received no mortgage. As automated underwriting gradually replaced loan officers’ judgment during the 1990s, a new idea emerged. Rather than treating lending as a yes/no proposition and disqualifying subprime borrowers, why not provide those borrowers with mortgages but adjust the interest rate upward to reflect the higher risk they represented? That way, more people could become homeowners, and the increased business would bolster the financial sector.
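The logic of risk-based pricing can be made concrete with a one-period break-even calculation. The sketch below is ours, with hypothetical numbers; actual mortgage pricing is multi-period and far more elaborate. A lender breaks even when the expected repayment on a risky loan equals the return on a safe one, which means the quoted rate must rise with the estimated probability of default.

    # One-period sketch of risk-based mortgage pricing (hypothetical numbers).
    # Break-even condition: (1 - p)*(1 + r) + p*recovery = 1 + r_safe,
    # where p is the default probability and `recovery` is the fraction of
    # principal recovered through foreclosure. Solving for the quoted rate r:

    def risk_adjusted_rate(p: float, recovery: float, r_safe: float) -> float:
        """Break-even rate for a borrower with default probability p."""
        return (1 + r_safe - p * recovery) / (1 - p) - 1

    # A low-risk borrower (2% default probability) versus a subprime
    # borrower (10%), assuming 60 cents on the dollar recovered after
    # default and a 5% safe rate:
    print(f"prime:     {risk_adjusted_rate(0.02, 0.60, 0.05):.1%}")  # ~5.9%
    print(f"subprime:  {risk_adjusted_rate(0.10, 0.60, 0.05):.1%}")  # ~10.0%

    # If a true 20% default risk is underestimated as 10%, the loan is
    # priced at 10% when breaking even requires roughly 16%:
    print(f"mispriced: {risk_adjusted_rate(0.20, 0.60, 0.05):.1%}")

Everything in this calculation hinges on the default probability, a number the lender must estimate.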

In theory this might have worked if lenders had rigorously assessed risk and adjusted terms to reflect it properly. We say in theory because that assumes that the underwriting models were based on enough reliable data to match the increased risk of each subprime borrower with the right interest rate, a highly doubtful assumption. It was never really tried. Instead, at a time when vast amounts of money in the global financial markets were looking for profitable places to land, and with mortgage securitization passing the risks on to unsuspecting third parties, it turned into a lending frenzy best summed up in the famous line from one subprime lender: “If you had a pulse, we give you a loan. If you fog the mirror, we give you a loan.”13

Lenders came up with increasingly exotic mortgage products, such as the so-called NINJA (no income, no job, no assets, no problem) loans and loans with “teaser rates,” low discounted interest rates for the first two years, rapidly rising to untenable levels thereafter. With lenders indifferent to borrowers’ incomes, property values, or the relationship between the two and seemingly willing to refinance any loan, a bubble emerged. This description is an oversimplification but captures the essence of the story. The ensuing bubble caused prices to skyrocket in Newark, Elizabeth, and hundreds of other cities across the country.
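The arithmetic of those teaser rates shows how quickly payments could become untenable. The sketch below applies the standard fixed-payment amortization formula to a hypothetical “2/28” loan of our own construction, a $200,000 thirty-year mortgage at a 4 percent teaser rate resetting to 9.5 percent after two years; it illustrates the mechanics, not any actual loan product.

    # Payment shock on a hypothetical "2/28" teaser-rate mortgage:
    # $200,000 over 30 years, 4% for the first two years, 9.5% thereafter.

    def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
        """Standard fixed-payment amortization formula."""
        r = annual_rate / 12
        return principal * r * (1 + r) ** months / ((1 + r) ** months - 1)

    def balance_after(principal: float, annual_rate: float,
                      total_months: int, months_paid: int) -> float:
        """Principal still owed after a given number of payments."""
        r = annual_rate / 12
        pay = monthly_payment(principal, annual_rate, total_months)
        return principal * (1 + r) ** months_paid - pay * ((1 + r) ** months_paid - 1) / r

    teaser = monthly_payment(200_000, 0.04, 360)     # ~$955 per month
    owed = balance_after(200_000, 0.04, 360, 24)     # ~$192,800 still owed
    reset = monthly_payment(owed, 0.095, 336)        # ~$1,640 per month

    print(f"teaser payment:   ${teaser:,.0f}")
    print(f"payment at reset: ${reset:,.0f} ({reset / teaser - 1:.0%} increase)")

On these assumptions the monthly payment jumps by roughly 70 percent at the reset, with no change in the borrower’s income; multiplied across hundreds of thousands of loans, the wave of defaults that followed is unsurprising.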

The bubble burst starting around 2006. This led to the Great Recession, triggered by the meltdown in financial institutions including such storied names as Bear Stearns and Lehman Brothers. The results can be seen in the post-2006 trajectories of mortgage originations and sales prices in the same three cities, shown in figures 5.6 and 5.7. From 2006 to 2012, the number of home purchase mortgages in the three cities fell from nearly 6,600 to under 1,000. While mortgages have since recovered modestly, they are still less than one-third of what they were at the height of the bubble. House prices collapsed. By 2012, they were less than half of what they had been in 2006 and only slightly higher than in 2000, when the feeding frenzy began. Although they too have picked up slightly, as of 2020 they had not regained more than a small part of the postrecession loss.

FIGURE 5.6.    Home purchase mortgage originations in three New Jersey cities, 2000 to 2017: a rapid rise to 2006, an even more rapid decline to 2008, and little change since.

(Authors’ work based on Home Mortgage Disclosure Act data)

FIGURE 5.7.    Median house sales prices in three New Jersey cities, 2000 to 2017: a rapid rise to 2006, a rapid decline to 2010, and only slight increases since.

(Authors’ work based on New Jersey State Treasury and Boxwood Means/PolicyMap data)

What happened in Newark, Paterson, and Elizabeth happened across the United States. But not everywhere, of course. Booming cities such as Portland and Denver experienced little more than a hiccup before prices began to rebound, fueled by sustained economic growth and a homebuilding industry that consistently lagged behind demand. Today, the median house price in Portland is 50 percent above its bubble peak and in Denver more than double. House prices in affluent suburban areas around the country have not experienced comparable explosive growth but have generally recovered their losses. The same is largely true of many Sunbelt cities that were hit hard by the foreclosure crisis; the median Los Angeles house price hit $555,000 in 2006, dropped to $311,000 by 2011, rose to $644,000 in 2017, and was over $700,000 by 2020. Predictions that Las Vegas and Phoenix would turn into sunbaked counterparts of Detroit and Cleveland were soon proven wrong, monuments to the pitfalls of prediction and certain writers’ premature rush to judgment.

The collapse of the real estate bubble devastated millions of homeowners, who lost their homes and with them the greater part of what little wealth they possessed. It also destabilized thousands of neighborhoods and communities, which lost homeowners and saw their housing stock fall into the hands of speculators or into vacancy and abandonment. The protracted mortgage foreclosure process in many states led to many houses being abandoned by their owners well before the lender had taken title, making matters worse, as did the customary lender practice of evicting both owners and tenants from houses on which they had foreclosed, leaving them empty.

While the federal government created the Neighborhood Stabilization Program, which gave local governments and nonprofit organizations funds to acquire, rehabilitate, or demolish foreclosed properties, it was a piecemeal effort that received only $7 billion spread thinly across the country over the four years of the program. This was a trivial amount in a nation where millions of properties were going into foreclosure. To put it in perspective, between 2009 and 2012 private investors and speculators invested roughly $25 billion in buying foreclosed properties in the Las Vegas area alone, an area containing less than 1 percent of the nation’s population.14 Research on the Neighborhood Stabilization Program strongly suggests that whatever individual properties may have benefited from the program, it had no significant effect in changing the trajectory of neighborhoods destabilized by foreclosures.15

Neighborhood effects varied both in the extent to which house prices, and real estate activity generally, crashed when the bubble burst and in the degree of recovery since then. Two types of neighborhood stand out as being hardest hit in both respects, as can be seen in table 5.2, which shows price change from 2007 to 2017 for a cluster of urban neighborhoods and suburban municipalities in the Chicago metropolitan area.

The region’s stable, affluent suburbs west and north of the city and the city of Chicago on average saw significant decline after the onset of the crisis—typically 40 percent to 50 percent—but recovered strongly. By 2017, while prices were still below their bubble peaks, they were within 10–20 percent of those peaks. The city of Chicago trend, though, averages wildly divergent neighborhood outcomes.16

Upscale and gentrifying city neighborhoods saw less decline and much stronger recovery. Today, prices in both gentrifying Wicker Park and upscale Lincoln Park are well above their bubble peaks. Conversely, struggling middle neighborhoods on the city’s south side saw greater declines and less recovery. Today, prices in both Chatham and South Shore are far below their peaks. Both are predominantly Black middle neighborhoods.

The picture is similar in the struggling suburbs of southern Cook County, many of which were built during the 1950s and populated during the era of white flight. Not only did their prices collapse when the bubble burst, but they also have seen virtually no price recovery since hitting rock bottom around 2011. If anything, their current market conditions are even more dire than those of the city’s hard-hit middle neighborhoods.

This picture reflects two distinct but related patterns. The first is subprime lending. The targeting of communities of color by subprime lenders has been thoroughly documented.17 With homeownership rates much lower than in white communities, those neighborhoods offered lenders a large pool of potential buyers, many of whom were unsophisticated about homeownership and mortgages. Black communities were targeted through influential intermediaries. As one Wells Fargo loan officer testified, the bank “had an emerging-markets unit that specifically targeted Black churches, because it figured church leaders had a lot of influence and could convince congregants to take out subprime loans.”18 Figure 5.8 shows the relationship in 2005, at the height of the bubble, between the racial composition of census tracts in St. Louis and the percentage of high-cost home purchase mortgages, a proxy for subprime mortgages. Seventy-five percent of all mortgages made in the city’s predominantly Black middle census tracts that year were high-cost loans, compared to 29 percent in the rest of the city.

FIGURE 5.8.    Race and subprime lending in St. Louis, 2005: the percentage of high-cost (subprime) home purchase mortgages rises with each census tract’s Black population share.

(Authors’ work based on Home Mortgage Disclosure Act data and decennial census data)

The neighborhoods that became ground zero for subprime lending were not the most heavily disinvested ones but rather the ones that were one step up on the economic ladder: predominantly Black middle neighborhoods inside central cities and lower-priced inner-ring suburban communities. Those neighborhoods also had many older homeowners with equity in their homes who were convinced to refinance with subprime mortgages. These were often so-called cash-out refinancings whereby the new mortgage was substantially larger than the one being refinanced, providing the owner with a cash windfall. However valuable that may have been to many financially strapped owners, it wiped out their home equity, all but ensuring that when the bubble burst their position would be untenable.

FIGURE 5.9.    Percent change in house sales prices in four Chicago neighborhoods, 2000 to 2017: prices in the two predominantly Black neighborhoods fell more sharply from 2007 to 2011 and have recovered less since than prices in the two predominantly white neighborhoods.

(Authors’ work based on Zillow/PolicyMap data)

The result was rapid, unsustainable house price increases in those neighborhoods and subsequent collapse. Figure 5.9 compares the two Chicago Black middle neighborhood tracts with the two upscale/gentrifying neighborhood tracts from table 5.2 in terms of the percentage rise and fall in prices from 2000 to 2017. Prices rose faster in percentage terms in the middle neighborhoods but dropped much faster when the bubble burst. When that happened, both long-term owners and recent buyers found out almost overnight that their equity had disappeared and that they were “underwater,” meaning that the value of their property was less than the amount of their mortgage. This, multiplied by the hundreds of similar neighborhoods around the United States, was the principal reason for the drastic erosion of Black wealth in recent years. Black homeowners in a single census tract of the hundred or so in St. Louis lost $35 million in home equity between 2008 and 2016.19

The end of the bubble set off a cascade of destabilizing effects. Millions of homes whose owners were unable to either make escalating payments or refinance went into foreclosure. As the banks took and resold the properties, the new buyers were most often absentee investors taking advantage of discount prices offered by the banks. From 2000 to 2018 the number of homeowners in the South Shore tract dropped by 28 percent and in the Chatham tract by 47 percent. While many of the new landlords were responsible owners, others capitalized on the depressed prices of the houses to milk them for a few years and then walk away. Vacant properties more than doubled in both areas between 2000 and 2018. Meanwhile, many of the remaining homeowners left, losing confidence in the neighborhood and feeling helpless to undo the destruction taking place.

The continuing crisis of Black middle neighborhoods and of inner-ring suburbs is discussed further in chapters 11 and 13. Most of these neighborhoods have never fully recovered from the trauma of 2007–2009 despite a decade of steady national economic recovery since then. Writing as we are in the midst of the COVID-19 pandemic, though, we inevitably face questions about the pandemic’s potential effect on the future trajectory of urban and suburban neighborhoods after the immediate health effects have passed.

Obviously, it is impossible to know the answer at this point. The steps taken to manage the pandemic have led to a severe recession, and while the economy is recovering strongly, the nature of the recovery has prompted a dangerous level of inflation. Even if the recovery is sustained and inflation is brought under control, however, there is reason for substantial concern for those neighborhoods that were already struggling before the onset of the pandemic.

First, the restrictions imposed by the pandemic meant that thousands of businesses closed, putting their employees out of work. The effect was widespread unemployment; although it abated to some extent during the summer and fall of 2020, it was still high in early 2021. While massive federal aid, culminating in the American Rescue Plan enacted early in 2021, has mitigated much of the hardship associated with mass unemployment, millions of families are still struggling.

The effect of economic hardship on people’s housing conditions is straightforward. Many homeowners have been unable to make mortgage payments, while many renters, particularly low-income renters with little or no financial cushion, have been unable to pay their rent. As of mid-November 2020, the US Census Bureau reported that more than one out of five tenants, including more than one of four tenants earning under $25,000 and nearly one out of three Black tenants, were behind on their rent.20 Recognizing the importance of keeping people in their homes during the pandemic, the federal government and most states imposed moratoria on evictions. The federal moratorium ended in mid-2021, while state moratoria ended early in 2022.

Rent arrearages place not only the tenant but also the landlord at risk. Many landlords, particularly the small mom-and-pop landlords who own most of the one- to four-family properties in lower-income areas, lack the cash reserves to absorb large amounts of rent arrears before having difficulty themselves making mortgage payments or paying property taxes. Federally funded emergency rental assistance programs will benefit many tenants and landlords, but at this point it is impossible to tell how many and how that will address a problem potentially affecting millions of tenants.

It is unknown how many tenants will eventually face eviction. Much will hinge on the extent to which cities and states will be able to quickly get the large amounts of federal funds into the hands of tenants and landlords. As of this writing at the beginning of 2022, the money is indeed getting into their hands, but it is becoming clear that it is far from enough. In the meantime, some tenants will probably work things out with their landlords. That number may be large, because some landlords may prefer to forgive rent arrears and retain a tenant who resumes rent payments rather than evict the tenant and face the uncertainty as well as the short-term loss of finding a new tenant. Landlord behavior is likely to vary with the strength of the local housing market; in a strong market, some may perceive the benefits of eviction as outweighing the risks, especially if it offers an opportunity to increase the rent. In the final analysis, however, widespread evictions and landlord insolvency, which can lead to deferred maintenance and potential abandonment, are potentially significant neighborhood destabilizing events over and above the personal hardships they create.

Foreclosures can have comparable effects, but for a variety of reasons they are not likely to be as great a problem, at least on a national level, as rental arrears. As with rental arrears, a variety of state and federal mortgage forbearance measures have been put into effect during the pandemic. As they end, however, a wave of foreclosures comparable to that of the foreclosure crisis appears unlikely. First, many lenders have already indicated that assuming the owner resumes payment, arrears will be added to the end of the loan term rather than coming due immediately. Second, in contrast to 2008, when millions of homes went underwater, housing prices have been rising steadily in most areas since 2020. As a result, most homeowners will have not only equity but also strong expectations of future appreciation, both of which they will want to preserve. Finally, most homeowners have deeper financial cushions than renters. Nationally, as of mid-November 2020, less than 1 percent of homeowners were in arrears on mortgage payments.21 Despite this, the risk of foreclosure remains, particularly in neighborhoods such as the Black middle neighborhoods discussed earlier, where both homeowner equity and expectations may be lower.

The third issue is the extent to which the pandemic has affected and will continue to affect small businesses. Small businesses, particularly those serving lower-income communities and those owned by people of color, are highly vulnerable to business downturns. Many businesses have closed due to the pandemic, and the risk that many will not reopen or, if they do, will fail to recover and subsequently close is high. A Federal Reserve study estimated that two hundred thousand more businesses than usual closed during the first year of the pandemic, disproportionately concentrated in the restaurant, service, and retail sectors.22 All of these not only employ large numbers of lower-wage workers but also represent a large share of the commercial activity in neighborhood-level commercial districts. Large numbers of permanent business closings in those areas will destabilize neighborhoods by reducing the incomes of many neighborhood residents and by undermining neighborhood anchor districts.

The effect of these factors will not become fully clear for some time. As we saw with the foreclosure crisis and the Great Recession, the long-term effects of the pandemic are likely to be uneven. However, lower-income neighborhoods and their residents, especially areas with high shares of lower-income renters, may experience far worse effects than more affluent ones.

Closing Note

In later chapters we drill down into each of the different trends and dynamics we have touched upon in this chapter. But they raise many troubling concerns about the future of the American neighborhood. The first, as reflected in the effects of the foreclosure crisis, the Great Recession, and the COVID-19 pandemic, is the fragility of many neighborhoods and their susceptibility to exogenous shocks. Over the period covered in this chapter, neighborhoods have suffered two such shocks, one of which is still ongoing as this is written, with profoundly destabilizing effects. Fragility is closely tied to economic conditions and race, the latter reflecting the extent to which the US housing market is racially divided, over and above its economic stratification.

Although the effect of the COVID-19 pandemic will gradually wane, it would be foolish to believe that there will be no future shocks. By their nature, shocks of this sort are what Nassim Nicholas Taleb has dubbed “black swans,” events that cannot be predicted yet can be expected to happen periodically.23 We cannot anticipate what the next black swan will be, whether it will be another pandemic, an economic crisis, an effect of climate change, or something that we cannot even imagine at this point. What we can anticipate is that shocks will happen and will affect the lives of people, their cities, and their neighborhoods in unpredictable ways. If both history and current experience are a guide, unless we fundamentally change the way we think about neighborhoods and what we do to help build their resilience, future crises will disproportionately harm lower-income people and their neighborhoods.

A second and even more fundamental question is whether the good neighborhood is becoming an elite good. The last two decades have seen an increasing pattern of neighborhood polarization, with elite gentrification existing side by side with increased concentrated poverty and the decline of middle neighborhoods. We are not suggesting that the decent yet unpretentious working-class or middle-class neighborhoods that formed such a central part of the American physical and social environment a few generations ago no longer exist. They do, but the evidence suggests that they are fewer than they once were, particularly in the urban areas where they were once the norm.

As millions of well-paying blue-collar jobs have been lost and economic inequality has grown, the middle class has shrunk. The neighborhood effects of that hollowing out have been exacerbated by the process of “economic sorting” described by Bischoff and Reardon, in which neighborhoods are becoming increasingly homogeneous economically, with more rich neighborhoods, more poor ones, and fewer in the middle.24 Both trends have eaten away at the universe of middle-income good neighborhoods. In older Rust Belt cities such as Chicago, the number of middle-income neighborhoods dropped by over one-third from 2000 to 2017. Even in Phoenix, which has seen a steady influx of middle-income households, the number of middle-income census tracts dropped by 10 percent. While we believe that a poor neighborhood can be a good neighborhood, such neighborhoods face daunting challenges. The question of whether we as a society can address those challenges and restore the opportunity for more people to live in good neighborhoods remains to be answered.
