At a transportation planning meeting several years ago, my suggestion that some government funds be allocated to public transit rather than new highways was immediately denounced by the mayor of a nearby small town as “social engineering.” The people of the Bay Area, and indeed of America, had chosen the highways and the suburbs; who was I to dare to subvert the democratic process?

It is generally assumed that suburbanization was a natural process, and that the growth of the national auto infrastructure and the shift in shopping patterns from small local stores to consolidated “big boxes” were somehow a natural evolutionary development. Indeed, if you look around, it certainly seems that way. After 50 years of suburbanization, most Americans live in suburbs, there are nearly as many cars as people in this country, and we drive about 2½ trillion miles a year. Those who put a little more thought into how things got this way will generally say that our current situation is the result of millions of individual choices: after careful consideration of the available options, an overwhelming majority of Americans freely decided on a suburban house with a two-car garage, an hour-long daily commute, two car payments, and weekly trips to the nearest Costco. In this view, the collective social and environmental consequences of these choices are unfortunate but inevitable side effects.

The fact is that there was no range of options to choose from, and people were not able to choose freely. Federal, state, and local governments, promoting the interests of financial institutions, merchants’ associations, corporations, and developers, created a legal and physical structure that provided overwhelming financial benefits for those who chose the suburbs, the choice that funneled the most money into corporate pockets.
These benefits were subsidized by those who were unwilling or unable to make the same choice: city dwellers, African Americans, and the poor (categories that soon substantially overlapped).

Before World War II, people had more options. At that time, 40% of Americans still lived in rural areas. Small towns were still viable. And big cities offered several styles of living for the middle class: apartments, townhouses, detached houses, even the “streetcar suburbs” designed and developed by private transit companies. This system was destabilized over three decades of disinvestment in the 1920s, depression in the 1930s, and war in the 1940s.

When the war was won and newly prosperous Americans were able to focus on rebuilding their lives and families, the federal government strongly influenced the way they went about it. Partly in response to a severe housing shortage, the federal government changed financial practices to make it easier for people to purchase houses on long-term credit, and set up programs to guarantee bank loans for their purchase. To ensure the safety of these guaranteed loans, the Federal Housing Administration (and later the Veterans Administration) used risk assessment maps inherited from the Depression-era Home Owners’ Loan Corporation. These maps showed a clear bias toward sprawl and social and racial segregation; areas that approached urban density or contained a racially or economically diverse population were by definition “declining” and a loan risk. This turned out to be a self-fulfilling prophecy as government and private investment money flowed into the “safe” (all-White, suburban) areas and was steered away from the “declining” (integrated, dense) areas. “Redlining” of working-class, urban, or minority neighborhoods is supposed to be illegal now, but the effects of more than 50 years of stratified capital investment will not disappear overnight.