In the half century following the Civil War, the mechanisms and institutions of production were dramatically transformed in the United States. In 1870, only a small percentage of the work force (workers engaged in the manufacture of textiles, guns, sewing machines, reapers, locks, and a few other items) could claim firsthand production experience in the industrial revolution; most nonfarm workers were artisans, plying their trades in small shops under employers who often worked beside them on similar tasks. By 1920, large industrial enterprises were commonplace, employers seldom had any personal knowledge of their employees, and much of the skill had been removed from the work process. Hundreds of jobs, including cigar rolling, coal mining, shoe lasting, granite cutting, meat packing, and glass making, were mechanized and/or electrified. Hundreds of others, from the machine shops to the new automobile factories of the 1910s, were divided and subdivided; a steelworker no longer made "steel," and a carpenter no longer made "furniture." Thousands of other workers (railroad workers, boilermakers, foundrymen) were employed in making or using machines, or in work that relied on recently developed chemical processes. Other manufacturing processes remained technically unchanged, but working conditions were influenced by dense urban crowding.

Not all these changes had negative consequences for the work environment. The larger factories built late in the period tended to be better lighted, for example, than those constructed in the mid-nineteenth century. On the whole, however, these changes increased health and safety hazards. Although not all the new occupations were dangerous or unhealthful, many involved greater risks than artisanship or farming. Workers in textile factories inhaled cotton lint. Foundry workers, subject to dramatic changes in temperature, contracted tuberculosis.1 Machinery and electricity caused thousands of accidents and deaths every year. The division of labor bored and numbed workers. Phosphorus, mercury, and lead, increasingly common ingredients in industrial processes, proved deadly or disfiguring for workers who handled them. The tiny urban basements (hundreds of them in Manhattan alone) in which bakeries were often located were hospitable environments for rodents and disease.2

By the turn of the century, all of this added up to what can only be described as a crisis. Take coal mining as an example. The first near-disaster in coal mining did not occur until 1856, when four men were buried in a mine and ultimately rescued. The first major disaster occurred in the anthracite region in 1869, and within two decades fourteen states had experienced at least one serious explosion. Still, nothing the nineteenth century could produce compared with the macabre record of the twentieth. In one particularly bad month in 1907, five major disasters hit the coal mines, killing 34, 361, 57, 239, and 11 miners. The railroads were not much safer.
Also in 1907, Collier's ran a two-page picture spread dealing with twenty-four railroad accidents that had killed 188 persons in a thirty-day period.3 That this crisis was not restricted to the nation's most dangerous industries is evident in the dramatic growth in accident-related civil litigation at every level of the court system in the decades after