Abstract

Before the founding of the Society for Risk Analysis (SRA) in 1980, food safety in the United States had long been a concern, but there was a lack of systematic methods to assess food-related risks. In 1906, the U.S. Congress passed, and President Roosevelt signed, the Pure Food and Drug Act and the Meat Inspection Act to regulate food safety at the federal level. These Acts followed the publication of multiple reports of food contamination, culminating in Upton Sinclair's novel The Jungle, which highlighted food and worker abuses in the meatpacking industry. Later in the 20th century, important developments in agricultural and food technology greatly increased food production. But chemical exposures from agricultural and other practices resulted in major amendments to federal food laws, including the Delaney Clause, aimed specifically at cancer-causing chemicals. Later still, when quantitative risk assessment methods were given greater scientific status in a seminal National Research Council report, food safety risk assessment became more systematized. Additionally, in these last 40 years, food safety research has resulted in increased understanding of a range of health effects from foodborne chemicals, and technological developments have improved U.S. food safety from farm to fork by offering new ways to manage risks. We discuss the history of food safety and the role risk analysis has played in its evolution, starting from over a century ago, but focusing on the last 40 years. While we focus on chemical risk assessment in the U.S., we also discuss microbial risk assessment and international food safety.
