2014 marks the centennial of the outbreak of World War I, the first war in which chemical weapons were used on a large scale. Although poisons have been used in warfare for centuries, it was rapid advances in science and engineering, together with the rise of the modern chemical industry, that made the mass production of toxic chemicals possible. The horrors of gas warfare led to the signing of the Geneva Protocol in 1925, which banned the use of “asphyxiating, poisonous or other gases” in war but did not prohibit the production or stockpiling of chemical agents. During the Cold War, large arsenals of nerve agents such as Sarin and VX were amassed; these agents are more than a thousand times as lethal as the traditional World War I agents. Tragically, chemical weapons were also used repeatedly over the past century, most extensively during the Iran–Iraq War. These developments persuaded the international community that it was time to negotiate a comprehensive ban on chemical weapons, one that would ultimately establish a strong, universal norm against such armaments. That effort culminated in the adoption and signing of the Chemical Weapons Convention (CWC) in 1993 and its entry into force in 1997 [1].