Set Phasers on Stun is a fun read for any technology buff. The author, Steven Casey, presents 20 real stories about adverse incidents involving technology. The stories come from different industries, including maritime, aviation, energy, and healthcare. The common plot line in each story is a tragic accident in a complex system, or system of systems, where a combination of unintended, unanticipated events coincided to create the perfect storm. The purpose of putting these stories together in this fascinating compilation is to demonstrate “how technological failures result from incompatibilities between the way things are designed and the way people actually perceive, think, and act.” In short, each story deals with design-induced human error at the user interface.

At first glance, some of the examples seem like pure human error, such as the ferry crew that forgot to close the cargo doors before departing to cross the English Channel. A few are gruesome and not for the faint of heart, including the nuclear reactor engineer who was impaled to the ceiling by a flux-measuring rod from inside a reactor's coolant channel. All bring home the point that reliable, safety-critical technology needs to be designed from a systems safety perspective, one that minimizes incompatibilities between the human actor and the technology.

As a non-engineer, what I appreciated most about the book was its readability and the chance to learn about these real incidents. Technology experts will enjoy the stories because of the technology angle.

I had some difficulty figuring out all of the causes of each incident, even though I personally investigated numerous workplace-technology accidents in my early years of law practice. The author assumes a certain level of expertise and does not explain everything that was at play. The book would be great for a systems or human factors engineering class (or for pulling out as a final exam essay scenario), where students have to figure out all of the causes.
The book would have been stronger if it explained more about all of the causes, because it is too easy for each of us to apply our narrow lens and force a single causal conclusion (human error) prematurely.

Overall, I recommend the book for all of us lifelong students of technology safety because, if nothing else, it will make you think differently about the “stupidity” of the human who forgot to “close the cargo doors” in your healthcare organization. Product design teams could learn a lot from this book about the criticality of the human interface and the risk of assuming you know how humans will perceive, think, and act (after all, you are one). Healthcare technology management professionals could learn a lot as well, because you are planning, purchasing, installing, and maintaining technology, and advising clinicians on its use. The human interface belongs on your checklist of essential ingredients for every new product purchased.

For those eager to learn more about systems engineering and systems safety, AAMI has posted an excellent bibliography of additional reading materials that I believe is worth your time: www.aami.org/interoperability/Materials/Systems_Bibliography.pdf.