Abstract

Strategic situations create motivational biases that help predict the types of errors intelligence communities are most likely to commit (a Type I error predicts behavior that is never observed, while a Type II error fails to predict behavior that is later observed). When the dangers of inaction are low and the costs of action high, the intelligence community is more likely to fail to predict threats (Type II error). When the dangers of inaction are high and the costs of military action low, it is more likely to mistakenly predict threats that are never observed (Type I error). Case studies of US and Israeli decision-making and two new experimental studies support this theory. The key is to recognize the incentives for error and to develop systems that, at worst, lead to intelligence errors (mistakes consistent with a state's national security needs) rather than intelligence failures (errors contrary to national security requirements).
