The physical and the cognitive mesh and clash in a dynamic syncretism that includes not only the overwhelming influences of culture and path-dependent historical trajectory, but also the inevitable burdens of Clausewitzian fog and friction. Such factors sculpt conflict at all scales and levels of organization. We show how institutional cognition, always and inevitably embodied, even when ‘enhanced’ by artificial intelligence or related machinery, must suffer a significantly increased probability of failure under conditions of fog and friction, particularly for ‘wickedly hard’ tactical, operational, and strategic problems that are not amenable to straightforward, if difficult, engineering solutions. Perhaps not surprisingly, problems for which ‘big data’ optimization fails, particularly at the strategic level, enter realms of subtlety parallel to those that confound the understanding and representation of individual human consciousness. In particular, we develop probability models of cognitive failure on wickedly hard problems, models that may be converted into statistical tools for the empirical study and limited control of these phenomena.
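As a purely illustrative sketch, and not the construction developed in the body of the paper, one way such a probability model might be parameterized is as a monotone function of a scalar fog-and-friction index relative to institutional cognitive capacity; the symbols $P_{\mathrm{fail}}$, $f$, $c$, and $k$ are hypothetical quantities introduced here only for illustration:
\[
  P_{\mathrm{fail}}(f) \;=\; \frac{1}{1 + \exp\!\bigl[-k\,(f - c)\bigr]}, \qquad k > 0,
\]
where $f$ indexes the burden of fog and friction, $c$ the capacity of the embodied institutional cognition (however augmented), and $k$ the sharpness with which failure becomes likely once that capacity is exceeded. Under this kind of hedged toy form, the qualitative claim of the abstract, that failure probability rises steeply for wickedly hard problems under fog and friction, corresponds to $f$ approaching or exceeding $c$.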