Why a Good Verification System Can Give Ambiguous Evidence
The recent debate on ambiguity provides lessons for future arms control verification. American officials sought verification systems that returned unambiguous evidence about Soviet compliance, but a simple model of verification suggests that a more ambiguous verification scheme can nonetheless be better: more effective both in deterring violations and in avoiding false alarms. The reason is as follows. Should the inspecting party come upon suspicious evidence, it will, on the one hand, have a reason to trust that evidence more, since it was returned by a more reliable verification system. On the other hand, it will have a reason to be more sceptical that the other party is violating, since the other would probably not dare to cheat in the face of the improved verification technology. In some situations a reasonable inspector will regard the second factor as weightier than the first, and so give lower credence to the evidence. Ambiguity in verification is a tricky notion, and misunderstandings about it arise from two sources: from the vocabulary of verification, which suggests that one dichotomously "detects" or "does not detect" a violation, when in fact evidence comes in gradations; and from the human tendency not to look at the situation from the other party's viewpoint. The model uses the logic of game theory to represent the strategic aspects of the situation, and it has a mathematical feature that distinguishes it from past models, the notion of continuous degrees of evidence, which gives a proper account of ambiguity. It also clarifies past technical studies of verification by locating them within the model's structure.
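The trade-off described above can be sketched as a Bayesian calculation. The sketch below is illustrative only: the probability values are assumptions chosen for the example, not figures from the model, and the function name `posterior_violation` is hypothetical. A more reliable system raises the detection rate and lowers the false-alarm rate, but, through deterrence, it can also sharply lower the prior probability of a violation; the inspector's posterior credence on seeing suspicious evidence can then fall.

```python
def posterior_violation(prior_violation, p_detect_given_violation, p_false_alarm):
    """Bayes' rule: the inspector's credence that the other party is
    violating, given that the system returned suspicious evidence."""
    true_positive = prior_violation * p_detect_given_violation
    false_positive = (1 - prior_violation) * p_false_alarm
    return true_positive / (true_positive + false_positive)

# Weaker system (assumed figures): noisy sensor, weak deterrence,
# so a violation is relatively likely a priori.
weak = posterior_violation(prior_violation=0.30,
                           p_detect_given_violation=0.60,
                           p_false_alarm=0.20)

# Stronger system (assumed figures): reliable sensor, but precisely because
# of that reliability the other side rarely dares to cheat.
strong = posterior_violation(prior_violation=0.02,
                             p_detect_given_violation=0.90,
                             p_false_alarm=0.05)

print(round(weak, 3), round(strong, 3))  # → 0.562 0.269
```

Under these assumed numbers the better system yields the *lower* credence in violation (about 0.27 versus 0.56), matching the abstract's claim that a reasonable inspector may weigh the deterrence effect more heavily than the improved reliability of the evidence.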