Summary

An entropy function is introduced to quantify the uncertainty in reservoir engineering parameters that are represented as random variables and described by probability distributions. When combined with a given algorithm and a Monte Carlo simulation routine, the concept can be used to estimate the degree of certainty required in the input variables to yield a given certainty in the output parameters. The method is illustrated with a simplified model for water-alternating-gas (WAG) miscible flooding. Two ranking schemes for five input variables are established and compared. The rankings assess the overall model response, relative to a base case, with respect to the average value and the degree of certainty of each input variable.

Introduction

Many reservoir engineering studies rely on data that contain some known or unknown error. Measurements are not always available, and key parameters are often estimated by other means. To account for this uncertainty, quantities can be described by probability distributions. Any parameter that depends on one or more of these quantities has a probability distribution that is a function of the input probability distributions. Uncertainties often accumulate, leading to variances of output parameters that are unacceptably high. If the overall uncertainty of the input variables is reduced, the output variances will be lowered. The commitment of manpower and financial resources necessary to achieve this reduction, however, can be substantial. It is therefore important to estimate a priori the reduction in uncertainty required for each input parameter to obtain a given increase in output certainty; currently used sensitivity analyses do not yield this information. Combining an entropy function with a Monte Carlo simulation approach yields a new method that provides these estimates.
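The idea above can be sketched in a few lines of code. This is a minimal illustration, not the paper's method: the model f() and the two input distributions are hypothetical placeholders (the actual study uses a simplified WAG flooding model with five input variables), and the entropy is computed on a fixed-width histogram of the Monte Carlo output.

```python
import math
import random

def sample_entropy(samples, width=0.25):
    """Entropy (in nats) of a fixed-bin-width histogram of the samples.
    With a fixed bin width, a narrower output distribution occupies
    fewer bins and therefore has lower entropy."""
    counts = {}
    for x in samples:
        b = math.floor(x / width)
        counts[b] = counts.get(b, 0) + 1
    n = len(samples)
    return -sum(c / n * math.log(c / n) for c in counts.values())

def f(porosity, thickness):
    """Toy output parameter; a hypothetical stand-in for the reservoir model."""
    return porosity * thickness

random.seed(0)
N = 20000

# Base case: both inputs carry their full uncertainty.
base = [f(random.gauss(0.20, 0.05), random.gauss(30.0, 6.0)) for _ in range(N)]

# Improved case: porosity is measured more precisely (smaller std. dev.).
tight = [f(random.gauss(0.20, 0.01), random.gauss(30.0, 6.0)) for _ in range(N)]

h_base = sample_entropy(base)
h_tight = sample_entropy(tight)
print(h_base > h_tight)  # reducing input uncertainty lowers output entropy
```

Repeating such runs while tightening each input in turn gives the kind of a priori ranking the paper describes: it shows which input's uncertainty must be reduced, and by how much, to reach a target output certainty.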
S Value

The notion of entropy, originally introduced in its differential form dS = dQ/T in thermodynamics, has been successfully applied in other fields as a measure of randomness or disorder. In statistical physics, it quantifies the closeness of a system to equilibrium and plays a central role in describing the thermodynamic properties of ideal gases. In communication theory, it specifies the average amount of information transmitted per symbol. In ergodic theory, it is used to characterize partitions and measure-preserving transformations. In terms of the underlying measure space, entropy is defined similarly in all three cases.
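As a sketch of that common structure, the discrete forms used in the three fields differ only in the constant and the interpretation of the probabilities (the notation here is the standard one, not taken from this paper):

```latex
% Statistical physics (Gibbs entropy over microstates):
S = -k \sum_i p_i \ln p_i
% Communication theory (Shannon entropy per symbol, in bits):
H = -\sum_i p_i \log_2 p_i
% Ergodic theory (entropy of a finite partition \alpha of a measure space):
H(\alpha) = -\sum_{A \in \alpha} \mu(A) \ln \mu(A)
```

In each case the sum runs over a partition of the underlying measure space, and the function attains its maximum when all outcomes are equally probable, i.e., when uncertainty is greatest.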