Objective: Probabilistic sensitivity analysis (PSA) is conducted to account for uncertainty in the costs and effects of the decision options under consideration. PSA involves obtaining a large sample of input parameter values (N) to estimate the expected cost and effect of each alternative in the presence of parameter uncertainty. When the analysis uses stochastic models (e.g., individual-level models), the model is further replicated P times for each sampled parameter set. We study how N and P should be determined.

Methods: We show that PSA can be structured such that P can be an arbitrary number (say, P = 1). To determine N, we derive a formula based on Chebyshev's inequality such that the error in estimating the incremental cost-effectiveness ratio (ICER) of alternatives (or, equivalently, the willingness-to-pay value at which the optimal decision option changes) is within a desired level of accuracy. We describe two methods to confirm, visually and quantitatively, that the N informed by this approach yields ICER estimates within the specified level of accuracy.

Results: When N is arbitrarily selected, the estimated ICERs can differ substantially from the true ICER (even as P increases), which can lead to misleading conclusions. Using a simple resource allocation model, we demonstrate that the proposed approach minimizes the potential for this error.

Conclusions: The number of parameter samples in probabilistic CEAs should not be arbitrarily selected. We describe three methods to ensure that enough parameter samples are used in probabilistic CEAs.
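To give a flavor of how a Chebyshev-based rule for N works, the sketch below shows a generic sample-size bound for estimating a mean to within a tolerance epsilon with failure probability at most alpha. This is an illustrative simplification under assumed inputs (a known variance sigma^2 for the quantity of interest), not the paper's actual ICER-specific formula: Chebyshev's inequality gives P(|mean estimate - true mean| >= epsilon) <= sigma^2 / (N * epsilon^2), so requiring the right-hand side to be at most alpha yields the N below.

```python
import math


def chebyshev_sample_size(sigma: float, epsilon: float, alpha: float) -> int:
    """Minimum number of parameter samples N such that, by Chebyshev's
    inequality, the sample mean deviates from the true mean by at least
    `epsilon` with probability at most `alpha`.

    Chebyshev: P(|X_bar - mu| >= epsilon) <= Var(X_bar) / epsilon^2
                                           = sigma^2 / (N * epsilon^2).
    Setting the bound <= alpha and solving for N gives the formula below.
    Illustrative only; sigma, epsilon, and alpha are assumed inputs.
    """
    if sigma <= 0 or epsilon <= 0 or not 0 < alpha < 1:
        raise ValueError("require sigma > 0, epsilon > 0, 0 < alpha < 1")
    return math.ceil(sigma ** 2 / (alpha * epsilon ** 2))


# Example: standard deviation 10, tolerance 1, 5% failure probability.
n = chebyshev_sample_size(sigma=10.0, epsilon=1.0, alpha=0.05)
print(n)  # 10^2 / (0.05 * 1^2) = 2000
```

Because Chebyshev's inequality holds for any distribution with finite variance, an N chosen this way is conservative; distribution-specific bounds would typically allow a smaller sample.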