A random differential equation, or stochastic differential equation with parametric uncertainty, is a classical differential equation whose input values (coefficients, initial conditions, etc.) are random variables. Given data, the probability distributions of the input random parameters must be appropriately inferred before proceeding to simulate the model’s output. This task is called inverse uncertainty quantification. In this paper, the goal is to study the applicability of the Bayesian bootstrap for drawing inferences on the posterior distributions of the parameters, by resampling the residuals of the deterministic least-squares optimization with Dirichlet weights. The method is based on repeated deterministic calibrations. Thus, to alleviate the curse of dimensionality, the technique may be combined with the principle of maximum entropy for densities when some parameters are not optimized deterministically. To illustrate the methodology, two case studies on important health topics are conducted, with stochastic fitting to data. The first concerns past alcohol consumption in Spain, taking social contagion into account; the second models HIV evolution in terms of CD4+ T cells and viral load for a patient in clinical follow-up. Both applied models are built from a compartmental viewpoint, with a randomized basic reproduction number that controls the long-term behavior of the system.
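As an informal illustration of the calibration scheme described above, the sketch below shows a Bayesian bootstrap in Python for a single parameter of a toy logistic-growth model: each replicate draws Dirichlet(1, ..., 1) weights over the observations and repeats the deterministic weighted least-squares fit, so the collection of fitted values approximates a posterior sample. The model, data, and parameter values are hypothetical placeholders, not those of the paper's case studies.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

# Hypothetical synthetic data for a logistic-growth model standing in for a
# compartmental model; observation times and noise level are illustrative only.
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 10.0, 15)
y_obs = 1.0 / (1.0 + 9.0 * np.exp(-0.8 * t_obs)) + rng.normal(0.0, 0.02, t_obs.size)

def model(theta, t):
    """Solve the deterministic ODE y' = r*y*(1 - y) for growth rate r = theta[0]."""
    r = theta[0]
    sol = solve_ivp(lambda s, y: r * y * (1.0 - y), (t[0], t[-1]), [0.1],
                    t_eval=t, rtol=1e-8)
    return sol.y[0]

def weighted_ls(theta, weights):
    """Weighted sum of squared residuals between model output and data."""
    res = model(theta, t_obs) - y_obs
    return np.sum(weights * res ** 2)

# Bayesian bootstrap: each replicate reweights the observations with
# Dirichlet(1, ..., 1) weights and repeats the deterministic calibration.
B = 200
draws = np.empty(B)
for b in range(B):
    w = rng.dirichlet(np.ones(t_obs.size))
    fit = minimize(weighted_ls, x0=[0.5], args=(w,), method="Nelder-Mead")
    draws[b] = fit.x[0]

print("posterior mean of r:", draws.mean())
print("95% credible interval:", np.percentile(draws, [2.5, 97.5]))
```

The resulting histogram of `draws` serves as the (approximate) posterior of the calibrated parameter; in the paper's setting, the same loop would be run over the parameters of the compartmental models, with maximum-entropy densities handling those parameters that are not optimized deterministically.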