We investigate how basic probability inequalities can be extended to an imprecise framework, where (precise) probabilities and expectations are replaced by imprecise probabilities and lower/upper previsions. We focus on inequalities giving information on a single bounded random variable X, considering either convex/concave functions of X (Jensen's inequalities) or tail events of the form (X≥c) or (X≤c) (Markov's and Cantelli's inequalities). Concerning the consistency of the relevant imprecise uncertainty measures, our analysis considers coherence as well as weaker requirements, notably 2-coherence, which often proves sufficient. Jensen-like inequalities are introduced, as well as a generalisation of a recent improvement to Jensen's inequality. Some applications of these results are proposed: extensions of Lyapunov's inequality and inferential problems. After discussing upper and lower Markov's inequalities, Cantelli-like inequalities are proved under different consistency requirements on the related lower/upper previsions. In the case of coherent imprecise previsions, the corresponding Cantelli's inequalities employ Walley's lower and upper variances, generally ensuring better bounds.
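For orientation, the following is a minimal sketch of the classical (precise) forms of the inequalities the paper extends; it uses standard notation (E, P, Var) and does not reproduce the paper's imprecise analogues, which replace expectations and probabilities with lower/upper previsions.

```latex
% Classical precise forms; the paper's imprecise versions replace E and P
% with lower/upper previsions under coherence or 2-coherence.
\begin{align*}
  &\text{Jensen:}   && \varphi(E[X]) \le E[\varphi(X)]
      && \text{for } \varphi \text{ convex},\\
  &\text{Markov:}   && P(X \ge c) \le \frac{E[X]}{c}
      && \text{for } X \ge 0,\ c > 0,\\
  &\text{Cantelli:} && P\bigl(X - E[X] \ge c\bigr) \le \frac{\sigma^2}{\sigma^2 + c^2}
      && \text{for } c > 0,\ \sigma^2 = \operatorname{Var}(X).
\end{align*}
```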