In this work, we address Bayesian inference problems in the presence of right censored data. First, we propose dual ϕ-divergence Bayes type estimators for parametric models and establish their asymptotic normality; to prove this result, we also establish a uniform strong law of large numbers. Then, we consider the construction of prior distributions for model selection using ϕ-divergences. Finally, we study the estimation of the predictive density based on ϕ-divergences. We apply an expansion result for the generalized Bayesian predictive density to two parametric models widely used in survival analysis, namely the Weibull and the inverse Weibull models, under right censoring. We also assess the performance of the proposed methods through simulations and real data applications. The results of these studies show that the proposed dual ϕ-divergence Bayes type estimators are more robust than other Bayesian estimators. Moreover, the generalized Bayesian predictive density performs better than the classical estimative density, especially for the inverse Weibull model.