Prediction in the Bayesian framework is extended from the point of view of the renormalization group. For this purpose, we first clarify that the advantages of Bayesian statistical inference can be understood through an adaptive property of a long-distance length scale. This suggests a close connection between Bayesian statistical inference and the renormalization group. Next, we show that the cumulative entropic error can be rewritten as an effective action, which directly leads to a renormalization group equation in non-parametric Bayesian statistical inference. As a result, we introduce a scaling part into the prior distribution and determine it so as to obtain better prediction performance. We discuss how the prediction performance improves, taking a density estimation problem as an example.
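As an illustrative sketch (in our own notation, not necessarily that of the paper), the cumulative entropic error of sequential Bayesian prediction can be written as a minus log marginal likelihood, a free-energy-like quantity that can be read as an effective action in which the prior enters like a bare action:

\begin{align}
E_n \;&=\; -\sum_{k=1}^{n} \log p\!\left(x_k \mid x_1,\dots,x_{k-1}\right) \nonumber\\
&=\; -\log \int \prod_{k=1}^{n} p\!\left(x_k \mid \theta\right)\, \pi(\theta)\, d\theta ,
\end{align}

where $x_1,\dots,x_n$ are the observed data, $p(x\mid\theta)$ is the model, and $\pi(\theta)$ is the prior. Modifying the prior by a scaling part then amounts to shifting this effective action, which is the standard identity underlying the renormalization-group reading sketched above.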