Abstract

This paper studies bounds for the empirical robust Kullback-Leibler (KL) divergence problem, which has been proposed in the literature for universal hypothesis testing (UHT). The original formulation relies on bounds derived from the Lévy ball. We propose new bounds and show that they are tighter. We also introduce a new parameter that can be used to modify both the new and the existing bounds. A computational study is then devised to evaluate the power of the modified test at fixed sample sizes. The computational results suggest that the new proposals are promising, as they increase the adaptability of robust/composite hypothesis testing.
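To ground the setting, the classical (non-robust) universal hypothesis test of Hoeffding rejects the null distribution when the KL divergence between the empirical distribution of the sample and the null exceeds a threshold. The sketch below illustrates that baseline statistic only; it is not the paper's robust formulation, and the function name, the example distributions, and the threshold are all illustrative assumptions.

```python
import numpy as np

def empirical_kl(samples, p0, num_symbols):
    """D(q || p0), where q is the empirical distribution (type) of the sample.

    Uses the convention 0 * log(0 / x) = 0; assumes p0 > 0 on the support.
    """
    counts = np.bincount(samples, minlength=num_symbols)
    q = counts / counts.sum()
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p0[mask])))

# Hoeffding-style universal test: reject H0 when the statistic exceeds
# a threshold (here an arbitrary illustrative value, not the paper's choice).
rng = np.random.default_rng(0)
p0 = np.array([0.5, 0.3, 0.2])                           # hypothesized null law
samples = rng.choice(3, size=1000, p=[0.2, 0.3, 0.5])    # data from a different law
stat = empirical_kl(samples, p0, num_symbols=3)
reject = stat > 0.1
```

The robust version studied in the paper replaces the single null `p0` with an uncertainty set around it (a Lévy ball in the original formulation), which is where the bounds under discussion arise.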
