Abstract
For an unknown continuous distribution on the real line, we consider approximate estimation by discretization. There are two discretization methods. The first divides the real line into several intervals before the samples are taken ("fixed interval method"). The second divides the real line at estimated percentiles after the samples are taken ("moving interval method"). Either way, the problem reduces to the estimation of a multinomial distribution. We use a (symmetrized) f-divergence to measure the discrepancy between the true distribution and the estimated distribution. Our main result is an asymptotic expansion of the risk (i.e., the expected divergence) up to the second-order term in the sample size. We prove that the moving interval method is asymptotically superior to the fixed interval method. We also examine how the presupposed intervals (fixed interval method) or percentiles (moving interval method) affect the asymptotic risk.
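To illustrate the two discretization schemes described above, here is a minimal Monte Carlo sketch in Python. It is not the paper's asymptotic analysis: the standard normal truth, the particular breakpoints and percentile levels, the choice of KL divergence as the f-divergence, the mild smoothing of empty cells, and the use of NumPy/SciPy are all illustrative assumptions.

```python
# Hedged sketch: Monte Carlo estimate of the risk (expected KL divergence)
# for the fixed interval method vs. the moving interval method, assuming a
# standard normal true distribution.  Not the paper's derivation.
import numpy as np
from scipy import stats


def kl(p, q):
    """Kullback-Leibler divergence D(p || q) between multinomial cell probabilities."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))


def fixed_interval_risk(n, breaks, dist=stats.norm, reps=2000, seed=0):
    """Expected divergence when the cell boundaries are fixed before sampling."""
    rng = np.random.default_rng(seed)
    breaks = np.asarray(breaks, float)
    edges = np.concatenate(([-np.inf], breaks, [np.inf]))
    true_p = np.diff(dist.cdf(edges))                 # true cell probabilities
    risks = []
    for _ in range(reps):
        x = dist.rvs(size=n, random_state=rng)
        counts = np.bincount(np.searchsorted(breaks, x), minlength=len(true_p))
        # Mild add-constant smoothing (an assumption) so no estimated cell is empty.
        est_p = (counts + 0.5) / (n + 0.5 * len(true_p))
        risks.append(kl(true_p, est_p))
    return np.mean(risks)


def moving_interval_risk(n, levels, dist=stats.norm, reps=2000, seed=0):
    """Expected divergence when boundaries are estimated sample percentiles."""
    rng = np.random.default_rng(seed)
    levels = np.asarray(levels, float)
    target_p = np.diff(np.concatenate(([0.0], levels, [1.0])))   # presupposed cell masses
    risks = []
    for _ in range(reps):
        x = dist.rvs(size=n, random_state=rng)
        q_hat = np.quantile(x, levels)                # estimated percentiles define the cells
        edges = np.concatenate(([-np.inf], q_hat, [np.inf]))
        true_p = np.diff(dist.cdf(edges))             # true mass of the data-driven cells
        risks.append(kl(true_p, target_p))
    return np.mean(risks)


if __name__ == "__main__":
    n = 200
    print("fixed interval :", fixed_interval_risk(n, breaks=[-1.0, 0.0, 1.0]))
    print("moving interval:", moving_interval_risk(n, levels=[0.25, 0.5, 0.75]))
```

Under these assumptions, decreasing the two Monte Carlo averages as n grows (and comparing them at matched cell counts) is one informal way to see the asymptotic comparison the abstract states.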