Abstract

Maximum likelihood estimation (MLE) of the four-parameter kappa distribution (K4D) is known to be occasionally unstable for small sample sizes and to be very sensitive to outliers. To overcome these problems, this study proposes a Bayesian analysis of the K4D. Bayesian estimators are obtained from the posterior distribution, which is sampled with the random walk Metropolis–Hastings algorithm, under five different priors. The properties of the Bayesian estimators are verified in a simulation study, and the empirical Bayes approach turns out to work well. Our approach is then compared to the MLE and to L-moments estimation (LME) by calculating the 20-year return level, its confidence interval, and various goodness-of-fit measures; it is also compared to modeling with the generalized extreme value (GEV) distribution. We illustrate the usefulness of our approach in applications to the annual maximum wind speeds in Udon Thani, Thailand, and to the annual maximum sea levels in Fremantle, Australia. In the latter example, non-stationarity is modeled through a trend in time in the location parameter. We conclude that Bayesian inference for the K4D can be substantially useful for modeling extreme events.
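
For concreteness, the following is a minimal sketch of the kind of sampler described above: a Gaussian random walk Metropolis–Hastings algorithm targeting the K4D posterior under Hosking's parameterization. The flat prior, starting values, and proposal step sizes are placeholder assumptions rather than any of the five priors studied in the paper, and the limiting cases k = 0 and h = 0 are ignored for brevity.

```python
import numpy as np

def k4d_loglik(theta, x):
    """Log-likelihood of the four-parameter kappa distribution (Hosking, 1994).

    theta = (xi, alpha, k, h); the k = 0 and h = 0 limiting cases are omitted.
    Returns -inf whenever an observation falls outside the implied support.
    """
    xi, alpha, k, h = theta
    if alpha <= 0:
        return -np.inf
    t = 1.0 - k * (x - xi) / alpha            # must be positive on the support
    if np.any(t <= 0):
        return -np.inf
    u = 1.0 - h * t ** (1.0 / k)              # equals F(x)**h; must be positive
    if np.any(u <= 0):
        return -np.inf
    log_F = np.log(u) / h
    return np.sum(-np.log(alpha) + (1.0 / k - 1.0) * np.log(t) + (1.0 - h) * log_F)

def flat_log_prior(theta):
    """Placeholder improper flat prior (not one of the paper's five priors)."""
    return 0.0 if theta[1] > 0 else -np.inf

def rw_metropolis(x, theta0, log_prior=flat_log_prior, n_iter=20000,
                  step=(0.1, 0.1, 0.05, 0.05), seed=1):
    """Gaussian random walk Metropolis-Hastings sampler for the K4D posterior."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = k4d_loglik(theta, x) + log_prior(theta)
    draws = np.empty((n_iter, 4))
    for i in range(n_iter):
        prop = theta + rng.normal(scale=step)          # symmetric proposal
        lp_prop = k4d_loglik(prop, x) + log_prior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws
```

In practice the step sizes would be tuned for a reasonable acceptance rate, a burn-in portion discarded, and quantities such as the 20-year return level computed from the retained draws via the K4D quantile function.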

Highlights

  • Introduced by Hosking [1], the four-parameter kappa distribution (K4D) is a generalized form of some frequently used distributions such as the generalized Pareto, logistic, Gumbel, and generalized extreme value (GEV) distributions (its distribution function and the special cases it nests are sketched after this list)

  • A feature of P4 is that the mean values of k and h are estimated by L-moments estimation (LME) from the data, which can be viewed as an empirical Bayes approach (an illustrative sketch of such a prior is given after this list)

  • The prior P4 results in low RMSEQ values over all combinations of k and h. This is possibly because the mean values of k and h are estimated by LME from data
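
For reference, the K4D distribution and quantile functions mentioned in the first highlight can be written as below (following Hosking [1], with k ≠ 0 and h ≠ 0; the GEV and Gumbel cases arise as limits):

```latex
F(x) = \left\{ 1 - h\left[ 1 - \frac{k(x-\xi)}{\alpha} \right]^{1/k} \right\}^{1/h},
\qquad
x(F) = \xi + \frac{\alpha}{k}\left[ 1 - \left( \frac{1 - F^{h}}{h} \right)^{k} \right].
% Special cases nested by the K4D:
%   h = 1              : generalized Pareto
%   h \to 0            : generalized extreme value (GEV)
%   h = -1             : generalized logistic
%   k \to 0,\ h \to 0  : Gumbel
```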

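The exact form of the priors compared in the paper is not reproduced on this page; purely to illustrate the empirical Bayes idea behind P4 described in the highlights, the sketch below centres independent normal priors for k and h at L-moment point estimates computed from the data. The normal form, the standard deviations, and the argument names k_lme and h_lme are assumptions made for illustration, not the paper's specification.

```python
import numpy as np
from scipy.stats import norm

def empirical_bayes_log_prior(theta, k_lme, h_lme, sd_k=0.5, sd_h=0.5):
    """Illustrative empirical-Bayes-style log prior for the K4D parameters.

    k_lme and h_lme are point estimates of k and h obtained beforehand by
    L-moments estimation from the observed sample and used here only as prior
    means; the normal form and the standard deviations are assumed for this
    sketch and are not the prior actually used in the paper.
    """
    xi, alpha, k, h = theta
    if alpha <= 0:                 # flat prior on xi, log-flat prior on alpha > 0
        return -np.inf
    return (norm.logpdf(k, loc=k_lme, scale=sd_k)
            + norm.logpdf(h, loc=h_lme, scale=sd_h)
            - np.log(alpha))
```

Such a prior could be passed to the sampler sketched after the abstract, e.g. as `lambda th: empirical_bayes_log_prior(th, k_lme, h_lme)`.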

Summary

Introduction

Introduced by Hosking [1], the four-parameter kappa distribution (K4D) is a generalized form of some frequently used distributions such as the generalized Pareto, logistic, Gumbel, and generalized extreme value (GEV) distributions. We found in this study that MLE of the K4D with a small sample produces relatively large variances of the estimated parameters compared to L-moments estimation (LME). In dealing with extreme events such as annual maximum daily precipitation, 7,305 daily observations over 20 years are reduced to only 20 annual maxima. This loss of information can often lead to unreliable model estimates, especially for very high quantiles, which may have large variance. Bayesian inference offers an advantage in extreme value analysis with small samples, as it allows prior knowledge to be combined with the information provided by the observed data to improve statistical inference. The technical specifics and some figures are provided in the accompanying supplementary material.
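
To make the data-reduction step concrete, the short sketch below reduces a 20-year daily series (7,305 values) to its 20 annual maxima; the synthetic data and column names are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

# Hypothetical 20-year daily precipitation series (2000-2019): 7,305 daily values.
dates = pd.date_range("2000-01-01", "2019-12-31", freq="D")
daily = pd.DataFrame({
    "date": dates,
    "precip": np.random.default_rng(0).gamma(2.0, 5.0, len(dates)),
})

# Block maxima: keep one value per year, reducing 7,305 observations to 20.
annual_max = daily.groupby(daily["date"].dt.year)["precip"].max()
print(len(daily), "->", len(annual_max))   # 7305 -> 20
```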

Four-Parameter Kappa Distribution
Maximum Likelihood Estimation
L-Moments Estimation
Bayesian Inference
Computation by Markov Chain Monte Carlo
Prior Specification
Simulation Study
Applications to Real-World Data
Methods
Findings
Conclusions and Discussion