Abstract

Although generalized linear models (GLM's) are an attractive and widely used class of models, they are limited in the range of density shapes that they can provide. For instance, they are unimodal exponential families of densities in the response variable with tail behavior determined by the implicit mean-variance relationship. Dirichlet process (DP) mixing adds considerable flexibility to these models. Using such mixing, we develop models that we call DPMGLM's, which still retain the GLM character with regard to the mean. Overdispersed GLM's (OGLM's) provide an alternative class of models to cope with extra variability in samples. We show how OGLM's may be DP mixed, leading to what we call DPMOGLM's. These models are extremely rich. Moreover, recent computational advances enable them to be fitted straightforwardly. We illustrate this with both simulated and real datasets. We also address the question of choosing between the GLM, OGLM, DPMGLM, and DPMOGLM. Finally, we consider extensions, by DP mixing, of hierarchical or multistage GLM's.
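The abstract does not give the authors' formulation, but the idea of DP mixing a GLM can be sketched generatively: a (truncated) stick-breaking draw from the DP yields mixture weights over component-specific parameters, and each observation's response is generated from a GLM whose linear predictor uses its component's parameter. The sketch below is purely illustrative and assumes a Poisson GLM with log link, atom-specific intercepts, and a shared slope; all names and parameter choices are hypothetical, not taken from the paper.

```python
import numpy as np

def stick_breaking_weights(alpha, n_atoms, rng):
    """Truncated stick-breaking construction of DP mixture weights."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Remaining stick length before each break: 1, (1-b1), (1-b1)(1-b2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

def sample_dpm_poisson_glm(x, alpha=1.0, n_atoms=50, seed=0):
    """Draw responses from a DP mixture of Poisson GLMs (log link).

    Each atom carries its own intercept drawn from a base measure
    (here N(0, 1)); the slope is shared, so the model keeps GLM
    structure within each mixture component. Hypothetical sketch only.
    """
    rng = np.random.default_rng(seed)
    w = stick_breaking_weights(alpha, n_atoms, rng)
    w = w / w.sum()                                   # renormalize after truncation
    intercepts = rng.normal(0.0, 1.0, size=n_atoms)   # atom-specific intercepts
    slope = 0.5                                       # shared coefficient (assumed)
    z = rng.choice(n_atoms, size=len(x), p=w)         # latent component labels
    mu = np.exp(intercepts[z] + slope * np.asarray(x))  # GLM mean via log link
    return rng.poisson(mu)

y = sample_dpm_poisson_glm(np.linspace(0.0, 1.0, 200))
```

Marginally over the mixture, the response density can be multimodal and heavier-tailed than any single Poisson GLM allows, which is the extra flexibility the abstract attributes to DP mixing.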
