Abstract

As a generalization of the classical linear factor model, generalized latent factor models are useful for analysing multivariate data of different types, including binary choices and counts. This paper proposes an information criterion to determine the number of factors in generalized latent factor models. The consistency of the proposed information criterion is established under a high-dimensional setting, where both the sample size and the number of manifest variables grow to infinity, and data may have many missing values. An error bound is established for the parameter estimates, which plays an important role in establishing the consistency of the proposed information criterion. This error bound improves several existing results and may be of independent theoretical interest. We evaluate the proposed method by a simulation study and an application to Eysenck’s personality questionnaire.
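For orientation, the model class can be written in a generic exponential-family form; the notation below (theta_i for factor scores, a_j for loadings, d_j for intercepts) is illustrative and may differ from the paper's exact specification. Conditional on respondent i's latent factor vector, each manifest variable follows an exponential-family distribution whose natural parameter is linear in the factors:

```latex
% Illustrative exponential-family form of a generalized latent factor model
% (notation is ours; the paper's exact specification may differ).
Y_{ij} \mid \theta_i \;\sim\; \mathrm{ExpFam}\bigl(m_{ij}\bigr),
\qquad
m_{ij} = d_j + a_j^{\top}\theta_i,
\qquad
\theta_i \in \mathbb{R}^{K}.
```

Binary responses (logistic link) and counts (log link) are the special cases mentioned above; the number of factors K is the quantity the proposed criterion selects.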

Highlights

  • Factor analysis is a popular method in the social and behavioural sciences, including psychology, economics and marketing (Bartholomew et al., 2011)

  • A joint maximum likelihood estimator, proposed by Chen et al. (2019, 2020), is easy to compute and statistically optimal in the minimax sense when both the sample size and the number of manifest variables grow to infinity (a hypothetical selection sketch based on such an estimator follows these highlights)

  • Under a very general setting, we prove the consistency of the proposed information criterion when both the sample size and the number of manifest variables grow to infinity
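To make the selection procedure concrete, below is a minimal, hypothetical sketch for binary data with missing entries, assuming a logistic latent factor model fitted by joint maximum likelihood and scored with a generic BIC-style penalty. The function names, the gradient-ascent fitting routine and the penalty term are illustrative assumptions, not the paper's exact criterion, whose penalty is derived to guarantee selection consistency as both the sample size and the number of manifest variables grow.

```python
# Hypothetical sketch: joint-likelihood-based selection of the number of factors
# for binary responses with missing entries. The fitting routine and the
# BIC-style penalty are illustrative assumptions, not the paper's exact method.
import numpy as np

def fit_jml_logistic(Y, mask, K, n_iter=300, lr=0.1, seed=0):
    """Joint maximum likelihood for a logistic latent factor model (sketch).

    Y    : (n, p) array of 0/1 responses; entries with mask == 0 are ignored.
    mask : (n, p) array of 0/1 observation indicators (handles missing data).
    K    : number of factors to fit.
    Returns the maximized joint log-likelihood and a crude parameter count.
    """
    rng = np.random.default_rng(seed)
    n, p = Y.shape
    Y0 = np.where(mask > 0, Y, 0.0)          # fill missing entries; they are masked out below
    F = 0.1 * rng.standard_normal((n, K))    # factor scores, one row per respondent
    A = 0.1 * rng.standard_normal((p, K))    # loadings, one row per manifest variable
    d = np.zeros(p)                          # intercepts

    for _ in range(n_iter):
        M = F @ A.T + d                      # linear predictors, shape (n, p)
        P = 1.0 / (1.0 + np.exp(-M))         # Bernoulli probabilities
        R = mask * (Y0 - P)                  # gradient of the log-likelihood w.r.t. M
        grad_F, grad_A, grad_d = R @ A / p, R.T @ F / n, R.mean(axis=0)
        F, A, d = F + lr * grad_F, A + lr * grad_A, d + lr * grad_d   # joint update

    M = F @ A.T + d
    loglik = np.sum(mask * (Y0 * M - np.logaddexp(0.0, M)))  # observed entries only
    n_params = K * (n + p) + p               # scores + loadings + intercepts (ignores rotation)
    return loglik, n_params

def select_num_factors(Y, mask, K_max=6):
    """Return the K minimizing a BIC-style joint-likelihood information criterion."""
    n, p = Y.shape
    crit = {}
    for K in range(1, K_max + 1):
        loglik, n_params = fit_jml_logistic(Y, mask, K)
        crit[K] = -2.0 * loglik + n_params * np.log(n * p)   # illustrative penalty rate
    return min(crit, key=crit.get), crit
```

Calling select_num_factors(Y, mask) returns the candidate K with the smallest criterion value together with all criterion values; in the paper, the penalty is chosen so that this kind of minimizer recovers the true number of factors with probability tending to one.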


Some key words: Generalized latent factor model; High-dimensional data; Information criteria; Joint maximum likelihood estimator; Selection consistency

Outline

  • Introduction
  • Joint-likelihood-based information criterion
  • Proposed information criterion
  • Theoretical results
  • Simulation
  • Application to Eysenck’s personality questionnaire
  • Further discussion