Background
Dimension reduction methods do not always reduce their underlying indicators to a single composite score. Furthermore, such methods are usually based on optimality criteria that require discarding some information. We suggest, under some conditions, using the joint probability density function (joint pdf, or JPD) of the p-dimensional random variable (the p indicators) as an index or composite score. We prove that this index is more informative than any alternative composite score. In two examples, we compare the JPD index with alternatives constructed from traditional methods.

Methods
We develop a probabilistic unsupervised dimension reduction method based on the probability density of multivariate data. We show that the conditional distribution of the variables given the JPD is uniform, implying that the JPD is the most informative scalar summary under the most common notions of information. We also show that, under some widely plausible conditions, the JPD can be used as an index. To use the JPD as an index, in addition to having a plausible interpretation, all the random variables should move in approximately the same direction (unidirectionality), and in the same direction as the density values (codirectionality). We applied these ideas to two data sets: first, the 7 Brief Pain Inventory Interference scale (BPI-I) items obtained from 8,889 US Veterans with chronic pain and, second, a novel measure based on administrative data for 912 US Veterans. To estimate the JPD in both examples, among the available JPD estimation methods, we used its conditional specifications: we identified a well-fitting parametric model for each factored conditional (regression) specification and estimated its parameters by maximizing the corresponding likelihood. Because the conditional specification is not unique, the average of all estimated conditional specifications was used as the final estimate. Since a prevalent use of indices is ranking, we used measures of monotone dependence [e.g., Spearman's rank correlation (rho)] to assess the strength of unidirectionality and codirectionality. Finally, we cross-validated the JPD score against variance-covariance-based scores (factor scores in unidimensional models) and the "person's parameter" estimates of the (Generalized) Partial Credit and Graded Response IRT models. We used Pearson divergence as a measure of information and Shannon entropy to compare uncertainties (informativeness) across these alternative scores.

Results
An unsupervised dimension reduction method was developed based on the joint probability density (JPD) of the multidimensional data. The JPD, under regularity conditions, may be used as an index. For the well-established Brief Pain Inventory Interference scale (BPI-I; the 7-item short form) and for a new mental health severity index (MoPSI) with 6 indicators, we estimated the JPD scoring. Assuming unidimensionality, we compared factor scores and person scores from the Partial Credit, Generalized Partial Credit, and Graded Response models with JPD scoring. As expected, all scores' rankings in both examples were monotonically dependent, with varying strength. Shannon entropy was smallest for JPD scores, and the Pearson divergence of the estimated densities of the different indices against the uniform distribution was largest for JPD scoring.

Conclusions
An unsupervised probabilistic dimension reduction is possible. When appropriate, the joint probability density function can be used as the most informative index.
Model specification, estimation, and the steps needed to implement the scoring were demonstrated. As expected, when the assumptions required by factor analysis and IRT models are satisfied, JPD scoring agrees with these established scores. When these assumptions are violated, however, JPD scores preserve all the information in the indicators with minimal assumptions.
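A minimal illustrative sketch of the JPD-as-index idea follows. It is not the paper's estimator: the multivariate-normal density fit stands in for the conditional-specification estimate of the JPD, the first principal-component score stands in for a factor or IRT score, the data are synthetic, and the helper hist_entropy is hypothetical. It only shows how subjects could be scored by their estimated joint density and how rankings and entropies of alternative scores could be compared, as described in the Methods.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for p = 3 positively correlated indicator variables.
X = rng.multivariate_normal(mean=np.zeros(3),
                            cov=[[1.0, 0.6, 0.5],
                                 [0.6, 1.0, 0.4],
                                 [0.5, 0.4, 1.0]],
                            size=500)

# JPD index: each subject's score is the estimated joint density at their indicator profile.
# (Here a fitted multivariate normal; the paper instead averages conditional specifications.)
mu, Sigma = X.mean(axis=0), np.cov(X, rowvar=False)
jpd_score = stats.multivariate_normal(mean=mu, cov=Sigma).pdf(X)

# A covariance-based alternative score: the first principal component.
evals, evecs = np.linalg.eigh(Sigma)
pc1_score = (X - mu) @ evecs[:, -1]

# Monotone dependence between the two rankings (Spearman's rho).
rho, _ = stats.spearmanr(jpd_score, pc1_score)

# Shannon entropy of each score's discretized distribution; lower entropy means less uncertainty.
def hist_entropy(scores, bins=20):
    counts, _ = np.histogram(scores, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

print(f"Spearman rho = {rho:.2f}, "
      f"H(JPD) = {hist_entropy(jpd_score):.2f}, H(PC1) = {hist_entropy(pc1_score):.2f}")

In this toy setting the two scores need not rank subjects identically; the abstract's comparisons (monotone dependence, Shannon entropy, Pearson divergence against the uniform distribution) are what quantify how the JPD index relates to, and differs from, the variance-covariance and IRT-based alternatives.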