Abstract
In controlled experiments, achieving covariate balance across all groups is crucial: it ensures that the estimated treatment effects are not confounded by the covariates. This study proposes a mixed-integer nonlinear programming model to address the covariate balancing problem. Specifically, we introduce a new covariate imbalance measure: the maximum discrepancy in both the first and second central moments between any two groups. The second central moment captures the correlation structure of the covariates, which is crucial for partitioning high-dimensional samples. A mixed-integer nonlinear programming model is constructed to minimize the proposed measure and obtain the optimal partition. The nonlinear model is then linearized to accelerate the optimization. We conduct computational experiments on simulated datasets, including one-, two-, and three-dimensional Gaussian samples, and on a real clinical trial dataset. Compared to the conventional discrepancy-based method, our method reduces the maximum discrepancy of the partitioning results by 54.81% on the two-dimensional simulated Gaussian samples and by 40.6% on the real clinical trial dataset. These results demonstrate the superiority of the proposed model in partitioning high-dimensional samples with correlated covariates.
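To make the measure concrete, the minimal sketch below computes a max-discrepancy imbalance over all group pairs from sample means (first moments) and sample covariances (second central moments). The max-norm aggregation and the equal weighting of the two moment terms are assumptions for illustration only; the paper's exact formulation, objective, and MINLP constraints are not reproduced here.

```python
import numpy as np
from itertools import combinations

def max_pairwise_imbalance(groups):
    """Illustrative imbalance measure: the largest discrepancy in the
    first moment (mean) and second central moment (covariance) between
    any two groups. The max-norm aggregation is an assumption for
    illustration; the paper's exact formulation may differ."""
    stats = []
    for X in groups:                           # X: (n_k, d) samples in group k
        mean = X.mean(axis=0)                  # first moment
        cov = np.atleast_2d(np.cov(X, rowvar=False))  # second central moment
        stats.append((mean, cov))
    worst = 0.0
    for (m1, c1), (m2, c2) in combinations(stats, 2):
        disc = max(np.abs(m1 - m2).max(), np.abs(c1 - c2).max())
        worst = max(worst, disc)
    return worst

# Example: a random 3-way partition of correlated 2-D Gaussian samples
rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=90)
groups = np.array_split(rng.permutation(X), 3)
print(max_pairwise_imbalance(groups))
```

An exact method would minimize this quantity over all assignments of samples to groups via the MINLP (or its linearization); the random partition above merely evaluates the measure for a given assignment.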