A multilevel logistic regression model whose predictor variables are highly correlated is susceptible to multi-collinearity, which undermines the robustness and interpretability of multilevel non-linear models. In such models the effects of multi-collinearity can be amplified, distorting parameter estimates and inflating their standard errors; the resulting increase in the variances of the estimates leads to inaccurate inferences about the relationships between the response and the explanatory variables. The primary objective of this study is to investigate the impact of multi-collinearity on multilevel non-linear models. Specifically, the research assesses whether the number of independent variables influences the Multilevel Variance Inflation Factor, explores how changing the degree of correlation at one level affects multi-collinearity in a multilevel non-linear model, and determines how multi-collinearity affects the standard errors of the model's parameters. In a 2-level logistic regression, a binary variable served as the dependent variable, while predetermined standardized variables served as regressors. The Monte Carlo analysis combined three correlation strengths (0.2, 0.5, and 0.9) with three sample sizes (500, 100, and 30), and the Multilevel Variance Inflation Factor was employed to diagnose multi-collinearity. The results showed that, in the multilevel logistic regression model, multi-collinearity decreased as the sample size increased, and that the influence of multi-collinearity on standard errors in a multilevel non-linear model was particularly pronounced. Increasing the sample size therefore remains an effective strategy for mitigating multi-collinearity errors in a multilevel non-linear model.
This approach is particularly important because logistic regression relies on maximum likelihood estimation rather than the ordinary least squares (OLS) estimation used in linear regression.
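As a minimal sketch of the diagnostic idea behind the study, the snippet below simulates equicorrelated predictors at the three correlation strengths used here (0.2, 0.5, 0.9) and computes an ordinary (single-level) Variance Inflation Factor for each predictor; the Multilevel Variance Inflation Factor used in the paper is a multilevel extension of this quantity, and the `vif` helper, the three-predictor setup, and the sample size of 500 are illustrative assumptions, not the study's actual design.

```python
import numpy as np

def vif(X):
    """Single-level VIF for each column of X: VIF_j = 1 / (1 - R_j^2),
    where R_j^2 comes from regressing column j on the remaining columns."""
    n, p = X.shape
    vifs = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        # OLS fit of x_j on the other predictors (with an intercept column)
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        vifs.append(1.0 / (1.0 - r2))
    return np.array(vifs)

rng = np.random.default_rng(0)
n = 500  # one of the sample sizes considered in the study
for rho in (0.2, 0.5, 0.9):
    # Equicorrelated covariance matrix for three illustrative predictors
    cov = np.full((3, 3), rho)
    np.fill_diagonal(cov, 1.0)
    X = rng.multivariate_normal(np.zeros(3), cov, size=n)
    print(f"rho = {rho}: VIFs = {vif(X).round(2)}")
```

With equicorrelation, the VIFs climb sharply as the correlation approaches 0.9, which is the regime in which the standard errors of a multilevel non-linear model are most distorted.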