Abstract
This study examines the factors that contribute to the severity of single-vehicle crashes, with a focus on improving both computational speed and model robustness. Using a mixed logit model with heterogeneity in means and variances, we provide a comprehensive account of the complexities underlying crash severity. The analysis draws on 39,788 crash records from the UK's STATS19 database, which include variables such as road type, speed limit, and lighting conditions. A comparative evaluation of estimation methods (pseudo-random, Halton, and scrambled and randomized Halton sequences) shows that the latter perform best: this estimation approach achieves the best goodness-of-fit, as measured by ρ², and the lowest Akaike Information Criterion (AIC), while also reducing run time and memory usage. This efficiency enables more thorough and credible analyses, making the model a robust tool for understanding crash severity. Policymakers and researchers can draw on these findings to craft data-driven interventions aimed at reducing road crash severity.
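The comparison of draw types described above can be illustrated with a minimal sketch. This is not the authors' code; it simply generates pseudo-random, plain Halton, and scrambled (randomized) Halton draws of the kind used in simulated maximum likelihood estimation of mixed logit models, and compares their uniformity via star discrepancy (lower values indicate more even coverage of the unit cube, which typically reduces simulation noise for a fixed number of draws). The draw count and dimension are illustrative choices, not values from the paper.

```python
# Sketch: comparing draw types used in simulated maximum likelihood.
# Assumes SciPy >= 1.7 for the scipy.stats.qmc module.
import numpy as np
from scipy.stats import qmc

n_draws, n_dims = 500, 2  # illustrative: draws per observation, random coefficients

rng = np.random.default_rng(42)
pseudo = rng.random((n_draws, n_dims))                        # pseudo-random draws
halton = qmc.Halton(d=n_dims, scramble=False).random(n_draws) # plain Halton sequence
scrambled = qmc.Halton(d=n_dims, scramble=True,               # scrambled/randomized
                       seed=42).random(n_draws)               # Halton sequence

# Star discrepancy: a standard measure of how uniformly points fill [0, 1]^d.
for name, sample in [("pseudo-random", pseudo),
                     ("Halton", halton),
                     ("scrambled Halton", scrambled)]:
    print(f"{name:18s} discrepancy = {qmc.discrepancy(sample):.6f}")
```

Both Halton variants should show substantially lower discrepancy than the pseudo-random draws, which is one reason quasi-random sequences allow comparable estimation accuracy with fewer draws.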
Journal: Accident Analysis & Prevention