Abstract
The use of data and statistics along with computational systems heralded the beginning of a quantitative revolution in Geography. The use of simulation models (Cellular Automata and Agent‐Based Models) followed in the late 1990s, with the ontology and epistemology of complexity theory and modelling being defined a little less than two decades ago. We are, however, entering a new era in which sensors regularly collect and update large amounts of spatio‐temporal data. We define this ‘Big Data’ as geolocated data collected in sufficiently high volume (exceeding the storage capacities of the largest personal hard drives currently available), updated at least daily, and drawn from a variety of sources in different formats, often without recourse to verification of its accuracy. We then identify the exponential growth in the use of complexity simulation models over the past two decades via an extensive literature review (broken down by application area), but also notice a recent slowdown. Further, we note a gap in the utilisation of Big Data by modellers to calibrate and validate their models, which we attribute to data availability issues. We contend that Big Data can significantly boost simulation modelling, if certain constraints and issues are managed properly.