Artificial neural networks (ANNs) are powerful empirical approaches capable of modeling databases with a high degree of accuracy. Despite their recognition as universal approximators, many practitioners remain skeptical of their routine use because of their lack of model transparency. To make model predictions more interpretable and address this apparent lack of comprehension, researchers have applied a variety of methodologies to extract the underlying variable relationships within ANNs, such as sensitivity analysis (SA). The theoretical basis of local SA (that predictors are independent and that inputs other than the variable of interest remain “fixed” at predefined values) is challenged in global SA, where, in addition to altering the attribute of interest, the remaining predictors are varied concurrently across their respective ranges. Here, a regression-based global methodology, state-based sensitivity analysis (SBSA), is proposed for measuring the importance of predictor variables on a modeled response within ANNs. SBSA was applied to network models of a synthetic database having a defined structure and exhibiting multicollinearity. SBSA achieved the most accurate portrayal of predictor-response relationships (compared to local SA and Connected Weights Analysis), closely approximating the actual variability of the modeled system. From this, it is anticipated that skepticism concerning the delineation of predictor influences and their uncertainty domains upon a modeled output within ANNs will be curtailed.
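The local-versus-global distinction described above can be sketched in a few lines. The following is a minimal illustration, not the paper's SBSA algorithm: a hypothetical stand-in function plays the role of a trained ANN, local SA is a one-at-a-time central difference with the other input held fixed, and global SA is approximated by sampling all inputs over their ranges and fitting a least-squares slope (a generic regression-based global measure). All names and the toy model are assumptions for illustration.

```python
import random

# Hypothetical stand-in for a trained ANN: the response depends on both
# predictors, including an interaction term that local SA at a single
# fixed point cannot see.
def model(x1, x2):
    return 2.0 * x1 + 0.5 * x2 + 1.5 * x1 * x2

def local_sensitivity(f, baseline, index, delta=1e-4):
    """One-at-a-time (local) SA: perturb one input, hold the rest fixed."""
    lo, hi = list(baseline), list(baseline)
    lo[index] -= delta
    hi[index] += delta
    return (f(*hi) - f(*lo)) / (2.0 * delta)

def global_sensitivity(f, ranges, index, n=5000, seed=0):
    """Sampling-based global SA: vary ALL inputs concurrently across their
    ranges, then fit a simple least-squares slope for the input of interest."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        point = [rng.uniform(a, b) for a, b in ranges]
        xs.append(point[index])
        ys.append(f(*point))
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

ranges = [(0.0, 1.0), (0.0, 1.0)]
# Local SA at the origin reports only the 2.0 main effect of x1,
# because x2 is pinned at 0 and the interaction vanishes there.
print(local_sensitivity(model, (0.0, 0.0), 0))
# Global SA averages over x2's range, so the interaction contributes:
# the expected slope is 2.0 + 1.5 * E[x1*x2 regressed on x1] = 2.75 here.
print(global_sensitivity(model, ranges, 0))
```

The gap between the two numbers is exactly the kind of hidden interaction effect that motivates global approaches such as SBSA: the local estimate is correct only at its baseline point, while the global estimate reflects variability across the full predictor domain.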