Abstract

Objective
We aim to 1) design an evaluation framework to examine the accuracy of automatic privacy auditing tools, 2) apply the evaluation method at a hospital to validate the performance of an auditing tool that uses a machine learning algorithm to automate user access auditing, and 3) recommend further improvements in auditing for the hospital.

Materials and methods
Using the black box method of user acceptance testing, we designed an evaluation framework consisting of appropriate and inappropriate behaviour scenarios to examine privacy auditing tools. The scenarios were designed from the perspectives of clinical and non-clinical hospital staff, incorporating expert opinions from privacy officers and examples from the Information and Privacy Commission (IPC), and were tested using Mackenzie Richmond Hill Hospital's data.

Results
The case study using this evaluation framework found that, on average, 98.09% of the hospital's total accesses were identified as appropriate; the tool was unable to explain the remaining 1.91% of accesses. In addition, a statistically significant (P < 0.05) increasing trend in the proportion of accesses the tool categorized as appropriate was observed. Furthermore, an analysis of the unexplained accesses revealed the contributing factors, including issues related to hospital workflows and data quality (missing information about staff roles and departments).

Conclusion
Given that adoption of these machine learning tools in hospitals is increasing, this research provides an evaluation framework and empirical evidence on the effectiveness of automated privacy auditing and anomaly detection for dynamic hospital workflows.
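
As an illustration only, the sketch below shows one way an increasing trend in the share of accesses categorized as appropriate could be checked over successive audit periods. The abstract does not specify the statistical test, the time granularity, or the data format, so the period counts, the monthly breakdown, and the use of scipy.stats.linregress here are assumptions for demonstration, not the study's method or data.

```python
# Illustrative sketch only: tests for an increasing trend in the proportion of
# accesses an auditing tool labels "appropriate" across successive periods.
# The counts below are made-up placeholders, not values from the study.
from scipy.stats import linregress

# Hypothetical monthly summaries: (accesses labeled appropriate, total accesses)
monthly_counts = [
    (96100, 98500),
    (97050, 99000),
    (97800, 99400),
    (98320, 99700),
    (98900, 100100),
    (99350, 100300),
]

months = list(range(1, len(monthly_counts) + 1))
proportions = [appropriate / total for appropriate, total in monthly_counts]

# Simple linear regression of proportion on time; a positive slope with
# p < 0.05 would indicate a statistically significant increasing trend.
result = linregress(months, proportions)
print(f"slope per month: {result.slope:.5f}")
print(f"p-value: {result.pvalue:.4f}")
print("significant increasing trend"
      if result.slope > 0 and result.pvalue < 0.05
      else "no significant increasing trend")
```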
