Abstract

One of the main activities in data‐intensive science is data analysis. Although there are many popular technologies that can assist scientists in various isolated aspects of data analysis, supporting analysis processes in holistic ways that promote system interoperability, integration and automation, as well as scientific reproducibility and efficient data handling, presents many challenges. A common solution to address these challenges is to find efficient ways of integrating various existing technologies together to meet the analysis needs of scientists (which is similar to the idea behind science gateways). We believe that this solution is essentially an exercise in software design, and in many situations these challenges should be tackled from a software design perspective. Consequently, this paper reviews different architectural design approaches that can be used to address these challenges and proposes a service‐oriented framework called the Ad Hoc Data Grid Environment, which consists of an architectural pattern and its associated operational guidelines. The guidelines prescribe a number of activities based on an iterative decomposition approach to produce and evolve software architectures according to constantly changing user needs. The framework is demonstrated on a case study involving analysis processes required for conducting financial event studies. Copyright © 2014 John Wiley & Sons, Ltd.
