How collaborative projects that involve complicated electrophysiological data sets profit from workflow design

Michael Denker1*, Andrew Davison2, Markus Diesmann1 and Sonja Gruen1

1 Research Center Juelich, Institute of Neuroscience and Medicine (INM-6), Germany
2 UNIC, CNRS, France

Abstract

Recent years have seen a rapid increase in the complexity of electrophysiology experiments. This complexity arises firstly from the interest in simultaneously analyzing the activity recorded from large numbers of channels in order to investigate the role of concerted neural activity in brain function. These efforts have led to advances in data analysis methods [1] that exploit the parallel properties of such data sets [2]. A second source of complexity is the sophistication of stimulus protocols. To take the visual system as an example, typical visual stimulation has progressed from simple moving bars or drifting gratings to natural movies, Gabor noise, and apparent motion stimuli. In the somatosensory system, new technology now allows the entire rodent whisker array to be stimulated in essentially arbitrary patterns. However, an often neglected aspect of these technological advances is that both massively parallel data streams and highly complex stimuli place new demands on handling their complexity during all stages of the project [3]: from the initial recording, throughout the analysis process, to the final publication. Three factors contribute to these new demands. First, the sheer quantity of data complicates the organization of data sources, and the resulting automatization of analysis steps renders the validation of interim and final results difficult. Second, modern analysis methods often require intricate, multi-layered implementations, leading to sophisticated analysis toolchains [4]. Third, a growing number of projects need to be carried out in teams, within a laboratory or in collaborative efforts, requiring transparent workflows that guarantee smooth interaction. Taken together, this increase in complexity calls for a reevaluation of the traditional ad-hoc approaches to such projects. Can we derive general guiding principles that may be adopted for the design of efficient workflows? How could these improve our confidence in handling the data by providing better cross-validation of findings, reliably managing provenance data, and enabling tighter collaborative research, while at the same time leaving the scientist with the flexibility required for creative research? Although several projects are devoted to finding solutions for specific aspects of workflow design (e.g., [5-7]), on a more general level there is a lack of thorough discussion of what goals are expected from a workflow, and which of these can realistically be addressed. Here, we summarize feedback received from experimenters and theoreticians that pinpoints the fundamental problems typically encountered in the analysis of high-dimensional electrophysiological data. Illustrated by examples from our own experience, we further show obstacles that prevent us from harmonizing workflows to common guidelines. For selected issues we draw parallels to other communities that are faced with similar problems (e.g., neuronal network modeling [8-9]; neuroimaging [10]). Lastly, we propose how existing concepts and software [9,11] could assist in practically implementing workflows that are tailored to the needs of a specific project, yet guarantee high standards by adhering to general guidelines of accepted best practice.
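
As a minimal sketch of what the provenance handling discussed above could look like in practice, the following Python example records which inputs, parameters, and code version produced a given result file. The file names, parameter names, and record layout are hypothetical illustrations and are not taken from any of the cited tools.

"""
Minimal sketch of per-step provenance capture for an analysis pipeline.
All file names, parameters, and the record layout are hypothetical and
only illustrate the idea of attaching metadata to every derived result.
"""
import hashlib
import json
import subprocess
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path):
    """Return the SHA-256 checksum of a file, so inputs can be verified later."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def current_code_version():
    """Best-effort lookup of the analysis code version (git commit hash)."""
    try:
        return subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()
    except (OSError, subprocess.CalledProcessError):
        return "unknown"


def record_provenance(step_name, inputs, parameters, output):
    """Write a JSON sidecar next to `output` describing how it was produced."""
    record = {
        "step": step_name,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "code_version": current_code_version(),
        "parameters": parameters,
        "inputs": {str(p): sha256_of(p) for p in inputs},
        "output": str(output),
    }
    sidecar = Path(str(output) + ".provenance.json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar


if __name__ == "__main__":
    # Hypothetical usage: create a dummy input so the sketch runs end to end.
    dummy = Path("session01_spikes.dat")
    dummy.write_bytes(b"placeholder for recorded spike data")
    record_provenance(
        step_name="pairwise_correlation",
        inputs=[dummy],
        parameters={"bin_size_ms": 5, "window_ms": 200},
        output=Path("session01_correlations.dat"),
    )

A sidecar of this kind lets interim and final results be traced back to the exact data, parameters, and code that produced them, which is one way to address the validation problem raised above for automated analysis steps.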
Acknowledgements: This project was supported by the European Union (FP7-ICT-2009-6, BrainScales).
