Abstract

This workshop was organized by INCF in Stockholm, Sweden, 18-19 June 2012. To ensure that research results can be trusted, it is essential to use validated methods. This holds both for the analysis and the sharing of data, and has become particularly pertinent in the context of the large-scale concerted brain projects presently emerging in both Europe and the US. This workshop brought together scientists concerned with the validation of data analysis methods from different perspectives.

The workshop was motivated by a collaboration among members of the Norwegian, Polish and German Nodes on establishing a community site for the evaluation of spike-sorting methods (spike.g-node.org). However, the need for validating data analysis methods is not restricted to spike sorting; it pertains to all physiological and anatomical measurement methods used in neuroscience. Besides spike sorting of extracellular recordings, the workshop addressed the extraction of spikes from two-photon calcium imaging, methods for the analysis of local field potentials, and the statistical analysis of spike trains. In addition, the participants discussed the management and documentation of analysis workflows, which are crucial for validating complex multi-stage analyses and are thus an essential element of reproducible science.

One important measurement technique, functional magnetic resonance imaging (fMRI), was not considered extensively in the workshop, but the participants understood that the issues, problems and needs identified for the other fields, as well as the conclusions and recommendations, apply equally to fMRI. Typically, efforts to define validation procedures for analysis methods, including the collection of benchmark data, start in single labs. However, they should be made available for use by the wider community.
To serve this purpose, benchmarks must meet the following requirements:

• Broadly accepted by a wide range of laboratories
• Available to these laboratories
• Easily evaluated

The workshop participants discussed what is necessary to bring a validation effort from a prototype-like state, typically achieved in the initiating lab, to a community resource. This process must involve the initiating scientists, but also the community, and ideally an organization that has built up the expertise to support it. An example of this scheme is the spike-sorting validation project, which has been enabled by a close collaboration between the scientists involved and the German INCF Node. The lesson learned from this example is that such a task should be taken on at the scale of an INCF program, with expertise and support built up at the secretariat.

In discussing the different examples, the workshop made clear that methods validation is at different stages for the various measurement techniques. Specific recommendations therefore differed, but overall a clear picture emerged: there is an unequivocal need for benchmarking activities. The participants agreed on the following key recommendations for supporting the various method validation initiatives:

1. Ensure the development and maintenance of the website for validating spike-sorting algorithms for electrical recordings, hosted by the G-Node (spike.g-node.org).
2. Use the same technical resources to develop, host and maintain a corresponding website for validating spike-detection algorithms for calcium imaging data.
3. Develop an extensible framework allowing for the validation of methods for analyzing other types of data.
4. Involve the community in developing benchmarks and other means of validating analysis methods.
5. Initiate and support training activities to educate users in methods validation.
6. Gather experts and users to discuss workflow standardization, and start activities to support reproducibility in data analysis.
