Abstract

Objective. We present a framework to objectively test and compare stimulation artefact removal techniques in the context of neural spike sorting. Approach. To this end, we used realistic hybrid ground-truth spiking data with superimposed artefacts from in vivo recordings. We used the framework to evaluate and compare several techniques: blanking, template subtraction by averaging, linear regression, and a multi-channel Wiener filter (MWF). Main results. Our study demonstrates that blanking and template subtraction result in poorer spike sorting performance than linear regression and MWF, while the latter two perform similarly. Finally, to validate the conclusions drawn from the hybrid evaluation framework, we also performed a qualitative analysis on in vivo recordings without artificial manipulations. Significance. Our framework allows direct quantification of the impact of the residual artefact on spike sorting accuracy, thereby allowing for a more objective and more relevant comparison than indirect signal quality metrics estimated from the signal statistics. Furthermore, the availability of a ground truth in the form of single-unit spiking activity also facilitates a better estimation of such signal quality metrics.
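Two of the techniques compared above admit a compact illustration. The sketch below is not the authors' implementation; it is a minimal, hypothetical single-channel example (synthetic data, assumed sampling rate and artefact shape) showing blanking, which zeroes a fixed window after each stimulation trigger, and template subtraction by averaging, which subtracts the trigger-aligned mean artefact epoch:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 30000                       # assumed sampling rate (Hz)
n = fs                           # 1 s of single-channel data
background = rng.normal(0, 1, n) # stand-in for neural background activity

# Stereotyped stimulation artefact repeated at known trigger samples
artefact = 50.0 * np.exp(-np.arange(60) / 10.0)  # 2 ms decaying transient
stim_times = np.arange(1000, n - 100, 3000)      # assumed trigger positions
recorded = background.copy()
for t in stim_times:
    recorded[t:t + artefact.size] += artefact

# --- Blanking: zero out a fixed window after each stimulation trigger ---
blanked = recorded.copy()
for t in stim_times:
    blanked[t:t + artefact.size] = 0.0

# --- Template subtraction: average trigger-aligned epochs, subtract ---
epochs = np.stack([recorded[t:t + artefact.size] for t in stim_times])
template = epochs.mean(axis=0)   # estimated artefact template
cleaned = recorded.copy()
for t in stim_times:
    cleaned[t:t + artefact.size] -= template

def window_rms(x):
    """RMS of the signal restricted to the stimulation windows."""
    return np.sqrt(np.mean(np.concatenate(
        [x[t:t + artefact.size] ** 2 for t in stim_times])))

print(window_rms(recorded), window_rms(cleaned))
```

Note the trade-off the paper quantifies: blanking leaves zero residual artefact but destroys any spikes inside the blanked windows, whereas template subtraction preserves the underlying signal but leaves a residual wherever the artefact varies from trial to trial.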
