Following recent advances in biotechnology, and particularly in high-throughput sequencing, environmental DNA and RNA (eDNA and eRNA) are increasingly used for biodiversity assessment and monitoring of complex ecosystems such as lakes, streams and coastal waters. Growing interest in, and the increasing affordability of, eDNA/eRNA-based tools have led to the emergence of numerous protocols for capturing genetic material from a variety of biological matrices (e.g. soil, water, feces, biofilms). The variability of eDNA and eRNA material (ranging from free-floating molecules to cellular complexes to intact organisms) and its patchy distribution in the environment can substantially affect how effectively it is captured. It is therefore extremely challenging for stakeholders to standardize or choose optimal protocols, impeding the incorporation of eDNA/eRNA methods into routine monitoring and surveillance programs. Although there is still no consensus on a standardized workflow for processing eDNA/eRNA samples, the common practice for water samples is to concentrate nucleic acids (NA) by filtration through membranes with pore sizes ranging from 0.22 to 20 μm. Although the smallest pore sizes are assumed to be the most efficient for capturing NA from a wide range of material (including sub-cellular particles), there is a trade-off between detecting the meaningful molecular signal and time and cost efficiency. Finer-pore membranes are more prone to clogging and prevent the processing of larger volumes of water, thus reducing the chances of detecting rare biodiversity. Moreover, large sample volumes may carry increased concentrations of inhibitory substances (e.g. humic compounds) that suppress the target molecular signal. Comparative studies involving formal cost-efficiency assessments are lacking, restricting informed decisions on the optimal sampling approach for a particular research or surveillance question. Identifying the optimal balance between time effort and signal detection efficiency is particularly crucial for targeted surveillance, e.g. the detection and monitoring of nuisance organisms, endangered and indicator taxa, or other species of particular economic or cultural importance. Although larger-pore filters have previously been shown to be as efficient as finer filters for species detection from waterborne eDNA, more data are needed on the amount and type of NA material captured and lost.

Here, we present a comparative study using an easily cultured microalgal species (Alexandrium pacificum) as a proxy to test the effectiveness of different filter membranes (cellulose acetate membranes of 5 μm, 1.2 μm or 0.45 μm pore size, and a positively charged nylon membrane of 1.2 μm pore size) in the context of targeted species detection. We performed an efficiency analysis to identify the method that delivered the optimal use of resources. A tiered experimental design was applied to: i) assess the impact of the membranes on capturing various fractions of target eDNA/eRNA (intact cells, partially lysed cells, naked NAs) spiked into pre-filtered and ambient environmental seawater, and ii) establish the efficiency and utility of the different membranes in terms of optimizing performance, i.e. maximizing output (capture of target eDNA and eRNA) while minimizing time and cost inputs. The results showed no statistically significant differences among membranes in capturing the DNA signal from the intact and partially lysed cell treatments. However, the positively charged nylon membrane was more efficient at capturing naked NAs, as well as RNA from partially lysed cells. In terms of time effort and volume processed, the larger-pore cellulose acetate membranes showed higher efficacy, whereas the positively charged nylon membrane consistently performed better at capturing the RNA signal across treatments.
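The efficiency analysis is described here only qualitatively (maximized eDNA/eRNA capture balanced against minimized time and cost). As an illustrative sketch only, and not the metric used in the study, one plausible way to formalize such a trade-off for a membrane m is:

\[
E_m = \frac{C_m}{t_m + k \cdot p_m}
\]

where \(C_m\) is the target NA signal recovered (e.g. qPCR or RT-qPCR copies), \(t_m\) the filtration time per sample, \(p_m\) the consumable cost per sample, and \(k\) a weighting factor converting cost into time-equivalent units; all symbols are illustrative assumptions. Under such a metric, a larger-pore cellulose acetate membrane could rank highest for DNA capture despite a similar per-sample yield, simply because of its shorter filtration time, consistent with the trade-off described above.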