Abstract

Nanomaterial-based electronic sensors have demonstrated ultra-low detection limits, down to parts-per-billion (ppb) or parts-per-trillion (ppt) concentrations. However, this extreme sensitivity also makes them susceptible to signal saturation at higher concentrations and restricts their use primarily to low concentrations. Here, we report machine learning techniques for creating a calibration method for carbon nanotube-based field-effect transistor (FET) devices. We started with linear regression, followed by regression splines to capture the non-linearity in the data. Further improvements in model performance were obtained with regression trees. Finally, we lowered the model variance and further boosted performance by introducing a random forest. The resulting performance, measured by R², was estimated to be 0.8260 using the out-of-bag error. The methodology avoids saturation and extends the dynamic range of the nanosensors up to 12 orders of magnitude in analyte concentration. Further investigation of the sensing mechanism includes analysis of feature importance in each of the models we tested. Functionalized nanosensors demonstrate selective detection of Hg²⁺ ions with a detection limit of 10^(−14.36±0.78) M, and maintain calibration at concentrations as high as 1 mM. Applying machine learning techniques to investigate which features of the FET signal correlate most strongly with concentration changes provides valuable insight into the carbon nanotube sensing mechanism and assists in the rational design of future nanosensors.
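The final step of the pipeline described above (a random forest calibration model scored by out-of-bag error, with feature importances used to probe the sensing mechanism) could be sketched roughly as follows. This is an illustrative assumption, not the paper's actual code or data: the synthetic FET-signal features, their names, and all parameter choices are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical illustration of random-forest calibration for a nanosensor.
# The target is log10 analyte concentration; the features mimic FET-signal
# descriptors that vary non-linearly with concentration.
rng = np.random.default_rng(0)
n = 500
log_conc = rng.uniform(-14, -3, n)  # log10 concentration (M), wide dynamic range

features = np.column_stack([
    np.tanh(log_conc / 4) + 0.05 * rng.normal(size=n),  # e.g. on-current shift (assumed)
    log_conc**2 / 50 + 0.05 * rng.normal(size=n),       # e.g. threshold-voltage shift (assumed)
    rng.normal(size=n),                                 # deliberately uninformative feature
])

# oob_score=True makes the forest estimate R^2 from out-of-bag samples,
# i.e. trees vote only on points left out of their bootstrap sample.
model = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
model.fit(features, log_conc)

print(f"Out-of-bag R^2: {model.oob_score_:.3f}")
print("Feature importances:", model.feature_importances_.round(3))
```

The out-of-bag R² plays the role of the 0.8260 estimate reported above, and the importance vector identifies which signal features carry concentration information; in this toy setup the uninformative third feature should receive the smallest importance.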
