Abstract

Background
COVID-19 transformed healthcare and how we think about clinical diagnostics. People increasingly prefer at-home testing options that let them look after their health in their own comfort and privacy. Fast, accurate lateral flow assays (LFAs), also known as rapid diagnostic tests (RDTs), meet that need; however, ensuring high test specificity and accuracy is of utmost importance. Visual readout of RDTs is subjective and time-consuming. With current technological advances, medical devices are smarter and can be used for automated readout. The core of such devices is the algorithms that process the information despite several limitations and unknown variations affecting the input signals. Many such algorithms have been developed and are used for similar applications. Here, a statistical model is described for any optical device designed to automatically read an RDT with one or more test lines.

Methods
The described LFA algorithm is an innovative way to analyze images acquired with a Complementary Metal-Oxide-Semiconductor (CMOS)-based optical system at a low signal-to-noise ratio (SNR), giving a semi-quantitative result. It also accommodates variations in illumination sources with relatively little computation by converting the images to a one-dimensional (1D) signal.

Algorithm workflow:
1. Image pre-processing: Affine transformations such as rotation and translation (depending on the inherent setup) are applied to the acquired images as a pre-processing step. Pattern-matching techniques using Gaussian templates are then applied to determine the window region of the LFA and precisely identify the test and control line regions.
2. Image formatting: The sample region of the cassette is converted from RGB color space to a weighted gray (Gw) channel for digital interpretation of the primary biomarker present. Characterization of the Gw channel enables test peak detection with enhanced resolution, which supports early call-out of results and improved accuracy, thereby reducing the chance of erroneous results. A second-order Butterworth low-pass filter is applied to reduce system noise and enhance the SNR, and the image is converted to a 1D signal by averaging for mathematical readout.
3. Mathematical interpretation: The maximum peak height (max) and minimum peak depth (min) of the Gw channel values in the test line region (t-peak) are extracted from the 1D signal. The mathematical representation of the test line is calculated as:
Test Peak Signal (TPS) = {max(t-peak) - min(t-peak)} / Alpha,
where the normalization factor Alpha = (sum of all Gw channel values over a clear, non-reactive part of the test device) / (number of Gw channel values).

Results
TPS values for various biomarkers were computed from statistically significant numbers of replicates at low, medium, and high concentration levels. Two-sample t-tests showed that the concentration levels are distinctly separable (p < 0.05): p = 0.044 for low versus medium and p = 0.004 for medium versus high.

Conclusions
This algorithm can run on any off-the-shelf RDT coupled with a CMOS-based optical system. The proposed algorithm can differentiate between low, medium, and high concentrations with high confidence. High-throughput semi-quantitative digital readouts are robust, accurate, and preferred for diagnostic test result interpretation.
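
A minimal sketch of the pre-processing step (workflow step 1), assuming OpenCV and NumPy are available: an affine (rotation/translation) correction of the acquired frame, followed by matching a 1D Gaussian template against an intensity profile to locate the test and control line regions. The template width, the use of normalized cross-correlation, and all function names are illustrative assumptions, not details given in the abstract.

```python
import numpy as np
import cv2

def align_image(img, angle_deg=0.0, shift_xy=(0.0, 0.0)):
    """Apply a rotation + translation (affine) correction to the raw CMOS frame."""
    h, w = img.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
    m[:, 2] += shift_xy  # add the translation component
    return cv2.warpAffine(img, m, (w, h))

def gaussian_template(length, sigma):
    """1D Gaussian template approximating the intensity profile of a reactive line."""
    x = np.arange(length) - (length - 1) / 2.0
    return np.exp(-0.5 * (x / sigma) ** 2)

def line_match_score(profile, template):
    """Normalized cross-correlation of a 1D intensity profile with the Gaussian
    template; peaks in the returned array mark candidate line centres."""
    t = (template - template.mean()) / (template.std() + 1e-9)
    p = (profile - profile.mean()) / (profile.std() + 1e-9)
    return np.correlate(p, t, mode="same")
```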
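A sketch of the image-formatting step (workflow step 2), assuming SciPy: weighted-gray (Gw) conversion of the sample window, a second-order Butterworth low-pass filter, and averaging down to a 1D signal. The channel weights shown are the common luminance coefficients and the cutoff value is a placeholder; the abstract does not specify the actual Gw weights or filter cutoff.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def to_weighted_gray(rgb, weights=(0.299, 0.587, 0.114)):
    """Weighted-gray (Gw) conversion of an HxWx3 RGB window (weights assumed)."""
    return rgb.astype(float) @ np.asarray(weights)

def to_1d_signal(gw, cutoff=0.1, order=2):
    """Average the 2D Gw window to a 1D profile along the flow direction, then
    apply a second-order Butterworth low-pass filter to suppress system noise."""
    profile = gw.mean(axis=0)         # collapse rows: one value per position along the strip
    b, a = butter(order, cutoff)      # cutoff as a fraction of the Nyquist frequency (assumed)
    return filtfilt(b, a, profile)    # zero-phase filtering preserves peak positions
```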
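A sketch of the Test Peak Signal calculation (workflow step 3) as defined above. The index ranges for the test-line and blank (non-reactive) regions are placeholders; in practice they come from the template-matching step.

```python
import numpy as np

def test_peak_signal(signal_1d, test_region, blank_region):
    """TPS = (max - min of the Gw profile in the test-line region) / Alpha,
    where Alpha is the mean Gw value over a clear, non-reactive region."""
    t = signal_1d[test_region]              # t-peak region of the 1D signal
    alpha = signal_1d[blank_region].mean()  # normalization factor Alpha
    return (t.max() - t.min()) / alpha

# Example usage with assumed index ranges:
# tps = test_peak_signal(profile, slice(120, 160), slice(10, 50))
```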