Abstract
Uroflowmetry (UFM) is a non-invasive clinical method for studying the urodynamics of the lower urinary tract and can be used for screening. Quantitative indicators within the age norm do not always mean the absence of voiding disorders, so great importance is attached to the graphic type of the curve when interpreting uroflowgrams. Previously proposed methods of automatic uroflowgram classification, including those based on neural networks, recognized only a limited number of curve types and had insufficient accuracy. The aim of the study is to improve computer methods for processing UFM results using neural networks and to create a method that enables screening studies of urination and distinguishes up to nine types of uroflowgrams. To develop a system for recognizing UFM data based on a neural network classifier, a set of 7843 UFM results was used. The data were classified into 9 types based on the scientific literature and many years of our own experience in performing UFM. The UFM results were randomly divided into training and test samples in a 70% to 30% ratio; the system was tested on 2352 uroflowgrams. To ensure that the results were independent of the partitioning of the dataset, we used sequential random sampling validation. The age of the patients ranged from 18 to 90 years. Uroflowmetry was performed with a Potok-K uroflowmeter (developed by O.Ye. Kvyatkovsky). After testing several neural network variants, we chose a five-layer Fully Convolutional Network (FCN) architecture and made improvements to its original design. In addition to the quantitative UFM parameters, the sex and age of the patient were taken into account, and the percentiles of the nomograms of the maximum and average volumetric urine flow rate were calculated. A distinctive feature is that, besides the quantitative UFM parameters, the entire curve of the volumetric urine flow rate during voiding is fed to the input of the neural network. In the course of improving the classification system, the proportion of correct answers increased from 82.9% in the initial version to 93.4% in the final one. The method of automatic classification into 9 types of uroflowgrams determines normal urination with high accuracy, 96.3%. Among the pathological types, the accuracy was 92.8% and 96.4% for intermittent and obstructive-intermittent urination, which most often indicate detrusor-sphincter dyssynergia; 93.3% for rapid urination, which characterizes overactive bladder syndrome; 90.2% and 91.3% for obstructive and obstructive-interrupted urination, which mainly indicate infravesical obstruction; and 92.3% and 80.8% for interrupted urination and urination with a high start. An advantage for screening is that the computer program identifies initial voiding disorders: the "inactive flow" (pre-obstructive urination) uroflowgram type is determined with an accuracy of 92.3%. The results have been tested on a large number of uroflowgrams and are sufficient for practical use, making it possible to issue automatic conclusions during mass screening uroflowmetric studies of lower urinary tract urodynamics. The computer program using neural networks proposed by O.Ye. Kvyatkovsky saves time in the diagnostic process and makes the assessment of uroflowgrams more reliable.
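The abstract states that the final classifier is a five-layer FCN whose input combines the full flow-rate curve with the quantitative UFM parameters, patient sex and age, and the nomogram percentiles of the maximum and average flow rates, but it does not specify layer sizes or a software framework. The following PyTorch sketch is only a minimal, hypothetical illustration of such a hybrid-input FCN with a 9-class output; the class name UroflowFCN, all layer dimensions, and the number of tabular features are assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class UroflowFCN(nn.Module):
    """Hypothetical sketch of an FCN-style uroflowgram classifier.

    The flow-rate curve is processed by 1D convolutional blocks and pooled
    globally; tabular features (sex, age, nomogram percentiles of the maximum
    and average flow rate, other quantitative UFM parameters) are concatenated
    before the final 9-class output layer. All sizes are illustrative.
    """

    def __init__(self, n_tabular: int = 6, n_classes: int = 9):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=8, padding="same"),
            nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding="same"),
            nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=3, padding="same"),
            nn.BatchNorm1d(128), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool1d(1)        # global average pooling
        self.head = nn.Linear(128 + n_tabular, n_classes)

    def forward(self, curve: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        # curve: (batch, 1, time_steps) sampled volumetric flow rate
        # tabular: (batch, n_tabular) sex, age, percentiles, etc.
        x = self.pool(self.conv(curve)).squeeze(-1)  # (batch, 128)
        x = torch.cat([x, tabular], dim=1)
        return self.head(x)                          # class logits

# Example forward pass on a random batch of 4 recordings
model = UroflowFCN()
logits = model(torch.randn(4, 1, 300), torch.randn(4, 6))
print(logits.shape)  # torch.Size([4, 9])

Global average pooling maps curves of different voiding durations to a fixed-size feature vector, which is one plausible way to feed the entire flow-rate graph to the network alongside the tabular parameters.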