Bladder prolapse is a common pelvic floor disorder in women, and early diagnosis and treatment improve patient outcomes. Pelvic magnetic resonance imaging (MRI) is one of the most important tools physicians use to diagnose bladder prolapse; however, its interpretation is highly subjective and depends largely on clinical experience. Computer-aided techniques for the graded diagnosis of bladder prolapse can improve diagnostic accuracy and shorten the learning curve. This study combines a convolutional neural network (CNN) with a vision transformer (ViT), in place of a traditional neural network, and incorporates an attention mechanism into the mobile vision transformer (MobileViT) to assist in grading bladder prolapse. First, MobileNetV2 was used to extract local features from the images. Next, a ViT extracted global features by modeling long-range, non-local dependencies. Finally, a channel attention module (a squeeze-and-excitation network) was added to the feature extraction network to enhance its feature representation capability, yielding the final grading of the degree of bladder prolapse. Using pelvic MRI images provided by Huzhou Maternal and Child Health Care Hospital, the proposed method was used to grade patients with bladder prolapse. The accuracy, Kappa value, sensitivity, specificity, precision, and area under the curve of the method were 86.34%, 78.27%, 83.75%, 95.43%, 85.70%, and 95.05%, respectively, outperforming the CNN models used for comparison. The attention-based model thus exhibits better classification performance than existing methods for grading bladder prolapse in pelvic organs and can effectively assist physicians in reaching a more accurate diagnosis.
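As a minimal sketch of the pipeline described above, assuming PyTorch/torchvision, the following shows how local CNN features, transformer-encoded global features, and a squeeze-and-excitation (SE) module could be chained. The module sizes, the two-layer encoder, and the four-grade output are illustrative assumptions, not the authors' released code.

```python
# Hedged sketch of the CNN -> ViT -> SE grading pipeline (assumed structure).
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v2

class SqueezeExcitation(nn.Module):
    """Channel attention: squeeze (global pooling) then excite (two FC layers)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                       # x: (B, C, H, W)
        w = x.mean(dim=(2, 3))                  # squeeze to (B, C)
        w = self.fc(w).unsqueeze(-1).unsqueeze(-1)
        return x * w                            # reweight channels

class ProlapseGrader(nn.Module):
    """Local CNN features -> ViT-style global features -> SE -> grade logits."""
    def __init__(self, num_grades: int = 4, dim: int = 1280):  # num_grades assumed
        super().__init__()
        self.cnn = mobilenet_v2(weights=None).features          # local features
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.vit = nn.TransformerEncoder(layer, num_layers=2)   # non-local deps
        self.se = SqueezeExcitation(dim)
        self.head = nn.Linear(dim, num_grades)

    def forward(self, x):                       # x: (B, 3, H, W)
        f = self.cnn(x)                         # (B, 1280, h, w)
        b, c, h, w = f.shape
        tokens = f.flatten(2).transpose(1, 2)   # feature map -> (B, h*w, C) tokens
        tokens = self.vit(tokens)               # model long-range dependencies
        f = tokens.transpose(1, 2).reshape(b, c, h, w)
        f = self.se(f)                          # channel attention on fused features
        return self.head(f.mean(dim=(2, 3)))    # pooled logits, one per grade

logits = ProlapseGrader()(torch.randn(1, 3, 224, 224))  # smoke test
```

Flattening the CNN feature map into patch tokens before the transformer is the standard way a ViT models distant, non-local dependencies on top of local convolutional features; the SE block then reweights channels before classification, as the abstract describes.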
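The reported metrics could be computed as in the hedged sketch below, assuming scikit-learn. Macro-averaging for sensitivity/specificity/precision and one-vs-rest AUC are assumptions; the abstract does not state the averaging scheme.

```python
# Assumed evaluation: accuracy, Cohen's kappa, macro sensitivity/specificity/
# precision, and one-vs-rest AUC for a multi-grade classifier.
import numpy as np
from sklearn.metrics import (accuracy_score, cohen_kappa_score, recall_score,
                             precision_score, roc_auc_score, confusion_matrix)

def grade_metrics(y_true, y_pred, y_score):
    # y_score: (n_samples, n_classes) predicted probabilities.
    cm = confusion_matrix(y_true, y_pred)
    # Per-class specificity from the confusion matrix, then macro-averaged.
    tn = cm.sum() - cm.sum(axis=0) - cm.sum(axis=1) + np.diag(cm)
    fp = cm.sum(axis=0) - np.diag(cm)
    return {
        "accuracy":    accuracy_score(y_true, y_pred),
        "kappa":       cohen_kappa_score(y_true, y_pred),
        "sensitivity": recall_score(y_true, y_pred, average="macro"),
        "specificity": float(np.mean(tn / (tn + fp))),
        "precision":   precision_score(y_true, y_pred, average="macro"),
        "auc":         roc_auc_score(y_true, y_score, multi_class="ovr"),
    }
```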