In recent years, Finger Vein Image Quality Assessment (FVIQA) has been recognized as an effective way to reduce recognition errors caused by low-quality images with spurious or missing vein information, and has become an important component of finger vein recognition systems. Compared to traditional FVIQA methods that rely on handcrafted domain knowledge, newer methods that reject low-quality images have been favored for their independence from human intervention. However, these methods consider only intra-class similarity information and ignore the valuable information in the inter-class distribution, which is also an important factor in evaluating the performance of recognition systems. In this work, we propose a novel FVIQA approach, named IIS-FVIQA, which jointly accounts for the intra-class similarity density and the inter-class similarity distribution distance within recognition systems. Specifically, our method generates quality scores for finger vein images by combining the information entropy of the intra-class similarity distribution with the Wasserstein distance of the inter-class distribution. We then train a regression network for quality prediction using the training images and their corresponding quality scores. When a new image enters the recognition system, the trained regression network directly predicts its quality score, allowing the system to choose the appropriate action based on that score. Extensive experiments on benchmark datasets demonstrate that the proposed IIS-FVIQA method consistently achieves top performance across multiple public datasets. After filtering out the 10% of images predicted to be of lowest quality by the quality regression network, the recognition system's performance improves by 43.96% (SDUMLA), 32.23% (MMCBNU_6000), and 21.20% (FV-USM), respectively.
Furthermore, the method exhibits strong generalizability across different recognition algorithms (e.g., LBP, MC, and Inception V3) and datasets (e.g., SDUMLA, MMCBNU_6000, and FV-USM).
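The abstract does not give the exact formula for combining the two terms, but the core idea (entropy of the intra-class similarity distribution plus a Wasserstein distance on the inter-class distribution) can be sketched as follows. This is a minimal illustrative reading, not the paper's actual implementation: the weighting `alpha`, the histogram binning, the reference distribution `ref_inter_sims`, and the sign conventions are all assumptions.

```python
import numpy as np

def similarity_entropy(samples, bins=20):
    # Shannon entropy of a histogram estimate of the similarity distribution;
    # a dense (concentrated) intra-class distribution yields low entropy.
    hist, _ = np.histogram(samples, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def wasserstein_1d(u, v, n_quantiles=100):
    # 1-D Wasserstein-1 distance via the quantile-function formulation.
    qs = np.linspace(0.0, 1.0, n_quantiles)
    return np.mean(np.abs(np.quantile(u, qs) - np.quantile(v, qs)))

def quality_score(intra_sims, inter_sims, ref_inter_sims,
                  alpha=0.5, bins=20):
    """Hypothetical quality score for one finger vein image.

    intra_sims     : genuine-match similarities involving this image
    inter_sims     : impostor-match similarities involving this image
    ref_inter_sims : a reference inter-class distribution (assumed here
                     to be the dataset-wide impostor distribution)
    """
    # Low intra-class entropy -> consistent genuine similarities -> higher quality.
    h_norm = similarity_entropy(intra_sims, bins) / np.log(bins)
    # Larger shift of the inter-class distribution -> better separability.
    w = wasserstein_1d(inter_sims, ref_inter_sims)
    return alpha * (1.0 - h_norm) + (1.0 - alpha) * w

# Toy usage: a tightly clustered genuine distribution scores higher
# than a diffuse one, with the inter-class term held fixed.
rng = np.random.default_rng(0)
good_intra = np.clip(rng.normal(0.9, 0.02, 200), 0.0, 1.0)
bad_intra = rng.uniform(0.0, 1.0, 200)
inter = np.clip(rng.normal(0.2, 0.05, 200), 0.0, 1.0)
ref_inter = np.clip(rng.normal(0.3, 0.05, 200), 0.0, 1.0)
s_good = quality_score(good_intra, inter, ref_inter)
s_bad = quality_score(bad_intra, inter, ref_inter)
```

Scores produced this way would serve as the regression targets for the quality-prediction network described in the abstract.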