Abstract

Previous researchers have explored various approaches for predicting the gender of a person based on features of the iris texture. This paper is the first to predict gender directly from the same binary iris code that could be used for recognition. We found that the information for gender prediction is distributed across the iris, rather than localized in particular concentric bands. We also found that using selected features representing a subset of the iris region achieves better accuracy than using features representing the whole iris region. We used mutual information measures to guide the selection of iris-code bits to use as features in gender prediction. Using this approach, with a person-disjoint training and testing evaluation, we achieved 89% correct gender prediction by fusing the best iris-code features from the left and right eyes.
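The feature-selection idea described above (scoring individual iris-code bits by their mutual information with the gender label and keeping only the most informative subset) can be illustrated with a minimal sketch. The code below is not the authors' implementation: the array shapes, the top-k cutoff, and the linear SVM are illustrative assumptions, and it uses scikit-learn's mutual_info_classif as a stand-in for the paper's mutual information measure.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.svm import SVC

# Hypothetical data: each row is a flattened binary iris code, y is a gender label.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 2048))  # 200 samples x 2048 iris-code bits (illustrative sizes)
y = rng.integers(0, 2, size=200)          # 0 = female, 1 = male (illustrative labels)

# Score each iris-code bit by its mutual information with the gender label.
mi = mutual_info_classif(X, y, discrete_features=True, random_state=0)

# Keep only the top-k most informative bits, rather than the whole iris region.
k = 256
selected = np.argsort(mi)[-k:]

# Train a classifier on the selected subset of bits
# (the person-disjoint train/test split used in the paper is omitted here).
clf = SVC(kernel="linear").fit(X[:, selected], y)
```

In the same spirit, the fusion reported in the abstract could be sketched by repeating the selection for left- and right-eye codes and combining the selected features or classifier scores before the final decision.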
