Saliency-Guided Textured Contact Lens-Aware Iris Recognition

Iris recognition requires a sufficient amount of visible iris texture to perform reliable matching. When a textured contact lens covers the iris, either a false non-match is reported or a presentation attack is detected. There are, however, scenarios in which one wants to maximize the probability of a correct match despite the iris texture being partially or mostly obscured, for instance when a non-cooperative subject conceals their identity by purposely wearing textured contact lenses. This paper proposes an iris recognition method designed to detect and match the portions of live iris tissue that remain visible when a person wears textured contact lenses. The proposed method includes (a) a convolutional neural network-based segmenter that detects partial live iris patterns, and (b) a Siamese network-based feature extraction model, trained in a novel way with images in which non-iris information is removed by blurring, to guide the network towards salient live iris features. Experiments on pairs of iris images, in which one image shows the iris without a lens and the other shows it covered by a textured contact lens, yield a lower EER of 10.6% for the proposed algorithm, compared to 33.6% for state-of-the-art iris code-based recognition. The source code of the method is offered along with the paper.
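
The following is a minimal sketch (not the authors' released code) of the core training idea described in the abstract: suppress non-iris content by blurring it, then train a shared-weight Siamese embedding network on the resulting images so the learned features concentrate on salient live-iris texture. The backbone choice, the contrastive loss, the helper names (`blur_non_iris`, `training_step`), and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms.functional as TF


def blur_non_iris(image: torch.Tensor, mask: torch.Tensor,
                  kernel_size: int = 21, sigma: float = 8.0) -> torch.Tensor:
    """Keep pixels inside the live-iris mask sharp, blur everything else.

    image: (B, C, H, W) float tensor; mask: (B, 1, H, W) in {0, 1},
    where 1 marks live iris tissue found by the segmenter.
    """
    blurred = TF.gaussian_blur(image, kernel_size=[kernel_size, kernel_size],
                               sigma=[sigma, sigma])
    return mask * image + (1.0 - mask) * blurred


class SiameseIrisNet(nn.Module):
    """Shared-weight embedding network for iris images (backbone is an assumption)."""

    def __init__(self, embedding_dim: int = 256):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, embedding_dim)
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalized embeddings so distances are comparable across pairs.
        return F.normalize(self.backbone(x), dim=1)


def contrastive_loss(z1, z2, same_identity, margin: float = 1.0):
    """Standard contrastive loss: pull genuine pairs together, push impostors apart."""
    dist = F.pairwise_distance(z1, z2)
    pos = same_identity * dist.pow(2)
    neg = (1.0 - same_identity) * F.relu(margin - dist).pow(2)
    return (pos + neg).mean()


def training_step(model, optimizer, images_a, masks_a, images_b, masks_b, labels):
    """One illustrative update; the pair loader yielding images, live-iris masks,
    and same-identity labels is assumed, not part of the paper's code."""
    model.train()
    xa = blur_non_iris(images_a, masks_a)
    xb = blur_non_iris(images_b, masks_b)
    loss = contrastive_loss(model(xa), model(xb), labels.float())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At test time the same blurring would be applied using the segmenter's predicted live-iris mask, and a pair would be scored by the embedding distance; this mirrors the saliency-guidance idea in the abstract but is only a sketch under the stated assumptions.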
