Abstract

This work explores human intervention as a means to improve Automatic Signature Verification (ASV). Significant efforts have been made over the last decades to improve the performance of ASV algorithms. This work analyzes how human actions can be used to complement automatic systems; determining which actions to take, and to what extent they can help state-of-the-art ASV systems, is the ultimate aim of this research line. The analysis at the classification level comprises experiments with responses from 500 people collected through crowdsourced signature authentication tasks. The results establish a human baseline performance and enable comparison with automatic systems. Intervention at the feature extraction level is evaluated using a self-developed tool for the manual annotation of signature attributes, inspired by the analysis performed by Forensic Document Experts. We analyze the performance of attribute-based human signature authentication and its complementarity with automatic systems. The experiments are carried out on a public database covering the two most popular signature authentication scenarios, based on both online (dynamic time sequences including position and pressure) and offline (static images) information. The results demonstrate the potential of human intervention at the feature extraction level (by manually annotating signature attributes) and encourage further research into its capability to improve the performance of ASV.
