Abstract

Online signature verification remains challenging because deep learning techniques perform poorly on cross-lingual datasets under privacy constraints. In this paper, we propose a novel Federated BERT Network (FBN) that embeds Bidirectional Encoder Representations from Transformers (BERT) into a Federated Learning (FL) framework with a client-server architecture. A new Length Alignment Algorithm unifies the sequence lengths of signature pairs, and the resulting input representations are fed to different clients, each of which trains its local model independently. The server (coordinator) then aggregates these local models into a global model using an improved Federated Averaging algorithm with a Reward-Punishment Mechanism (FedAvgRP). After multiple iterations, the optimal model is obtained and cross-tested on four datasets (SVC 2004, MCYT-330, BiosecurID, and ours), achieving skilled-forgery (random-forgery) EERs of 7.65% (4.76%), 10.73% (8.46%), 10.09% (7.13%), and 8.28% (5.74%), respectively, substantially outperforming state-of-the-art methods trained independently. Compared with domain adaptation and improved FL models, FBN performs best in both random- and skilled-forgery scenarios. Moreover, the FedAvgRP algorithm helps the model maintain high performance under data attacks.
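The client-server aggregation step described above can be sketched as plain federated averaging. The abstract does not detail the reward-punishment weighting of FedAvgRP, so this minimal illustration simply weights each client's parameters by its sample count (standard FedAvg); the function and variable names are hypothetical.

```python
# Minimal sketch of FedAvg-style server aggregation. The paper's FedAvgRP
# adds a reward-punishment mechanism to the weights, which is not specified
# in the abstract; here weights are proportional to client sample counts.
from typing import Dict, List


def fed_avg(local_models: List[Dict[str, List[float]]],
            sample_counts: List[int]) -> Dict[str, List[float]]:
    """Aggregate per-client parameter dicts into one global model."""
    total = sum(sample_counts)
    global_model: Dict[str, List[float]] = {}
    for name in local_models[0]:
        # Weighted sum of each parameter vector across clients.
        agg = [0.0] * len(local_models[0][name])
        for model, n in zip(local_models, sample_counts):
            for i, w in enumerate(model[name]):
                agg[i] += w * (n / total)
        global_model[name] = agg
    return global_model
```

In the full FBN loop, the server would broadcast the aggregated global model back to the clients and repeat the process for multiple rounds until convergence.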
