Abstract

We introduce a debiasing scheme that solves the 'more noise than entropy' problem, which can occur in Helper Data Systems when the source is very biased. We perform a condensing step, similar to Index-Based Syndrome coding, that reduces the size of the source space in such a way that some source entropy is lost, while the noise entropy is greatly reduced. In addition, our method allows for further entropy extraction by means of a 'spamming' technique. Our method outperforms solutions based on the one-pass and two-pass von Neumann algorithms.
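
For context, the one-pass von Neumann algorithm mentioned above as a baseline discards pairs of equal bits and keeps one bit per unequal pair. The sketch below is an illustrative Python rendering of that classic extractor, not of the condensing scheme introduced in this paper; the function name and the example parameters are ours.

```python
import random

def von_neumann_debias(bits):
    """Classic one-pass von Neumann debiasing (baseline, not the paper's scheme).

    Bits are consumed in non-overlapping pairs; the pair (0,1) emits 0,
    (1,0) emits 1, and the equal pairs (0,0)/(1,1) are discarded. For an
    i.i.d. source with bias p, each pair survives with probability
    2*p*(1-p), so a strongly biased source yields very few output bits.
    """
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)  # (0,1) -> 0, (1,0) -> 1
    return out

# Example: a strongly biased source still yields (few) unbiased bits.
random.seed(1)
biased = [1 if random.random() < 0.9 else 0 for _ in range(32)]
print(von_neumann_debias(biased))
```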

Highlights

  • The past decade has seen a lot of interest in the field of security with noisy data

  • In both the biometrics and the physical unclonable functions (PUFs)/Physically Obfuscated Keys (POKs) case, one faces the problem that some form of error correction has to be performed, but under the constraint that the redundancy data, which are visible to attackers, do not endanger the secret extracted from the physical measurement

  • We have introduced a method for source debiasing that can be used in Helper Data Systems to solve the ‘more noise than entropy’ problem

Summary

Helper Data Systems

In several security applications it is necessary to reproducibly extract secret data from noisy measurements on a physical system. One such application is read-proof storage of cryptographic keys using physical unclonable functions (PUFs) [5,16,18,19,20]. In both the biometrics and the PUF/POK case, one faces the problem that some form of error correction has to be performed, but under the constraint that the redundancy data, which are visible to attackers, do not endanger the secret extracted from the physical measurement. This problem is solved by a special security primitive, the Helper Data System (HDS).
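
The standard way an HDS realizes this is the code-offset (fuzzy commitment) construction: the helper data is the XOR of the measured response with a random codeword, and decoding the noisy offset recovers the enrolled secret. The sketch below is a minimal, illustrative version using a toy repetition code; the names and parameters are ours, and it deliberately omits the debiasing and privacy-amplification steps that this paper is actually about. Note that the helper data hides the secret only if the source is (close to) uniform, which is precisely what fails for a biased source.

```python
import secrets

N = 7  # toy repetition-code length; real systems use stronger codes

def enroll(x):
    """Enrollment: hide a random codeword under the PUF response x (N bits).

    Helper data w = x XOR c is public; it reveals nothing about the random
    message bit s as long as x is uniform.
    """
    s = secrets.randbelow(2)                      # random message bit
    c = [s] * N                                   # repetition codeword
    w = [xi ^ ci for xi, ci in zip(x, c)]
    return s, w                                   # s: enrolled secret, w: public helper data

def reconstruct(x_noisy, w):
    """Reconstruction: decode x' XOR w = c XOR e back to the enrolled bit."""
    noisy_codeword = [xi ^ wi for xi, wi in zip(x_noisy, w)]
    return 1 if sum(noisy_codeword) > N // 2 else 0   # majority-vote decoding

# Example: up to (N-1)//2 bit flips in the PUF response are corrected.
x = [1, 0, 1, 1, 0, 1, 1]
s, w = enroll(x)
x_noisy = x.copy()
x_noisy[2] ^= 1
x_noisy[5] ^= 1
assert reconstruct(x_noisy, w) == s
```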

The problem of bias
Contributions and outline
Debiasing based on subset selection
The scheme
Explanation of the scheme
Entropy after condensing
Fuzzy Extraction after condensing
The list size L
Comparison to other debiasing schemes
Some remarks on min-entropy
Summary
A Proof of Theorem 1
B Proof of Theorem 2