Abstract
At CHES 2017, Jean et al. presented a paper on “Bit-Sliding”, in which the authors proposed lightweight constructions for SPN-based block ciphers such as AES, PRESENT and SKINNY. The main idea behind these constructions was to reduce the width of the datapath to 1 bit and to reformulate the linear layer of these ciphers so that they require fewer scan flip-flops (which have built-in multiplexer functionality and are therefore larger in area than simple flip-flops). In this paper, we develop their idea further in several directions. First, we prove that, given an arbitrary linear transformation, it is always possible to construct the linear layer using merely 2 scan flip-flops. This points to a promising avenue for further GE reductions, yet the straightforward application of the techniques in our proof to PRESENT and GIFT leads to inefficient implementations of the linear layer, since restricting ourselves to the 2-scan-flip-flop setting requires thousands of clock cycles and leads to very high latency. Equipped with the well-established formalism of permutation groups, we explore whether we can reduce the number of clock cycles to a practical level, i.e. a few hundred, by adding a few more pairs of scan flip-flops. For PRESENT, we show that 4 (resp. 8, 12) scan flip-flops are sufficient to complete the permutation layer in 384 (resp. 256, 128) clock cycles. For GIFT, we show that 4 (resp. 8, 10) scan flip-flops correspond to 320 (resp. 192, 128) clock cycles. Finally, in order to provide the best of both worlds (i.e. circuit area and latency), we push our scan flip-flop choices even further to completely eliminate the latency incurred by the permutation layer, without compromising our stringent GE budget. We show not only that 12 scan flip-flops are sufficient to execute the PRESENT permutation in 64 clock cycles, but also that the same scan flip-flops can be readily used in a combined encryption-decryption circuit.
Our final designs of PRESENT and GIFT beat the records of Jean et al. and Banik et al. in both the latency and the circuit-size metric. We believe that the techniques presented in our work can also be used to choose bit-sliding-friendly linear-layer permutations for future SPN-based designs.
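The result that 2 scan flip-flops suffice rests on the group-theoretic fact that every permutation factors into transpositions (swaps), each of which a single swappable pair of positions can realise. The following Python sketch is illustrative only, not the paper's hardware construction; the function names are our own:

```python
def transpositions(perm):
    """Factor a bit permutation into transpositions via its cycles.

    Convention: applying the permutation yields new[i] = old[perm[i]].
    A k-cycle factors into k - 1 transpositions, so an n-bit layer may
    need up to n - 1 swaps -- hence the high latency when only one
    pair of positions can be swapped per clock cycle.
    """
    seen = [False] * len(perm)
    swaps = []
    for start in range(len(perm)):
        if seen[start]:
            continue
        cycle, j = [start], perm[start]
        seen[start] = True
        while j != start:
            cycle.append(j)
            seen[j] = True
            j = perm[j]
        # emit the k-1 transpositions for this k-cycle
        for k in range(len(cycle) - 1, 0, -1):
            swaps.append((cycle[0], cycle[k]))
    return swaps

def apply_swaps(state, swaps):
    """Replay the swaps, one per 'clock cycle', on a software state."""
    state = list(state)
    for i, j in swaps:
        state[i], state[j] = state[j], state[i]
    return state

# Example: the 3-cycle perm = [1, 2, 0] needs two swaps.
perm = [1, 2, 0]
state = ['a', 'b', 'c']
assert apply_swaps(state, transpositions(perm)) == [state[p] for p in perm]
```

Replaying the swap list in order reproduces the permutation, which mirrors why the 2-scan-flip-flop construction is always possible but slow: the number of swaps, and hence the cycle count, grows linearly with the state size.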
Highlights
The block cipher family Katan [CDK09] and later Simon [BSS+] were in some sense aimed at achieving a lower limit of lightweight encryption in terms of the area occupied in silicon.
GIFT is a block cipher designed by Banik et al. [BPP+17] and presented at CHES 2017, with a view to strengthening the cryptographic properties of PRESENT by redesigning the permutation layer and the key schedule.
Since the structure of GIFT is similar to that of PRESENT, the circuit for the datapath is exactly the same as in Figure 3, with the obvious exception that the scan flip-flop is used in the 60th instead of the 61st location.
Summary
The block cipher family Katan [CDK09] (whose precursor was the stream cipher Trivium [CP08]) and later Simon [BSS+] were in some sense aimed at achieving a lower limit of lightweight encryption in terms of the area occupied in silicon. More scan flip-flops allow us to execute more transposition operations on the state register in a single clock cycle, which can reduce the total number of cycles needed to implement the bit-permutation layer. This in turn leads to a much smaller set of control bits required to drive the swaps, keeping the area to a minimum. As a result of the theoretical foundations built in the paper, we construct lightweight implementations of the PRESENT and GIFT [BPP+17] circuits for both encryption-only (E) and combined encryption+decryption (ED) modes. Both PRESENT and GIFT are block ciphers in which the linear layer is a bit permutation of the internal state.
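The trade-off between scan flip-flop count and clock cycles can be illustrated in software: if the hardware can execute up to k pairwise-disjoint swaps per clock cycle, a greedy scheduler shows how the cycle count falls as k grows. This is a toy model under our own assumptions (the paper's constructions additionally fix the scan flip-flop positions in silicon, which constrains which swaps are available); `count_cycles` is a hypothetical helper, not from the paper:

```python
def count_cycles(swaps, k):
    """Greedily pack up to k pairwise-disjoint swaps into each clock
    cycle and return the number of cycles needed. A swap touching a
    position already used this cycle must wait for the next cycle."""
    cycles, used, busy = 0, 0, set()
    for i, j in swaps:
        if used == k or i in busy or j in busy:
            cycles += 1          # close the current clock cycle
            used, busy = 0, set()
        used += 1
        busy.update((i, j))
    return cycles + (1 if used else 0)

# Four disjoint swaps: 1 swap/cycle -> 4 cycles, 2 -> 2, 4 -> 1.
disjoint = [(0, 1), (2, 3), (4, 5), (6, 7)]
```

The same shrinking pattern is what the abstract's concrete figures reflect: adding pairs of scan flip-flops (4, 8, 12 for PRESENT) cuts the permutation-layer latency (384, 256, 128 cycles) at a modest cost in area.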