International Association for Cryptologic Research

IACR News Central


04:17 [Pub][ePrint] Nonuniform Indistinguishability and Unpredictability Hardcore Lemmas: New Proofs and Applications to Pseudoentropy, by Maciej Skorski

  Hardcore lemmas are results in complexity theory which state that average-case hardness must have a very hard "kernel", that is, a subset of instances on which the given problem is extremely hard. They find important applications in hardness amplification. In this paper we revisit the following two fundamental results:


(a) The hardcore lemma for unpredictability, due to Impagliazzo (FOCS '95). It states that if a Boolean function $f$ is "moderately" hard to predict on average, then there must be a set of noticeable size on which $f$ is "extremely" hard to predict.

(b) The hardcore lemma for indistinguishability, proved by Maurer and Tessaro (TCC '10). It states that for two random variables $X$ and $Y$ which are $\epsilon$-computationally close, there are events $A$ and $B$, each of probability $1-\epsilon$, such that the conditional distributions $X|A$ and $Y|B$ are "computationally" identical.


Using only the standard min-max theorem and some basic facts about convex approximations in $L_p$ spaces, we provide alternative modular proofs and some generalizations of these results in the nonuniform setting, achieving the best possible bounds for (a) and slightly improving the known bounds for (b). As an interesting application, we show a strengthening of the transformation between the two most popular pseudoentropy variants, HILL and Metric entropy, and apply it to show how to extract pseudorandomness from a sequence of metric-entropy sources of poor quality. In this case we significantly improve the security parameters compared to the best known techniques.
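For orientation, the unpredictability hardcore lemma in (a) can be stated informally as follows; this is a standard paraphrase with indicative parameters, not the paper's exact bounds:

```latex
\textbf{Hardcore lemma (informal).} Suppose every circuit of size $s$ errs in
computing $f\colon\{0,1\}^n\to\{0,1\}$ on at least a $\delta$ fraction of inputs.
Then there exists a ``hardcore'' set $S\subseteq\{0,1\}^n$ with $|S|\ge\delta\,2^n$
such that every circuit $C$ of size $s'=\Theta\!\left(\epsilon^2 s/\log(1/\delta)\right)$
satisfies
\[
  \Pr_{x\leftarrow S}\bigl[C(x)=f(x)\bigr]\;\le\;\tfrac{1}{2}+\epsilon .
\]
```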

04:17 [Pub][ePrint] Constructing and Understanding Chosen Ciphertext Security via Puncturable Key Encapsulation Mechanisms, by Takahiro Matsuda and Goichiro Hanaoka

  In this paper, we introduce and study a new cryptographic primitive that we call "puncturable key encapsulation mechanism" (PKEM), a special class of KEMs satisfying functional and security requirements that, combined together, imply chosen ciphertext security (CCA security). The purpose of introducing this primitive is to capture certain common patterns in the security proofs of several existing CCA secure public key encryption (PKE) schemes and KEMs based on general cryptographic primitives which, explicitly or implicitly, use the ideas and techniques of the Dolev-Dwork-Naor (DDN) construction (STOC '91), and to "break down" those proofs into smaller steps, so that each step is easier to work with, verify, and understand than tackling CCA security directly.

To see the usefulness of PKEM, we show (1) how several existing constructions of CCA secure PKE/KEM based on general cryptographic primitives can be captured as a PKEM, which enables us to understand these constructions via a unified framework, (2) its connection to detectable CCA security (Hohenberger et al., EUROCRYPT '12), and (3) a new security proof for a KEM-analogue of the DDN construction from a set of assumptions: "sender non-committing encryption" (SNCE) and non-interactive witness indistinguishable proofs.

Then, as our main technical result, we show how to construct a PKEM satisfying our requirements (and thus a CCA secure KEM) from a new set of general cryptographic primitives: "SNCE" and "symmetric key encryption secure for key-dependent messages" (KDM secure SKE). Our construction realizes the "decrypt-then-re-encrypt"-style validity check of a ciphertext, which is powerful but in general suffers from a circularity between the plaintext and the randomness. We show how SNCE and KDM secure SKE can be used together to overcome this circularity. We believe that the connection among three seemingly unrelated notions of encryption primitives, i.e. CCA security, the sender non-committing property, and KDM security, is of theoretical interest.
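To illustrate the "decrypt-then-re-encrypt" validity check and the plaintext/randomness circularity it raises, here is a minimal, deliberately insecure toy KEM in Python, in the spirit of Fujisaki-Okamoto-style transforms: the encryption randomness is derived from the plaintext, and decapsulation re-encrypts and compares. All names and primitives are illustrative assumptions, not the SNCE/KDM-based construction from the paper.

```python
import hashlib
import os

def h(tag, data):
    return hashlib.sha256(tag + data).digest()

def enc(pk, m, r):
    # Toy deterministic "public-key" encryption: XOR with a pad derived
    # from pk and the randomness r. Insecure; for illustration only.
    pad = h(b"pad", pk + r)[:len(m)]
    return r + bytes(a ^ b for a, b in zip(m, pad))

def encaps(pk):
    m = os.urandom(16)
    r = h(b"rand", m)[:16]   # randomness derived from the plaintext: the circularity
    return enc(pk, m, r), h(b"key", m)

def decaps(pk, c):
    r, body = c[:16], c[16:]
    pad = h(b"pad", pk + r)[:len(body)]
    m = bytes(a ^ b for a, b in zip(body, pad))
    # Decrypt-then-re-encrypt validity check: accept only the canonical
    # encryption of the recovered plaintext, reject everything else.
    if enc(pk, m, h(b"rand", m)[:16]) != c:
        return None
    return h(b"key", m)
```

Any modification of the ciphertext changes the recovered plaintext, and with it the re-derived randomness, so the re-encryption comparison fails and the query is rejected; this is the pattern the circularity makes delicate to prove secure.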


  The secure transmission of messages has been a human concern since ancient times. Many techniques have been developed since to ensure that a message is understandable only by the sender and the receiver while remaining meaningless to others. In this century, cryptography has gained much significance. This paper proposes a scheme to generate a dynamic key-dependent S-box for the SubBytes transformation used in cryptographic techniques.
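A common generic way to realize such key dependence is to shuffle the identity permutation under control of the key; the sketch below uses an RC4-style key-seeded swap schedule purely as an illustration of the idea, not as the scheme proposed in the paper.

```python
def key_dependent_sbox(key):
    # Derive a key-dependent 8-bit S-box by shuffling the identity
    # permutation with an RC4-style key scheduling loop (illustrative only).
    s = list(range(256))
    j = 0
    for i in range(256):
        j = (j + s[i] + key[i % len(key)]) % 256
        s[i], s[j] = s[j], s[i]  # swap driven by the key bytes
    return s
```

The result is always a permutation of 0..255, is reproducible from the same key, and differs from key to key, which is the minimum a dynamic S-box generator must provide.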

22:17 [Pub][ePrint] Generalization of Statistical Criteria for Sboxes, by S. M. Dehnavi and A. Mahmoodi Rishakani and M. R. Mirzaee Shamsabad and Einollah Pasha

  Linear and differential cryptanalysis and their generalizations are the most important tools in the statistical analysis of symmetric ciphers. These attacks make use of linear and differential properties of Sboxes and component functions of symmetric ciphers. In this article, we investigate generalized statistical properties of Sboxes. We justify the application of linear, differential and differential-linear cryptanalysis from the mathematical viewpoint. We evaluate some well-known Sboxes and vectorial Boolean functions against the proposed criteria and show that these functions exhibit larger biases than previously presented criteria reveal.
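The classical linear and differential properties that such criteria generalize can be computed directly. The sketch below, using an arbitrary 4-bit permutation chosen only for illustration (not an S-box analyzed in the paper), builds the difference distribution table and the maximum linear bias:

```python
# An arbitrary 4-bit permutation, for illustration only.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def ddt(sbox):
    """Difference distribution table: ddt[dx][dy] = #{x : S(x) ^ S(x^dx) = dy}."""
    n = len(sbox)
    table = [[0] * n for _ in range(n)]
    for dx in range(n):
        for x in range(n):
            table[dx][sbox[x] ^ sbox[x ^ dx]] += 1
    return table

def max_linear_bias(sbox):
    """Largest |#{x : a.x = b.S(x)} - n/2| over nonzero input/output masks."""
    n = len(sbox)
    parity = lambda v: bin(v).count("1") & 1
    return max(
        abs(sum(1 for x in range(n)
                if parity(a & x) == parity(b & sbox[x])) - n // 2)
        for a in range(1, n) for b in range(1, n)
    )
```

The maximum nontrivial DDT entry (the differential uniformity) and the maximum linear bias are exactly the quantities that differential and linear cryptanalysis exploit, and the generalized criteria of the paper extend these two tables.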

22:17 [Pub][ePrint] Some New Results on Binary Polynomial Multiplication, by Murat Cenk and M. Anwar Hasan

  This paper presents several methods for reducing the number of bit operations needed for the multiplication of polynomials over the binary field. First, a modified version of Bernstein's 3-way algorithm is introduced, followed by a new 5-way algorithm. Next, a new 3-way algorithm that improves the asymptotic arithmetic complexity compared to Bernstein's 3-way algorithm is introduced. This new algorithm uses three multiplications of one-third-size polynomials over the binary field and one multiplication of one-third-size polynomials over the finite field with four elements. Unlike Bernstein's algorithm, which has a linear delay complexity with respect to input size, the delay complexity of the new algorithm is logarithmic. The number of bit operations for the multiplication of polynomials over the finite field with four elements is also computed. Finally, all these new results are combined to obtain improved complexities.
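The splitting idea behind such algorithms can be sketched in Python: a schoolbook carry-less (GF(2)[x]) multiplication as the baseline, and a 2-way Karatsuba split that trades four half-size products for three. The paper's 3-way and 5-way algorithms refine this splitting principle further; the code below only illustrates the principle, not those specific algorithms.

```python
def clmul(a, b):
    """Schoolbook multiplication in GF(2)[x], with bits as coefficients."""
    r = 0
    while b:
        if b & 1:
            r ^= a          # addition of polynomials over GF(2) is XOR
        a <<= 1
        b >>= 1
    return r

def karatsuba_clmul(a, b, n):
    """2-way split of n-bit operands: three half-size products instead of four."""
    if n <= 8:
        return clmul(a, b)
    h = n // 2
    mask = (1 << h) - 1
    a0, a1 = a & mask, a >> h
    b0, b1 = b & mask, b >> h
    p0 = karatsuba_clmul(a0, b0, h)
    p2 = karatsuba_clmul(a1, b1, n - h)
    p1 = karatsuba_clmul(a0 ^ a1, b0 ^ b1, n - h)
    # Over GF(2) subtraction is also XOR, so the middle term is p1 ^ p0 ^ p2.
    return p0 ^ ((p1 ^ p0 ^ p2) << h) ^ (p2 << (2 * h))
```

Counting the bit operations (XORs and ANDs) incurred by such recursions is exactly the complexity measure the paper optimizes.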

22:17 [Pub][ePrint] Rotational Cryptanalysis of ARX Revisited, by Dmitry Khovratovich and Ivica Nikolic and Josef Pieprzyk and Przemyslaw Sokolowski and Ron Steinfeld

  Rotational cryptanalysis is a probabilistic attack applicable to word-oriented designs that use (almost) rotation-invariant constants. It is believed that the success probability of rotational cryptanalysis against ciphers and functions based on modular additions, rotations and XORs can be computed simply by counting the number of additions. We show that this simple formula is incorrect due to the invalid Markov cipher assumption used for computing the probability. More precisely, we show that chained modular additions used in ARX ciphers do not form a Markov chain with regard to rotational analysis, thus the rotational probability cannot be computed as a simple product of the rotational probabilities of the individual modular additions. We provide a precise value of the probability of such chains and give a new algorithm for computing the rotational probability of ARX ciphers. We use the algorithm to correct the rotational attacks on BLAKE2 and to provide valid rotational attacks against the simplified version of Skein.
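The basic quantity at stake can be checked exhaustively for small word sizes: XOR commutes with rotation exactly, while modular addition commutes with rotation only with some probability, which is what rotational cryptanalysis tracks per addition. The sketch below measures this probability by brute force; the word size and rotation amount are arbitrary illustrative choices.

```python
def rol(x, r, n):
    """Rotate the n-bit word x left by r positions."""
    return ((x << r) | (x >> (n - r))) & ((1 << n) - 1)

def rotational_prob_add(n, r):
    """Exhaustively measure Pr[ rol(x + y) == rol(x) + rol(y) ] over n-bit words."""
    mask = (1 << n) - 1
    hits = sum(
        1
        for x in range(1 << n)
        for y in range(1 << n)
        if rol((x + y) & mask, r, n) == (rol(x, r, n) + rol(y, r, n)) & mask
    )
    return hits / (1 << (2 * n))
```

For a single addition with rotation by one bit, the measured probability approaches roughly $3/8$ on large words, which is the commonly cited single-addition figure; the paper's point is that for *chained* additions these per-addition probabilities may not simply be multiplied.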

22:17 [Pub][ePrint] Meet in the Middle Attacks on Reduced Round Kuznyechik, by Riham AlTawy and Amr M. Youssef

  Kuznyechik is an SPN block cipher that has recently been chosen for standardization by the Russian Federation as a new GOST cipher. The algorithm updates a 128-bit state for nine rounds using a 256-bit key. In this paper, we present meet-in-the-middle attacks on the 4- and 5-round reduced cipher. Our attacks are based on the differential enumeration approach, where we propose a distinguisher for the middle rounds and match a sequence of state differences at its output. However, the exact approach does not apply to Kuznyechik due to its optimal round diffusion. Accordingly, we adopt an equivalent representation of the last round in which we can efficiently filter ciphertext pairs and launch the attack in the chosen ciphertext setting. We also utilize partial sequence matching, which further reduces the memory and time complexities by relaxing the error probability. The adopted partial sequence matching approach enables successful key recovery by matching parts of the generated sequence instead of the full sequence matching used in the traditional setting of this attack. For the 4- and 5-round reduced cipher, the 256-bit master key is recovered with a time complexity of $2^{139.6}$ and $2^{140.3}$, and a memory complexity of $2^{24.6}$ and $2^{153.3}$, respectively. Both attacks have a similar data complexity of $2^{113}$.

22:17 [Pub][ePrint] Surreptitiously Weakening Cryptographic Systems, by Bruce Schneier and Matthew Fredrikson and Tadayoshi Kohno and Thomas Ristenpart

  Revelations over the past couple of years highlight the importance of understanding malicious and surreptitious weakening of cryptographic systems. We provide an overview of this domain, using a number of historical examples to drive the development of a taxonomy of weaknesses, which allows different approaches to sabotage to be compared. We use this taxonomy to categorize a broader set of potential avenues for weakening systems, and discuss what future research is needed to provide sabotage-resilient cryptography.

22:17 [Pub][ePrint] Adaptive-ID Secure Revocable Identity-Based Encryption from Lattices via Subset Difference Method, by Shantian Cheng and Juanyang Zhang

  In view of the expiration or exposure of a user's private credential (or private key) in realistic scenarios, identity-based encryption (IBE) schemes with an efficient key revocation mechanism, or revocable identity-based encryption (RIBE) schemes for short, have become particularly significant. In this paper, we present an RIBE scheme from lattices by combining two IBE schemes of Agrawal et al. with the subset difference (SD) method. Our scheme is secure against adaptive identity-time attacks in the standard model under the learning with errors (LWE) assumption. In particular, our scheme serves as one solution to the challenge posed by Chen et al. (ACISP '12).

22:17 [Pub][ePrint] Universally Composable Firewall Architectures using Trusted Hardware, by Dirk Achenbach and Jörn Müller-Quade and Jochen Rill

  Network firewalls are a standard security measure in computer networks that connect to the Internet. Often, ready-to-use firewall appliances are trusted to protect the network from malicious Internet traffic. However, because of their black-box nature, no one can be sure of their exact functionality. We address the possibility of actively compromised firewalls, that is, we consider the possibility that a network firewall might collaborate with an outside adversary to attack the network. To alleviate this threat, we suggest composing multiple firewalls from different suppliers to obtain a secure firewall architecture. We rigorously treat the composition of potentially malicious network firewalls in a formal model based on the Universal Composability framework. Our security assumption is trusted hardware. We show that a serial concatenation of firewalls is insecure even when trusted hardware ensures that no new packets are generated by the compromised firewall. Further, we show that the parallel composition of two firewalls is only secure when the order of packets is not considered. We prove that the parallel composition of three firewalls is insecure, unless modified trusted hardware is used.
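A toy executable model conveys the intuition behind the parallel composition: trusted hardware acts as a comparator that forwards a packet only when both untrusted firewalls forward it unchanged, so a single honest firewall can veto malicious traffic. The interface below is an illustrative assumption, not the paper's formal UC model, which also accounts for packet ordering.

```python
def parallel_firewall(fw1, fw2, packet):
    # Trusted comparator: forward only when both firewalls agree on the
    # (unmodified) packet; any disagreement drops it. A single honest
    # firewall therefore suffices to block a packet in this toy model.
    out1, out2 = fw1(packet), fw2(packet)
    return out1 if out1 is not None and out1 == out2 else None

# Illustrative firewall policies: one honest, one actively compromised.
honest = lambda p: None if p["port"] == 23 else p   # drops telnet traffic
malicious = lambda p: p                             # forwards everything
```

In the chosen-packet-order setting this simple agreement check is not enough, which is why the paper's two-firewall parallel composition is only secure when packet order is disregarded.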

22:17 [Pub][ePrint] Influence of Electrical Circuits of ECC Designs on Shape of Electromagnetic Traces measured on FPGA, by Christian Wittke and Zoya Dyka and Peter Langendoerfer

  Side channel attacks exploit the fact that the observable behavior of crypto implementations provides hints that simplify revealing keys. The energy consumption of a chip performing a cryptographic operation depends on its inputs, on the cryptographic key used, and on the circuit that realizes the cryptographic algorithm. An attacker can experiment with different inputs and key candidates, studying the influence of these parameters on the shape of the measured traces with the goal of extracting the key. The main assumption here is that the circuit of the attacked device is constant. In this paper we investigated the influence of variable circuits on the shape of electromagnetic traces. We changed only a part of the cryptographic design, namely the partial multiplier of our ECC designs. This part always computes the same function in a single clock cycle. The rest of the design was kept unchanged. We thus obtained designs with significantly different circuits: in our experiments the number of FPGA LUTs used differs by up to 15%. These differences in the circuits caused a big difference in the shape of the electromagnetic traces, even when the same data and the same key are processed. Our experiments show that the influence of different circuits on the shape of the traces is comparable with the influence of different inputs. We assume that this fact can be used as a protective measure against side channel attacks, especially if the cryptographic circuit can be changed before the cryptographic operation is executed, or dynamically, i.e. while the cryptographic operation is being processed.