International Association for Cryptologic Research

IACR News Central

Get updates on changes to the IACR web page here. For questions, contact newsletter (at).


You can also access the full news archive.

Further sources to find out about changes are CryptoDB, ePrint RSS, ePrint Web, Event calendar (iCal).

19:17 [Pub][ePrint] Side-Channel Leakage through Static Power -Should We Care about in Practice?-, by Amir Moradi

  With shrinking technology sizes, the static power consumption of CMOS circuits is becoming a major concern. In this paper, we present the first practical results of exploiting the static power consumption of FPGA-based cryptographic devices in order to mount a key-recovery side-channel attack. The experiments presented here are based on three Xilinx FPGAs built on 65nm, 45nm, and 28nm process technologies. By means of a sophisticated measurement setup and methodology, we demonstrate an exploitable information leakage through the static power of the underlying FPGAs. The current work highlights the practical feasibility of side-channel analysis attacks exploiting static power, which have been known for years but had not yet been performed and investigated in practice. This is a starting point for further research and may have a huge impact on the efficiency of DPA countermeasures in the near future.
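As context for how such leakage is typically turned into a key recovery, the following Python sketch runs a basic correlation power analysis (CPA) over a set of measurements. The arrays traces and plaintexts, the Hamming-weight leakage model, and the choice to target the plaintext byte XOR key guess are illustrative assumptions, not the measurement methodology of the paper.

    # Minimal CPA sketch: recover one key byte from per-encryption power
    # measurements. Illustrative only; real attacks usually target an S-box
    # output and require a careful measurement setup.
    import numpy as np

    def hamming_weight(x):
        return bin(x).count("1")

    def cpa_recover_key_byte(traces, plaintexts, byte_index=0):
        """traces: (N, S) float array; plaintexts: (N, 16) byte array."""
        best_guess, best_corr = 0, 0.0
        centered = traces - traces.mean(axis=0)
        for guess in range(256):
            # Hypothetical leakage: HW(plaintext byte XOR key guess).
            hyp = np.array([hamming_weight(p[byte_index] ^ guess) for p in plaintexts],
                           dtype=float)
            hyp -= hyp.mean()
            # Pearson correlation between the hypothesis and every sample point.
            corr = (hyp @ centered) / (np.linalg.norm(hyp) *
                                       np.linalg.norm(centered, axis=0) + 1e-12)
            peak = float(np.max(np.abs(corr)))
            if peak > best_corr:
                best_guess, best_corr = guess, peak
        return best_guess, best_corr

The key-byte guess whose hypothesis correlates best with the measurements is reported as the most likely candidate.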

22:00 [PhD][Update] Kwangsu Lee: Efficient Hidden Vector Encryptions and Its Applications

  Name: Kwangsu Lee
Topic: Efficient Hidden Vector Encryptions and Its Applications
Category: public-key cryptography


Predicate encryption is a new paradigm of public key encryption that enables searches on encrypted data. Using predicate encryption, one can search for keywords or attributes on encrypted data without decrypting the ciphertexts. In predicate encryption, a ciphertext is associated with attributes and a token corresponds to a predicate. The token that corresponds to a predicate $f$ can decrypt the ciphertext associated with attributes $\mathbf{x}$ if and only if $f(\mathbf{x})=1$.

Hidden vector encryption (HVE) is a special kind of predicate encryption. HVE supports the evaluation of conjunctive equality, comparison, and subset operations between attributes in ciphertexts and attributes in tokens. Several HVE schemes have been proposed in which the ciphertext size, the token size, and the decryption cost are proportional to the number of attributes in the ciphertext. In this thesis, we consider the efficiency, the generality, and the security of HVE schemes. The results of this thesis are described as follows.
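To make the functionality concrete, here is a plaintext-level sketch of the conjunctive equality predicate that HVE tokens evaluate, with "*" acting as a don't-care position. In an actual HVE scheme this test is carried out on encrypted attributes using pairings, so the sketch only illustrates the predicate, not the scheme; the example vectors are illustrative.

    # Plaintext view of the HVE predicate: a token vector with wildcards ("*")
    # matches a ciphertext attribute vector iff all non-wildcard positions agree.
    WILDCARD = "*"

    def hve_predicate(token_vector, attribute_vector):
        assert len(token_vector) == len(attribute_vector)
        return all(t == WILDCARD or t == a
                   for t, a in zip(token_vector, attribute_vector))

    # Example: a token for ("alice", "*", "2014") matches any record by alice from 2014.
    print(hve_predicate(("alice", WILDCARD, "2014"), ("alice", "crypto", "2014")))  # True
    print(hve_predicate(("alice", WILDCARD, "2014"), ("bob", "crypto", "2014")))    # False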

The first results of this thesis are efficient HVE schemes in which the token consists of just four group elements and decryption requires only four bilinear map computations, independent of the number of attributes in the ciphertext. The construction uses composite order bilinear groups and is selectively secure under well-known assumptions.

The second results are efficient HVE schemes that are secure under any type of pairing. To achieve this, we propose a general framework that converts HVE schemes from composite order bilinear groups to prime order bilinear groups. Using the framework, we convert the previous HVE schemes from composite order bilinear groups to prime order bilinear groups.

The third results are fully secure HVE schemes with short tokens. Previous HVE schemes were proven secure only in the selective security model, where the capabilities of the adversary are severely restricted[...]

10:17 [Pub][ePrint] Construction of New Families of MDS Diffusion Layers, by S. M. Dehnavi and A. Mahmoodi Rishakani and M. R. Mirzaee Shamsabad and Hamidreza Maimani and Einollah Pasha

  Diffusion layers are crucial components of symmetric ciphers. These components, along with suitable Sboxes, can make symmetric ciphers resistant against statistical attacks like linear and differential cryptanalysis. Conventional MDS diffusion layers, which are defined as matrices over finite fields, have been used in symmetric ciphers such as AES, Twofish and SNOW. In this paper, we study linear, linearized and nonlinear MDS diffusion layers. We investigate linearized diffusion layers, which are a generalization of conventional diffusion layers; these diffusion layers are used in symmetric ciphers like SMS4, Loiss and ZUC. We introduce some new families of linearized MDS diffusion layers and, as a consequence, we present a method for the construction of randomized linear diffusion layers over a finite field. Nonlinear MDS diffusion layers were introduced in Klimov's thesis; we investigate nonlinear MDS diffusion layers theoretically, and we present a new family of nonlinear MDS diffusion layers. We show that these nonlinear diffusion layers can be made randomized with a low implementation cost. An important fact about linearized and nonlinear diffusion layers is that they are more resistant against algebraic attacks in comparison to conventional diffusion layers. A special case of diffusion layers are (0,1)-diffusion layers; this type of diffusion layer is used in symmetric ciphers like ARIA. We examine (0,1)-diffusion layers and prove a theorem about them. Finally, we study the linearized MDS diffusion layers of the symmetric ciphers Loiss, SMS4 and ZUC from a mathematical viewpoint.
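For readers unfamiliar with the MDS property: a square matrix over a finite field is MDS exactly when all of its square submatrices are nonsingular. The sketch below checks this condition over a prime field GF(p) for simplicity; real diffusion layers (e.g., the AES MixColumns matrix) live in GF(2^8), where the same check applies but with polynomial field arithmetic. The example matrix and the prime 257 are illustrative choices, not taken from the paper.

    # A matrix over a finite field is MDS iff every square submatrix is
    # nonsingular. This sketch works over a prime field GF(p) to keep the
    # arithmetic simple; GF(2^m) needs polynomial arithmetic instead.
    from itertools import combinations

    def det_mod_p(rows, p):
        """Determinant of a small matrix modulo a prime p (Laplace expansion)."""
        n = len(rows)
        if n == 1:
            return rows[0][0] % p
        total = 0
        for j in range(n):
            minor = [r[:j] + r[j + 1:] for r in rows[1:]]
            sign = -1 if j % 2 else 1
            total += sign * rows[0][j] * det_mod_p(minor, p)
        return total % p

    def is_mds(matrix, p):
        n = len(matrix)
        for size in range(1, n + 1):
            for rows in combinations(range(n), size):
                for cols in combinations(range(n), size):
                    sub = [[matrix[r][c] for c in cols] for r in rows]
                    if det_mod_p(sub, p) == 0:
                        return False
        return True

    # Example: a 3x3 Vandermonde matrix over GF(257) (illustrative only).
    M = [[1, 1, 1], [1, 2, 4], [1, 3, 9]]
    print(is_mds(M, 257))  # True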

10:17 [Pub][ePrint] A Novel Modular Adder for One Thousand Bits and More Using Fast Carry Chains of Modern FPGAs, by Marcin Rogawski, Kris Gaj and Ekawat Homsirikamol

  In this paper, a novel, low-latency family of adders and modular adders is proposed. This family efficiently combines the ideas of high-radix carry-save addition and parallel prefix networks, and it takes advantage of the fast carry chains of modern FPGAs. The implementation results reveal that these hybrid adders have great potential for efficient implementation of the modular addition of long integers used in various public-key cryptography schemes.
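To illustrate the carry-save idea in software terms, the sketch below adds two long integers limb by limb, keeps per-limb sums and carries separate, and propagates carries only once at the end, followed by a single conditional subtraction for the modular reduction. The 64-bit limb width, the limb count, and the modulus are illustrative assumptions; the paper's contribution is a hardware design that maps this structure onto FPGA fast carry chains and parallel prefix networks, which this sketch does not model.

    # High-radix carry-save addition of long integers, with carry propagation
    # deferred to a single final step. Software illustration only.
    RADIX_BITS = 64                     # illustrative limb width
    MASK = (1 << RADIX_BITS) - 1

    def to_limbs(x, n_limbs):
        return [(x >> (RADIX_BITS * i)) & MASK for i in range(n_limbs)]

    def carry_save_add(a_limbs, b_limbs):
        """Per-limb sums and carries, without propagating carries between limbs."""
        sums = [(a + b) & MASK for a, b in zip(a_limbs, b_limbs)]
        carries = [(a + b) >> RADIX_BITS for a, b in zip(a_limbs, b_limbs)]
        return sums, carries

    def resolve(sums, carries):
        """One final carry propagation (a prefix network's job in hardware)."""
        x = 0
        for i, (s, c) in enumerate(zip(sums, carries)):
            x += (s << (RADIX_BITS * i)) + (c << (RADIX_BITS * (i + 1)))
        return x

    def mod_add(a, b, m, n_limbs):
        sums, carries = carry_save_add(to_limbs(a, n_limbs), to_limbs(b, n_limbs))
        r = resolve(sums, carries)
        return r - m if r >= m else r   # single conditional subtraction

    # 1024-bit example with an arbitrary odd modulus (illustrative values only).
    m = (1 << 1024) - 105
    a, b = pow(3, 400, m), pow(7, 350, m)
    assert mod_add(a, b, m, 16) == (a + b) % m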

10:17 [Pub][ePrint] Linkable Message Tagging: Solving the key distribution problem of signature schemes, by Felix Günther and Bertram Poettering

  Digital signatures are one of the most extensively used cryptographic primitives today. It is well understood that they guarantee practical security only if the corresponding verification keys are distributed authentically; however, arguably, satisfying solutions for the latter haven't been found yet, or at least aren't in large-scale deployment. This paper introduces a novel approach for cryptographic message authentication where this problem does not arise: a linkable message tagging scheme (LMT) identifies pairs of messages and accompanying authentication tags as related if and only if these tags were created using the same secret key. In other words, in contrast to signature schemes, our primitive does not aim at detecting whether individually considered messages originate from an explicitly specified entity, but instead decides whether all messages from a given collection originate from the same (possibly anonymous) source. The appealing consequence is that our primitive does not involve public keys at all, and hence elegantly sidesteps the key distribution problem of signature schemes.

As an interesting application of LMT, we envision an email authentication system with minimal user interaction. Email clients could routinely generate a secret LMT key upon their first invocation and then equip all outgoing messages with corresponding tags. On the receiver's side, client software could automatically verify whether incoming messages originate from the same entity as previously or subsequently received messages with an (allegedly) identical sender address. Although this form of message authentication does not provide guarantees of sender origin as strong as those of signature schemes, we believe that trading the apparently discouraging obstacles implied by the authentic distribution of signature verification keys for the assumption that an attacker does not forge every message exchanged between parties is quite attractive.

On the technical side, we formalize the notions of LMT and its (more efficient) variant CMT (classifiable message tagging), including corresponding notions of unforgeability. For both variants we propose a range of provably secure constructions, based on different hardness assumptions, with and without random oracles.
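To make the Tag/Link interface concrete, here is a deliberately naive Python instantiation that embeds an Ed25519 verification key in every tag; the paper's constructions avoid public keys entirely, so this sketch only illustrates the linkability semantics, not the proposed schemes. It assumes the third-party cryptography package is available.

    # Naive illustration of the LMT interface (Tag, Link): two tags are linked
    # iff they verifiably originate from the same secret tagging key. Embedding
    # a verification key in the tag is NOT what the paper's schemes do.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey, Ed25519PublicKey,
    )

    def keygen():
        return Ed25519PrivateKey.generate()          # secret tagging key

    def tag(sk, message: bytes):
        pk_bytes = sk.public_key().public_bytes(
            encoding=serialization.Encoding.Raw,
            format=serialization.PublicFormat.Raw,
        )
        return pk_bytes, sk.sign(message)

    def link(msg1, tag1, msg2, tag2) -> bool:
        """True iff both tags verifiably stem from the same tagging key."""
        (pk1, sig1), (pk2, sig2) = tag1, tag2
        if pk1 != pk2:
            return False
        try:
            pk = Ed25519PublicKey.from_public_bytes(pk1)
            pk.verify(sig1, msg1)
            pk.verify(sig2, msg2)
            return True
        except InvalidSignature:
            return False

    # Two emails tagged under the same key are identified as related.
    sk = keygen()
    m1, m2 = b"first email", b"second email"
    assert link(m1, tag(sk, m1), m2, tag(sk, m2))
    assert not link(m1, tag(sk, m1), m2, tag(keygen(), m2))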

10:17 [Pub][ePrint] Tight Security Bounds for Triple Encryption, by Jooyoung Lee

  In this paper, we revisit the old problem of determining the exact provable security of triple encryption in the ideal cipher model. For a blockcipher with key length k and block size n, triple encryption is known to be secure up to 2^{k+min{k/2,n/2}} queries, while the best attack requires 2^{k+min{k,n/2}} query complexity, so there is a gap between the upper and lower bounds for the security of triple encryption. We close this gap by proving security up to 2^{k+min{k,n/2}} query complexity. With the DES parameters, triple encryption is secure up to 2^{82.5} queries, greater than the current bound of 2^{78.3} and comparable to 2^{83.5} for 2-XOR-cascade.

We also analyze the security of two-key triple encryption, where the first and the third keys are identical. We prove that two-key triple encryption is secure up to 2^{k+min{k,n/2}} queries to the underlying blockcipher and 2^{min{k,n/2}} queries to the outer permutation. For the DES parameters, this result is interpreted as the security of two-key triple encryption up to 2^{32} plaintext-ciphertext pairs and 2^{81.7} blockcipher encryptions.
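For concreteness, the sketch below builds the three-key cascade E_{k3}(E_{k2}(E_{k1}(m))) and the two-key variant (k3 = k1) on top of single DES, using the PyCryptodome package as an assumed dependency. This is only the construction being analyzed, not the security argument, and it is the plain E-E-E cascade rather than the standardized E-D-E form of Triple-DES.

    # Triple-encryption cascade and its two-key variant, instantiated with
    # single DES from PyCryptodome purely for illustration.
    from Crypto.Cipher import DES

    def des_enc(key: bytes, block: bytes) -> bytes:
        return DES.new(key, DES.MODE_ECB).encrypt(block)

    def des_dec(key: bytes, block: bytes) -> bytes:
        return DES.new(key, DES.MODE_ECB).decrypt(block)

    def triple_encrypt(k1, k2, k3, block):
        # E_{k3}(E_{k2}(E_{k1}(block))): cascade with three independent keys.
        return des_enc(k3, des_enc(k2, des_enc(k1, block)))

    def triple_decrypt(k1, k2, k3, block):
        return des_dec(k1, des_dec(k2, des_dec(k3, block)))

    def two_key_triple_encrypt(k1, k2, block):
        # Two-key variant: the first and third keys are identical.
        return triple_encrypt(k1, k2, k1, block)

    k1, k2, k3 = b"key1key1", b"key2key2", b"key3key3"
    m = b"8bytemsg"                      # one 64-bit DES block
    assert triple_decrypt(k1, k2, k3, triple_encrypt(k1, k2, k3, m)) == m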

10:17 [Pub][ePrint] Triple and Quadruple Encryption: Bridging the Gaps, by Bart Mennink and Bart Preneel

  Triple encryption is a cascade of three block cipher evaluations with independent keys, in order to enlarge its key size. This design is proven secure up to approximately 2^{kappa+min{kappa/2,n/2}} queries (by Bellare and Rogaway, EUROCRYPT 2006, and Gaži and Maurer, ASIACRYPT 2009), where kappa denotes the key size and n the block length of the underlying block cipher. On the other hand, the best known attack requires about 2^{kappa+n/2} queries (by Lucks, FSE 1998, and Gaži, CRYPTO 2013). These bounds are non-tight for kappa[...]

10:17 [Pub][ePrint] Two-round password-only authenticated key exchange in the three-party setting, by Junghyun Nam and Kim-Kwang Raymond Choo and Juryon Paik and Dongho Won

  We present the first provably-secure 3-party password-only authenticated key exchange (PAKE) protocol that can run in only two communication rounds. Our protocol is generic in the sense that it can be constructed from any 2-party PAKE protocol. The protocol is proven secure in a variant of the widely accepted model of Bellare, Pointcheval and Rogaway (2000) without any idealized assumptions on the cryptographic primitives used. We also investigate the security of the 2-round 3-party PAKE protocol of Wang, Hu and Li (2010), and demonstrate that this protocol cannot achieve implicit key authentication in the presence of an active adversary.

10:17 [Pub][ePrint] Completeness for Symmetric Two-Party Functionalities - Revisited, by Yehuda Lindell and Eran Omri and Hila Zarosim

  Understanding the minimal assumptions required for carrying out cryptographic tasks is one of the fundamental goals of theoretical cryptography. A rich body of work has been dedicated to understanding the complexity of cryptographic tasks in the context of (semi-honest) secure two-party computation. Much of this work has focused on the characterization of trivial and complete functionalities (resp., functionalities that can be securely implemented unconditionally, and functionalities that can be used to securely compute all functionalities).

All previous works define reductions via an ideal implementation of the functionality; i.e., $f$ reduces to $g$ if one can implement $f$ using an ideal box (or oracle) that computes the function $g$ and returns the output to both parties. Such a reduction models the computation of $f$ as an atomic operation. However, in the real world, protocols proceed in rounds, and the output is not learned by the parties simultaneously. In this paper we show that this distinction is significant. Specifically, we show that there exist symmetric functionalities (where both parties receive the same outcome) that are neither trivial nor complete under "ideal-box reductions", and yet the existence of a constant-round protocol for securely computing such a functionality implies infinitely-often oblivious transfer (meaning that it is secure for infinitely many $n$'s).

In light of the above, we propose an alternative definitional infrastructure for studying the triviality and completeness of functionalities.