International Association for Cryptologic Research

IACR News Central

Get updates on changes to the IACR web page here. For questions, contact newsletter (at) iacr.org.


You can also access the full news archive.

Further sources for updates are CryptoDB, ePrint RSS, ePrint Web, and the event calendar (iCal).

2015-04-11
03:17 [Pub][ePrint] Certificate-Based Encryption Resilient to Key Leakage, by Qihong Yu and Jiguo Li and Yichen Zhang and Wei Wu and Xinyi Huang and Yang Xiang

  Certificate-based encryption (CBE) is an important class of public-key encryption, but the existing schemes are secure only under the premise that the decryption key (or private key) and the master private key are kept absolutely secret. In practice, side-channel attacks and cold-boot attacks can leak secret information from a cryptographic system and thereby destroy its security, so a new model called leakage-resilient (LR) cryptography was introduced to address this problem. While leakage-resilient schemes have been constructed for traditional public-key encryption and identity-based encryption, as far as we know there is no leakage-resilient scheme in certificate-based cryptosystems. This paper puts forward the first certificate-based encryption scheme that can resist not only decryption-key leakage but also master-secret-key leakage. Based on assumptions in composite-order bilinear groups, the security of the scheme is proved using dual system encryption. The relative key-leakage rate is close to 1/3.
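
To make the leakage model concrete, here is a minimal sketch (my illustration, not from the paper) of the bounded-leakage experiment the abstract alludes to: the adversary may query arbitrary leakage functions of the secret key as long as the total number of leaked bits stays below a budget, set here to one third of the key length to mirror the stated leakage rate. All names are hypothetical.

```python
import secrets

class LeakageOracle:
    """Bounded-leakage oracle: answers adversarial leakage queries f(sk)
    until the total leaked output reaches a fixed bit budget."""

    def __init__(self, secret_key: bytes, leakage_rate: float = 1 / 3):
        self.sk = secret_key
        # Budget mirrors the paper's relative leakage rate of ~1/3.
        self.budget_bits = int(len(secret_key) * 8 * leakage_rate)
        self.leaked_bits = 0

    def leak(self, f, output_bits: int) -> int:
        """Apply an adversary-chosen function f to the key, releasing
        output_bits bits, provided the budget is not exhausted."""
        if self.leaked_bits + output_bits > self.budget_bits:
            raise PermissionError("leakage budget exhausted")
        self.leaked_bits += output_bits
        return f(self.sk) & ((1 << output_bits) - 1)

# Example: leak the low byte of a 128-bit key; at rate 1/3 the
# adversary may learn at most 42 bits in total.
oracle = LeakageOracle(secrets.token_bytes(16))
low_byte = oracle.leak(lambda sk: sk[-1], 8)
```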



03:17 [Pub][ePrint] Query-Complexity Amplification for Random Oracles, by Grégory Demay and Peter Gaži and Ueli Maurer and Björn Tackmann

  Increasing the computational complexity of evaluating a hash function, both for the honest users as well as for an adversary, is a useful technique employed for example in password-based cryptographic schemes to impede brute-force attacks, and also in so-called proofs of work (used in protocols like Bitcoin) to show that a certain amount of computation was performed by a legitimate user. A natural approach to adjust the complexity of a hash function is to iterate it $c$ times, for some parameter $c$, in the hope that any query to the scheme requires $c$ evaluations of the underlying hash function. However, results by Dodis et al. (Crypto 2012) imply that plain iteration falls short of achieving this goal, and designing schemes which provably have such a desirable property remained an open problem.

This paper formalizes explicitly what it means for a given scheme to amplify the query complexity of a hash function. In the random oracle model, the goal of a secure query-complexity amplifier (QCA) scheme is captured as transforming, in the sense of indifferentiability, a random oracle allowing $R$ queries (for the adversary) into one provably allowing only $r < R$ queries. Turned around, this means that making $r$ queries to the scheme requires at least $R$ queries to the actual random oracle. Second, a new scheme, called collision-free iteration, is proposed and proven to achieve $c$-fold QCA for both the honest parties and the adversary, for any fixed parameter $c$.
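
For intuition, the sketch below contrasts plain $c$-fold iteration with a domain-separated variant in the spirit of collision-free iteration. The abstract does not specify the actual construction, so the second function is an illustrative guess, not the authors' scheme.

```python
import hashlib

def H(data: bytes) -> bytes:
    """Stand-in for the underlying hash function (random oracle)."""
    return hashlib.sha256(data).digest()

def plain_iteration(x: bytes, c: int) -> bytes:
    """Plain c-fold iteration: H(H(...H(x)...)).
    Dodis et al. (Crypto 2012) show this falls short of c-fold QCA."""
    y = x
    for _ in range(c):
        y = H(y)
    return y

def domain_separated_iteration(x: bytes, c: int) -> bytes:
    """Illustrative variant: feed the round index and the original
    input into every call, so intermediate chaining values from
    different inputs or rounds cannot be reused across evaluations."""
    y = b""
    for j in range(c):
        y = H(j.to_bytes(4, "big") + x + y)
    return y

print(plain_iteration(b"password", 1000).hex())
print(domain_separated_iteration(b"password", 1000).hex())
```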





2015-04-07
15:10 [Event][New] STM 2015: 11th International Workshop on Security and Trust Management

  Submission: 16 June 2015
Notification: 23 July 2015
From September 21 to September 22
Location: Vienna, Austria
More Information: http://stm2015.di.unimi.it/


15:10 [Event][New] SECRYPT'15: 12th International Conference on Security and Cryptography

  Submission: 15 April 2015
Notification: 19 May 2015
From July 20 to July 22
Location: Colmar, Alsace, France
More Information: http://www.secrypt.icete.org/


13:08 [PhD][Update] Pablo Rauzy: Formal Software Methods for Cryptosystems Implementation Security

  Name: Pablo Rauzy
Topic: Formal Software Methods for Cryptosystems Implementation Security
Category: implementation

Description:

Implementations of cryptosystems are vulnerable to physical attacks, and thus need to be protected against them. Of course, malfunctioning protections are useless. Formal methods help to develop systems while assessing their conformity to a rigorous specification. The first goal of my thesis, and its innovative aspect, is to show that formal methods can be used to prove not only the principle of the countermeasures according to a model, but also their implementation, as it is at this very level that the physical vulnerabilities are exploited. My second goal is the proof and the automation of the protection techniques themselves, because handwritten security code is error-prone.

Physical attacks can be classified into two distinct categories: passive attacks, where the attacker only reads information that leaks through side channels, and active attacks, where the attacker tampers with the system to make it reveal secrets through its ``normal'' output channel. Therefore, I have pursued my goals in both settings: on a countermeasure that diminishes side-channel leakage (such as power consumption or electromagnetic emanations), and on countermeasures against fault-injection attacks.

As a rigorous security property already exists for protections against side-channel leakage, my contributions concentrate on formal methods for the design and verification of protected implementations of algorithms. I have developed a methodology that protects an implementation by generating an improved version of it with a null side-channel signal-to-noise ratio, as its leakage is made constant (in particular, it does not depend on the secret values). For the sake of demonstration, I have also undertaken to write a tool which automates the application of the methodology to an insecure input code written in assembly language. Independently, the tool is able to prove that this constant-leakage property holds for a given implementation, which can be use[...]
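
As an illustration of leakage made independent of secret values, here is a minimal branch-free comparison sketch. It is not from the thesis (which works at the assembly level), and Python itself gives no timing guarantees; it only shows the structural idea that control flow does not depend on the secret.

```python
def leaky_compare(secret: bytes, guess: bytes) -> bool:
    """Early-exit comparison: running time depends on how many
    leading bytes of the guess are correct (a timing side channel)."""
    if len(secret) != len(guess):
        return False
    for s, g in zip(secret, guess):
        if s != g:
            return False  # exits earlier for worse guesses
    return True

def branch_free_compare(secret: bytes, guess: bytes) -> bool:
    """Constant control flow: always inspects every byte and takes
    the same branches regardless of where a mismatch occurs."""
    if len(secret) != len(guess):
        return False
    diff = 0
    for s, g in zip(secret, guess):
        diff |= s ^ g  # accumulate differences without branching
    return diff == 0

# In production Python code one would use hmac.compare_digest instead.
assert branch_free_compare(b"k3y", b"k3y")
assert not branch_free_compare(b"k3y", b"k4y")
```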


12:58 [PhD][New] Hadi Soleimany: Studies in Lightweight Cryptography

  Name: Hadi Soleimany
Topic: Studies in Lightweight Cryptography
Category: secret-key cryptography

Description: The decreasing size of devices is one of the most significant changes in telecommunication and information technologies. This change has been accompanied by a dramatic reduction in the cost of computing devices. The dawning era of ubiquitous computing has opened the door to extensive new applications. Ubiquitous computing has found its way into products thanks to improvements in the underlying enabling technologies. Considerable developments in constrained devices such as RFID tags facilitate novel services and bring embedded computing devices into our everyday environments. The changes that lie ahead will eventually make pervasive computing devices an integral part of our world.

The growing prevalence of pervasive computing devices has created a significant need to consider security issues. However, security cannot be considered independently; instead, it should be evaluated alongside related issues such as performance and cost. In particular, there are several limitations facing the design of appropriate ciphers for extremely constrained environments. In response to this challenge, several lightweight ciphers have been designed in recent years. The purpose of this dissertation is to evaluate the security of the emerging lightweight block ciphers.

This dissertation develops cryptanalytic methods for determining the exact security level of some inventive and unconventional lightweight block ciphers. The work studies zero-correlation linear cryptanalysis by introducing the Matrix method to facilitate the finding of zero-correlation linear approximations. As applications, we perform zero-correlation cryptanalysis on 22-round LBlock and TWINE. We also perform simulations on a small variant of LBlock and present the first experimental results to support the theoretical model of the multidimensional zero-correlation linear cryptanalysis method. In addition, we provide a new perspective on slide cryptanalysis and reflection cryptanalysis [...]
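
To make "correlation of a linear approximation" concrete, the following toy sketch (my illustration, not from the dissertation) exhaustively measures the correlation of an input/output mask pair over a 4-bit S-box; a zero-correlation approximation is one whose correlation is exactly 0 over all inputs.

```python
# Toy 4-bit S-box (the PRESENT S-box, used here purely as an example).
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def parity(x: int) -> int:
    """Parity of the bits of x."""
    return bin(x).count("1") & 1

def correlation(alpha: int, beta: int) -> float:
    """Correlation of the linear approximation
    <alpha, x> = <beta, S(x)> over all 16 inputs:
    (#agreements - #disagreements) / 16."""
    agree = sum(1 for x in range(16)
                if parity(alpha & x) == parity(beta & SBOX[x]))
    return (2 * agree - 16) / 16

# Scan all nontrivial mask pairs for zero-correlation approximations.
zero_corr = [(a, b) for a in range(1, 16) for b in range(1, 16)
             if correlation(a, b) == 0.0]
print(f"{len(zero_corr)} zero-correlation mask pairs, e.g. {zero_corr[:3]}")
```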


12:57 [PhD][Update] Filipe Beato: Private Information Sharing in Online communities

  Name: Filipe Beato
Topic: Private Information Sharing in Online communities
Category: cryptographic protocols



12:56 [PhD][New] Juraj Šarinay: Cryptographic Hash Functions in Groups and Provable Properties

  Name: Juraj Šarinay
Topic: Cryptographic Hash Functions in Groups and Provable Properties
Category: (no category)

Description:

We consider several “provably secure” hash functions that compute simple sums in a well-chosen group (G, ∗). Security properties of such functions provably translate in a natural way to computational problems in G that are simple to define and possibly also hard to solve. Given k disjoint lists L_i of group elements, the k-sum problem asks for g_i ∈ L_i such that g_1 ∗ g_2 ∗ … ∗ g_k = 1_G. Hardness of the problem in the respective groups follows from some “standard” assumptions used in public-key cryptology such as hardness of integer factoring, discrete logarithms, lattice reduction and syndrome decoding. We point out evidence that the k-sum problem may even be harder than the above problems.

Two hash functions based on the group k-sum problem, SWIFFTX and FSB, were submitted to NIST as candidates for the future SHA-3 standard. Both submissions were supported by some sort of a security proof. We show that the assessment of security levels provided in the proposals is not related to the proofs included. The main claims on security are supported exclusively by considerations about available attacks. By introducing “second-order” bounds on bounds on security, we expose the limits of such an approach to provable security.

A problem with the way security is quantified does not necessarily mean a problem with security itself. Although FSB does have a history of failures, recent versions of the two above functions have resisted cryptanalytic efforts well. This evidence, as well as the several connections to more standard problems, suggests that the k-sum problem in some groups may be considered hard on its own and possibly lead to provable bounds on security. Complexity of the non-trivial tree algorithm is becoming a standard tool for measuring the associated hardness.

We propose modifications to the multiplicative Very Smooth Hash and derive security from multiplicative k-sums in contra[...]
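
For a concrete instance, the sketch below solves the 4-sum problem in the group ((Z/2)^n, XOR) with Wagner's tree algorithm, the "non-trivial tree algorithm" the description refers to. The bit length and list sizes are toy parameters chosen for illustration.

```python
import random
from collections import defaultdict

N_BITS = 24
MASK_LOW = (1 << 8) - 1   # low bits zeroed at the first tree level
LIST_SIZE = 1 << 9

def rand_list():
    return [random.getrandbits(N_BITS) for _ in range(LIST_SIZE)]

def join(A, B, mask):
    """All pairs (a, b) with (a ^ b) & mask == 0, kept as
    (a ^ b, (a, b)) so the solution can be reconstructed."""
    by_low = defaultdict(list)
    for b in B:
        by_low[b & mask].append(b)
    return [(a ^ b, (a, b)) for a in A for b in by_low[a & mask]]

def four_sum(L1, L2, L3, L4):
    """Wagner's tree for k = 4 over XOR: merge pairwise on the low
    bits, then look for a full collision between the merged lists."""
    AB = join(L1, L2, MASK_LOW)
    CD = join(L3, L4, MASK_LOW)
    by_val = {v: src for v, src in CD}
    for v, (a, b) in AB:
        if v in by_val:
            c, d = by_val[v]
            return a, b, c, d   # a ^ b ^ c ^ d == 0
    return None

sol = four_sum(rand_list(), rand_list(), rand_list(), rand_list())
if sol:
    a, b, c, d = sol
    assert a ^ b ^ c ^ d == 0
    print(f"solution: {a:#x} ^ {b:#x} ^ {c:#x} ^ {d:#x} = 0")
else:
    print("no solution found with these toy list sizes")
```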


00:17 [Pub][ePrint] Cryptanalysis of GGH Map, by Yupu Hu and Huiwen Jia

  Multilinear maps are a novel primitive with many cryptographic applications, and the GGH map is a major candidate construction. The GGH map has two classes of applications: those using public tools of encoding and those using hidden tools of encoding. In this paper we show that the applications of the GGH map using public tools of encoding are not secure. We present an efficient attack on the GGH map, aiming at multi-party key exchange (MPKE) and the instance of witness encryption (WE) based on the hardness of the 3-exact cover problem. First, for the secret of each user, we obtain an equivalent secret, which is the sum of the original secret and a noise term. The noise is an element of the specific principal ideal, but its size is not small. To do so, we use the weak-DL attack presented by the authors of the GGH map. Second, we use special modular operations, which we call modified encoding/decoding, to filter the decoded noise down to a much smaller size. Such filtering is enough to break MPKE. Moreover, such filtering negates the K-GMDDH assumption, which is the security basis of an ABE scheme. The procedure almost breaks away from those lattice attacks and looks like ordinary algebra; the key point is our special tools for modular operations. Finally, we break the instance of WE based on the hardness of the 3-exact cover problem. To do so, we not only use modified encoding/decoding, but also (1) introduce and solve the "combined 3-exact cover problem", which is never hard to solve; and (2) compute the Hermite normal form of the specific principal ideal. The attack on the instance of WE relies on an assumption which seems to hold with at least non-negligible probability.
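
For readers unfamiliar with the underlying combinatorial problem, here is a small brute-force sketch of 3-exact cover (X3C): given a universe of 3q elements and a collection of 3-element subsets, find q pairwise disjoint subsets covering the universe. The instance below is hypothetical and only illustrates the problem the attacked WE scheme builds on.

```python
from itertools import combinations

def exact_3_cover(universe, triples):
    """Brute-force X3C: find q = |universe|/3 triples that together
    cover the universe. Disjointness follows automatically when
    q triples of size 3 cover all 3q elements."""
    q = len(universe) // 3
    for choice in combinations(triples, q):
        if set().union(*choice) == universe:
            return choice
    return None

# A toy instance with 6 elements (q = 2).
universe = {1, 2, 3, 4, 5, 6}
triples = [frozenset(t) for t in
           [(1, 2, 3), (4, 5, 6), (1, 4, 5), (2, 3, 6), (1, 2, 4)]]
print(exact_3_cover(universe, triples))
```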



00:17 [Pub][ePrint] Boosting OMD for Almost Free Authentication of Associated Data, by Reza Reyhanitabar and Serge Vaudenay and Damian Vizár

  We propose \emph{pure} OMD (p-OMD) as a new variant of the Offset Merkle-Damg{\aa}rd (OMD) authenticated encryption scheme. Our new scheme inherits all desirable security features of OMD while having a more compact structure and providing higher efficiency. The original OMD scheme, as submitted to the CAESAR competition, couples a single pass of a variant of the Merkle-Damg{\aa}rd (MD) iteration with the counter-based XOR MAC algorithm to provide privacy and authenticity. Our improved p-OMD scheme dispenses with the XOR MAC algorithm and is \emph{purely} based on the MD iteration; hence the name ``pure'' OMD. To process a message of $\ell$ blocks and associated data of $a$ blocks, OMD needs $\ell+a+2$ calls to the compression function while p-OMD only requires $\max\left\{\ell, a\right\}+2$ calls. Therefore, for a typical case where $\ell \geq a$, p-OMD makes just $\ell+2$ calls to the compression function; that is, associated data is processed almost for free compared to OMD. We prove the security of p-OMD under the same standard assumption (pseudo-randomness of the compression function) as made in OMD; moreover, the security bound for p-OMD is the same as that of OMD, showing that the modifications made to boost the performance come without any loss of security.
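
The efficiency claim is easy to check numerically; this small sketch simply tabulates the compression-function call counts stated in the abstract for a few (l, a) pairs.

```python
def omd_calls(l: int, a: int) -> int:
    """Compression-function calls for OMD: l + a + 2 (per the abstract)."""
    return l + a + 2

def p_omd_calls(l: int, a: int) -> int:
    """Compression-function calls for p-OMD: max(l, a) + 2."""
    return max(l, a) + 2

print(f"{'l':>6} {'a':>6} {'OMD':>6} {'p-OMD':>6} {'saved':>6}")
for l, a in [(16, 0), (16, 4), (16, 16), (4, 16)]:
    saved = omd_calls(l, a) - p_omd_calls(l, a)
    print(f"{l:>6} {a:>6} {omd_calls(l, a):>6} {p_omd_calls(l, a):>6} {saved:>6}")
```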



00:17 [Pub][ePrint] The Design Space of Lightweight Cryptography, by Nicky Mouha

  For constrained devices, standard cryptographic algorithms can be too big, too slow or too energy-consuming. The area of lightweight cryptography studies new algorithms to overcome these problems. In this paper, we will focus on symmetric-key encryption, authentication and hashing. Instead of providing a full overview of this area of research, we will highlight three interesting topics. Firstly, we will explore the generic security of lightweight constructions. In particular, we will discuss considerations for key, block and tag sizes, and explore the topic of instantiating a pseudorandom permutation (PRP) with a non-ideal block cipher construction. This is inspired by the increasing prevalence of lightweight designs that are not secure against related-key attacks, such as PRINCE, PRIDE or Chaskey. Secondly, we explore the efficiency of cryptographic primitives. In particular, we investigate the impact on efficiency when the input size of a primitive doubles. Lastly, we provide some considerations for cryptographic design. We observe that applications do not always use cryptographic algorithms as they were intended, which negatively impacts the security and/or efficiency of the resulting implementations.
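
One of the generic-security considerations mentioned above, the choice of block size, can be illustrated with the birthday bound: with an n-bit block cipher, collisions among blocks are expected after roughly 2^(n/2) blocks of data, which caps how much data a single key may safely process in many modes. The calculation below is my illustration, not from the paper.

```python
def birthday_limit_bytes(block_bits: int) -> float:
    """Approximate data processed before ~2^(n/2) blocks, where block
    collisions (and the generic security of many modes) degrade."""
    block_bytes = block_bits // 8
    return (2 ** (block_bits / 2)) * block_bytes

for name, n in [("lightweight 64-bit block", 64),
                ("AES-like 128-bit block", 128)]:
    limit = birthday_limit_bytes(n)
    print(f"{name}: ~2^{n // 2} blocks ≈ {limit / 2**30:.1e} GiB")
```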