International Association for Cryptologic Research

# IACR News Central

You can also access the full news archive.

Further sources to find out about changes are CryptoDB, ePrint RSS, ePrint Web, and the event calendar (iCal).

2013-07-29
06:17 [Pub][ePrint]

The equivalence of the random oracle model and the ideal cipher model has been studied in a long series of results. Holenstein, Künzler, and Tessaro (STOC 2011) have recently completed the picture positively, where, roughly speaking, equivalence means indifferentiability from each other. Under the stronger notion of reset indifferentiability, however, this picture changes significantly, as Demay et al. (EUROCRYPT 2013) and Luykx et al. (ePrint 2012) demonstrate.

We complement these latter works in several ways. First, we show that any simulator satisfying the reset indifferentiability notion must be stateless and pseudo-deterministic. Using this characterization we show that, with respect to reset indifferentiability, two ideal models are either equivalent or incomparable; that is, one model cannot be strictly stronger than the other. In the case of the random oracle model and the ideal cipher model, this implies that the two are incomparable. Finally, we examine weaker notions of reset indifferentiability that, while not allowing composition in general, still allow composition for a large class of multi-stage games. Here we show that the seemingly much weaker notion of 1-reset indifferentiability proposed by Luykx et al. is equivalent to reset indifferentiability. Hence, the impossibility of coming up with a reset indifferentiable construction transfers to the setting where only one reset is permitted, thereby re-opening the quest for an achievable and meaningful notion in between the two variants.

06:17 [Pub][ePrint]

In this paper we propose a solution that enables accurate terminal revocation in Extended Access Control (EAC). Chaabouni and Vaudenay [CV09] pointed out the need for an accurate revocation procedure, but did not provide a complete description of a solution. We aim to fill this gap. Our solution augments terminal authentication with a t-out-of-l threshold signature provided by neighboring terminals, which are in charge of checking the revocation status of the requesting terminal. As terminals have an embedded real-time clock and more computational power than Machine Readable Travel Documents (MRTDs), they are better suited to checking revocation status.
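The t-out-of-l idea from the abstract can be sketched as follows. This is a hypothetical simplification, not the paper's scheme: it counts individual MACs from neighboring terminals instead of combining shares of a real threshold signature, and all names and parameters (`endorse`, `accept`, the key setup) are made up for illustration.

```python
import hmac
import hashlib

def endorse(neighbor_key: bytes, statement: bytes) -> bytes:
    """A neighboring terminal vouches that the requesting terminal is not
    revoked (stand-in for its share of a threshold signature)."""
    return hmac.new(neighbor_key, statement, hashlib.sha256).digest()

def accept(statement: bytes, endorsements, keys, t: int) -> bool:
    """The MRTD accepts the terminal iff at least t of the l neighbors
    produced a valid endorsement of the non-revocation statement."""
    valid = sum(
        1
        for i, tag in endorsements
        if hmac.compare_digest(
            tag, hmac.new(keys[i], statement, hashlib.sha256).digest()
        )
    )
    return valid >= t

keys = [bytes([i]) * 32 for i in range(5)]                 # l = 5 neighbors
stmt = b"terminal 42 not revoked @ 2013-07-29"
shares = [(i, endorse(keys[i], stmt)) for i in (0, 2, 4)]  # 3 endorsements
print(accept(stmt, shares, keys, t=3))       # -> True  (3 >= t)
print(accept(stmt, shares[:2], keys, t=3))   # -> False (only 2 valid)
```

A real deployment would use an actual t-out-of-l threshold signature so the MRTD verifies a single combined signature rather than l individual tags.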

2013-07-27
17:40 [Job][New]

The Cryptography group within the Department of Computer Science has grown considerably in the last year and additional researchers are required in the following areas:

- Analysis of “real world” protocols

- Formal Methods applied to security protocols

- Fully Homomorphic Encryption

- Lattice Based Cryptography

- Provable Security, i.e. Protocol and Mechanism design

- Multi-Party Computation

You will hold a PhD, or expect to be awarded soon, and have experience in one of the sub-areas of cryptography mentioned above.

You will have strong analytical skills and the ability to communicate complex information clearly, both orally and in writing, together with the ability to use personal initiative and creativity to solve problems encountered in the research context.

Ideally, you will also have a strong publication record in top relevant venues, such as the IACR conferences and journal, ACM CCS, IEEE S&P, ESORICS, etc.

Appointment may be made at the Research Assistant (grade I) or Research Associate (grade J) level depending on skills and experience and will be for 2 to 3 years in the first instance.

17:39 [Job][New]

We are looking for a postdoctoral researcher to participate in Project CATREL (theoretical and practical improvements for algorithms for breaking discrete logarithms over finite fields). This two-year position is with the GRACE team at the École polytechnique (in the southern suburbs of Paris), starting no later than January 1, 2014.

http://catrel.loria.fr/

http://catrel.loria.fr/positions.en.html

http://www.lix.polytechnique.fr/cryptologie/

Candidates should have a PhD in number theory or computer science.

Good programming skills and knowledge of number theory are essential; experience in C/C++ development, algorithmic number theory, and computer algebra systems (such as Magma, Sage, Pari-GP, etc.) would be an advantage.

03:17 [Pub][ePrint]

Proxy re-encryption (PRE) allows a semi-trusted proxy to convert a ciphertext intended for Alice into a ciphertext for Bob without learning anything about the underlying plaintext. Chunbo Ma et al. have proposed a group-based proxy re-encryption scheme that converts a ciphertext from one group to another; any group member can independently decrypt ciphertexts encrypted to its group. In their paper, the authors gave a security proof claiming that the scheme is secure against adaptive chosen ciphertext attack. We highlight flaws in their scheme and show that it is not secure against adaptive chosen ciphertext attack: we construct an adversary who issues only one decryption oracle query and breaks the security of their scheme with non-negligible advantage.
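The one-query attack pattern the abstract describes is the classic way malleability defeats CCA security. The sketch below is not the paper's attack on Ma et al.'s scheme; it illustrates the same template against textbook ElGamal (deliberately toy-sized parameters): maul the challenge ciphertext, spend the single decryption query on the mauled (hence distinct) ciphertext, and strip off the known factor.

```python
import random

P, Q, G = 467, 233, 4  # small prime modulus, subgroup order, generator

def keygen(rng):
    x = rng.randrange(1, Q)
    return x, pow(G, x, P)          # secret key, public key

def encrypt(pk, m, rng):
    r = rng.randrange(1, Q)
    return pow(G, r, P), (m * pow(pk, r, P)) % P

def decrypt(sk, c1, c2):
    return (c2 * pow(pow(c1, sk, P), -1, P)) % P

rng = random.Random(7)
sk, pk = keygen(rng)

# IND-CCA game: challenger encrypts m_b for a hidden bit b.
m0, m1 = pow(G, 10, P), pow(G, 20, P)    # messages in the subgroup
b = rng.randrange(2)
c1, c2 = encrypt(pk, (m0, m1)[b], rng)

# Adversary: multiply c2 by G, giving a *different* ciphertext that
# decrypts to m_b * G; query it, then remove the known factor G.
answer = decrypt(sk, c1, (c2 * G) % P)   # the one allowed oracle query
recovered = (answer * pow(G, -1, P)) % P
guess = 0 if recovered == m0 else 1
print(guess == b)                        # -> True (attack always wins)
```

The mauled ciphertext differs from the challenge, so the decryption query is legal in the CCA game, yet it reveals m_b exactly.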

03:17 [Pub][ePrint]

Most computational soundness theorems deal with a limited number of primitives, thereby limiting their applicability. The notion of deduction soundness of Cortier and Warinschi (CCS '11) aims to facilitate soundness theorems for richer frameworks via composition results: deduction soundness extends, generically, with asymmetric encryption and public data structures. Unfortunately, that paper also hints at rather serious limitations regarding further composition results: composability with digital signatures seems to be precluded.

In this paper we provide techniques for bypassing the perceived limitations of deduction soundness and demonstrate that it enjoys vastly improved composition properties. More precisely, we show that a deduction sound implementation can be modularly extended with all of the basic cryptographic primitives (symmetric/asymmetric encryption, message authentication codes, digital signatures, and hash functions). We thus obtain the first soundness framework that allows for the joint use of multiple instances of all of the basic primitives.

In addition, we show how to overcome an important restriction of the bare deduction soundness framework which forbids sending encrypted secret keys. In turn, this prevents its use for the analysis of a large class of interesting protocols (e.g. key exchange protocols). We allow for more liberal uses of keys as long as they are hidden in a sense that we also define. All primitives typically used to send secret data (symmetric/asymmetric encryption) satisfy our requirement which we also show to be preserved under composition.

03:17 [Pub][ePrint]

We study exponentiations in pairing groups for the most common security levels and show that, although the Weierstrass model is preferable for pairing computation, it can be worthwhile to map to alternative curve representations for the non-pairing group operations in protocols.

2013-07-23
17:09 [Job][New]

We are looking for an excellent PhD candidate to work in the area of information and communication security with a focus on authentication problems in constrained settings. This is particularly important for applications involving mobile phones, wireless communication and RFID systems, which suffer from restrictions in terms of power resources, network connectivity, computational capabilities, as well as potential privacy issues. The overall aim of the project will be to develop nearly optimal algorithms for achieving security and privacy while minimising resource use.

More concretely, part of the research will involve the analysis and development of authentication protocols in specific settings. This will include investigating resistance of both existing and novel protocols against different types of attacks, theoretically and experimentally. In addition to investigating established settings, such as RFID authentication, the research will also explore more general authentication problems, such as those that arise in the context of trust in social networks, smartphone applications and collaborative data processing. This will be done by grounding the work in a generalised decision-making framework. The project should result in the development of theory and authentication mechanisms for noisy, constrained settings that strike an optimal balance between reliable authentication, privacy-preservation and resource consumption. Some previous research related to this research project can be found here: http://lasecwww.epfl.ch/~katerina/Publications.html

Applicants for the position shall have a Master's degree, or equivalent, in Computer Science, Informatics, Telecommunications or a related discipline. A master's degree in information security or cryptography is a bonus.

15:17 [Pub][ePrint]

We introduce a new technique, which we call punctured programs, to apply indistinguishability obfuscation towards cryptographic problems. We use this technique to carry out a systematic study of the applicability of indistinguishability obfuscation to a variety of cryptographic goals. Along the way, we resolve the 16-year-old open question of deniable encryption, posed by Canetti, Dwork, Naor, and Ostrovsky in 1997: in deniable encryption, a sender who is forced to reveal to an adversary both her message and the randomness she used for encrypting it should be able to convincingly provide "fake" randomness that can explain any alternative message that she would like to pretend that she sent. We resolve this question by giving the first construction of deniable encryption that does not require any pre-planning by the party that must later issue a denial.

In addition, we show the generality of our punctured programs technique by also constructing a variety of core cryptographic objects from indistinguishability obfuscation and one-way functions (or close variants). In particular we obtain: public-key encryption, short "hash-and-sign" selectively secure signatures, chosen-ciphertext secure public-key encryption, non-interactive zero-knowledge proofs (NIZKs), injective trapdoor functions, and oblivious transfer. These results suggest the possibility of indistinguishability obfuscation becoming a "central hub" for cryptography.

15:17 [Pub][ePrint]

The goal of white-box cryptography is to design implementations of common cryptographic algorithms (e.g. AES) that remain secure against an attacker with full control of the implementation and execution environment. This concept was put forward a decade ago by Chow et al. (SAC 2002), who proposed the first white-box implementation of AES. Since then, several works have been dedicated to the design of new implementations and/or the breaking of existing ones.

In this paper, we describe a new attack against the original implementation of Chow et al. (SAC 2002), which efficiently recovers the AES secret key as well as the private external encodings in complexity $2^{22}$. Compared to the previous attack of Billet et al. (SAC 2004), of complexity $2^{30}$, our attack is not only more efficient but also simpler to implement. We then show that the *last* candidate white-box AES implementation, due to Karroumi (ICISC 2010), can be broken by a direct application of either Billet et al.'s attack or ours. Specifically, we show that for any given secret key, the overall implementation has the *exact same* distribution as the implementation of Chow et al., making both vulnerable to the same attacks.

By improving the state of the art of white-box cryptanalysis and putting forward new attack techniques, we believe our work brings new insights on the failure of existing white-box implementations, which could be useful for the design of future solutions.

2013-07-22
15:17 [Forum]

I must admit that I do not find D.J. Bernstein's assessment appropriate, for a number of reasons. First of all, the optical example he gives does not really apply to the work of Rührmair et al., who only discuss modeling attacks on electrical PUFs with digital, one-bit outputs. Such outputs can easily be imitated if an efficient algorithm that predicts the electrical PUF is known. The simulation algorithms which Rührmair et al. derive are, in fact, extremely simple: they often merely require the addition of small integers, followed by a comparison operation. If implemented, their form factor would be quite close to a real PUF. In particular, they could easily be implemented in smart cards and the like, which are one of the standard application examples for PUFs. To notice the difference, honest users would have to physically open the PUF/smart card and inspect it with utmost thoroughness, which is an impractical task for the average user. This makes the simulation algorithms of Rührmair et al. an extremely effective way to cheat.

Furthermore, even if the outputs of a PUF are analog and complex (as in the case of optical PUFs), a simulation algorithm can be used to break certain protocols. For example, an exchanged key can be derived from mere numeric simulations of the PUF; an exact analog imitation of the complex optical signal is not necessary to this end.

I also cannot share D.J. Bernstein's claim that the authors exaggerate their results. They give intensive and, in my opinion, well-balanced discussions of the reach, but at the same time the limitations, of their techniques in the introduction section (and partly also in the summary section). For example, they make clear that their attacks apply to so-called Weak PUFs only under very rare circumstances, and that they do not apply at all to Coating PUFs, SRAM PUFs, or Butterfly PUFs. Any reader who has had even a short glimpse at the introduction cannot miss this discussion, and could hardly make claims of PUF-exaggeration such as the ones we have seen.

Finally, it may be interesting to add that, following its publication on the ePrint archive, the paper was accepted to CCS 2010. It has been cited multiple times in the meantime (please see Google Scholar). Such acceptance and citation numbers cannot always prove the quality of scientific work, but in this case they show at least that the paper has been received rather well in the PUF community, which is somewhat at odds with D.J. Bernstein's (perhaps too harsh) criticism of the paper.

GeorgeBest From: 2013-07-22 15:15:59 (UTC)
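The "addition of small integers, followed by a comparison" that the post describes matches the standard additive delay model used to simulate arbiter PUFs. A minimal sketch under that assumed model (the stage count, weight range, and challenge here are made up for illustration, not taken from any specific paper):

```python
import random

def simulate_arbiter_puf(weights, challenge):
    """Additive delay model: the response bit is the sign of a weighted
    sum over a parity feature vector derived from the challenge."""
    phi = []
    for i in range(len(challenge)):
        prod = 1
        for c in challenge[i:]:
            prod *= 1 - 2 * c        # map bit {0,1} -> {+1,-1}
        phi.append(prod)
    phi.append(1)                    # constant bias term
    total = sum(w * f for w, f in zip(weights, phi))  # integer additions
    return 1 if total > 0 else 0                      # one comparison

rng = random.Random(1)
n = 16                                                # stages
weights = [rng.randint(-8, 8) for _ in range(n + 1)]  # small integer delays
challenge = [rng.randint(0, 1) for _ in range(n)]
print(simulate_arbiter_puf(weights, challenge))       # prints 0 or 1
```

Once modeling attacks have learned the weights, this handful of integer additions and a single comparison reproduces the one-bit responses, which is why the forum post argues such a simulator is so cheap to hide in place of a real PUF.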