International Association for Cryptologic Research

# IACR News Central

You can also access the full news archive.

Further sources to find out about changes are CryptoDB, ePrint RSS, ePrint Web, Event calendar (iCal).

2013-12-06
22:17 [Pub][ePrint]

Public-Key Encryption (PKE) and Message Authentication (PKMA, also known as digital signatures) are fundamental cryptographic primitives. Traditionally, both notions are defined as non-interactive (i.e., single-message). In this work, we initiate a rigorous study of (possibly) *interactive* PKE and PKMA schemes. We obtain the following results demonstrating the power of interaction to resolve questions which are either open or impossible in the non-interactive setting.

Efficiency/Assumptions.

One of the most well-known open questions in the area of PKE is to build, in a "black-box way", so-called chosen-ciphertext attack (CCA-) secure PKE from chosen-plaintext attack (CPA-) secure PKE. In contrast, we show a simple $2$-round CCA-secure PKE from any (non-interactive) CPA-secure PKE (in fact, these primitives turn out to be equivalent). Similarly, although non-interactive PKMA schemes can be inefficiently built from any one-way function, no efficient signature schemes are known from many popular number-theoretic assumptions, such as factoring, CDH or DDH. In contrast, we show an efficient $2$-round PKMA from most popular assumptions, including factoring, CDH and DDH.

It is well known that no non-interactive signature (resp. encryption) scheme can be *deniable* (resp. *forward-secure*), since the signature (resp. ciphertext) can later serve as evidence of the sender's "consent" (resp. be decrypted if the receiver's key is compromised). We also formalize a related notion of *replay-secure* (necessarily) interactive PKMA (resp. PKE) schemes, where the verifier (resp. encryptor) is assured that the "current" message can only be authenticated (resp. decrypted) by the secret-key owner *now*, as opposed to some time in the past (resp. future). We observe that our 2-round PKMA scheme is both replay-secure and (passively) deniable, and our 2-round PKE scheme is both replay- and forward-secure. We also define and construct stronger forms of necessarily interactive PKE/PKMA schemes, called *confirmed encryption* and *confidential authentication*.

Anonymous Key Exchange.

We extend our definitional framework for interactive PKE and PKMA schemes to give definitions and constructions of (necessarily interactive) *anonymous key exchange* (1-KE), where an anonymous (unkeyed) party establishes a key with an authenticated (keyed) party. Unlike prior work, which defines 1-KE by "downgrading" the hairy and complex definition of *mutually authenticated* key exchange (2-KE), our definition is "very short" and easy to understand. We also show simple and general connections between anonymous KE and (interactive) confirmed PKE/confidential PKMA schemes. As a result, we obtain old and new schemes for anonymous KE in a clean and modular manner. For example, we obtain the first $2$-round anonymous KE which is both (passively) deniable and forward-secure.

22:17 [Pub][ePrint]

Visual cryptography is a special type of secret sharing. Two models of visual cryptography have been independently studied: deterministic visual cryptography, introduced by Naor and Shamir, and random grid visual cryptography, introduced by Kafri and Keren. In the context of the deterministic model, Yang has introduced a third model, the probabilistic visual cryptography model. The connection between the probabilistic and the deterministic models has been explored.

In this paper we show that there is a strict relation between the random grid model and the deterministic model. More specifically, we show that to any random grid scheme there corresponds a deterministic scheme, and vice versa. This allows us to transfer results known in one model to the other. In fact, the random grid model is equivalent to the probabilistic model with no pixel expansion. Exploiting the (many) results known in the deterministic model, we are able to improve several schemes and to provide many upper bounds for the random grid model. Exploiting some results known for the random grid model, we are also able to provide new schemes for the deterministic model. A side effect of this paper is that future results for either of the two models (random grid and deterministic) should not ignore, and in fact should be compared to, the results known in the other model.
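To make the deterministic model concrete, here is a minimal sketch of the classic Naor-Shamir 2-out-of-2 scheme with 2-subpixel expansion (a textbook construction, not a scheme from this paper): a white secret pixel gives both shares the *same* random subpixel pattern, a black pixel gives them *complementary* patterns, so stacking (a subpixel-wise OR) reconstructs the image while each share alone is uniformly random.

```python
import random

# Minimal 2-out-of-2 Naor-Shamir visual cryptography with 2-subpixel expansion.
# 1 = black subpixel, 0 = transparent.
PATTERNS = [(0, 1), (1, 0)]

def share_pixel(bit, rng=random):
    """bit: 0 = white, 1 = black. Returns the two shares' subpixel patterns."""
    p = rng.choice(PATTERNS)
    if bit == 0:
        return p, p                        # white: identical patterns
    return p, tuple(1 - s for s in p)      # black: complementary patterns

def stack(p1, p2):
    """Physically stacking transparencies acts as a subpixel-wise OR."""
    return tuple(a | b for a, b in zip(p1, p2))

secret = [0, 1, 1, 0]
shares = [share_pixel(b) for b in secret]
stacked = [stack(s1, s2) for s1, s2 in shares]
# White pixels reconstruct half-black (1 of 2 subpixels), black fully black (2 of 2).
print([sum(p) for p in stacked])  # → [1, 2, 2, 1]
```

Each share in isolation is one of the two patterns with probability 1/2 regardless of the secret bit, which is exactly the perfect-secrecy property shared by the deterministic and random grid models.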

22:17 [Pub][ePrint]

Cryptography is generally used to protect sensitive data from an untrusted server. In this paper, we investigate the converse question: can we use cryptography to protect a trusted server from untrusted data?

As a first step in this direction, we propose the notion of safe enclosures. Intuitively, a safe enclosure is a cryptographic primitive that encapsulates data in a way that allows one to perform some computation on it, while at the same time protecting the server from malicious data. Furthermore, a safe enclosure should come equipped with a dedicated protocol that implements the enclosing function with unconditional integrity. Otherwise, unguarded data may reach the server. We discuss the novelty of these concepts, propose their formal definition and show several realizations.

22:17 [Pub][ePrint]

RFID authentication protocols should include a secret-updating phase in order to protect the privacy of RFID tags against tag-tracing attacks. In the literature, there are many lightweight RFID authentication protocols that try to provide key updating with lightweight cryptographic primitives. In this paper, we analyse the security of two recently proposed lightweight RFID authentication protocols against de-synchronization attacks. We show that the secret values shared between the back-end server and any given tag can easily be desynchronised. This weakness stems from deficiencies in the design of these protocols.
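The failure mode can be illustrated with a toy key-updating protocol (a hypothetical simplified model, not one of the two analysed schemes): both sides hash-chain their shared secret after a successful run, and the server also keeps the previous key to tolerate one lost message. An adversary who suppresses or replays messages can still push the tag more than one update ahead, leaving no stored server key that matches.

```python
import hashlib

# Toy de-synchronization model: server and tag update a shared secret after
# each run; the server keeps (old, cur) to tolerate a single lost message.
def next_key(k: bytes) -> bytes:
    return hashlib.sha256(k).digest()

class Server:
    def __init__(self, k):
        self.old, self.cur = k, k
    def finish(self):
        self.old, self.cur = self.cur, next_key(self.cur)

class Tag:
    def __init__(self, k):
        self.k = k
    def finish(self):
        self.k = next_key(self.k)

k0 = b"shared-secret"
server, tag = Server(k0), Tag(k0)

# Normal run: both sides update and stay synchronized.
tag.finish(); server.finish()
assert tag.k == server.cur

# Attack: the adversary blocks the server's side of two further runs while
# tricking the tag into completing them, so the tag updates twice more and
# the server not at all. Now neither stored server key matches the tag's.
tag.finish(); tag.finish()
print("desynchronised:", tag.k not in (server.cur, server.old))  # → True
```

Real protocols differ in detail, but the attacks in the paper exploit the same gap: an update step whose completion the two parties cannot reliably agree on.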

22:17 [Pub][ePrint]

Smooth Projective Hash Functions (SPHFs) are one of the base tools used to build interactive protocols, and this notion has led to the construction of numerous protocols enjoying strong security notions, such as security in the Bellare-Pointcheval-Rogaway (BPR) model or even Universal Composability (UC).

Yet, the construction of SPHFs has so far been almost entirely limited to discrete-logarithm or pairing-type assumptions. This stands in contrast with domains such as homomorphic encryption or functional encryption, where lattice-based cryptography has already caught up with and overtaken discrete-log/pairing-based cryptography. So far, work in the direction of UC based on lattices is almost restricted to a paper from Peikert, Vaikuntanathan, and Waters (Crypto 2008) dealing with Oblivious Transfer in the UC framework, and work in the direction of password-authenticated key exchange protocols (PAKE) to one from Katz and Vaikuntanathan (Asiacrypt 2009) on a 3-round PAKE, restricted to the BPR model. It seems that dealing with errors in those contexts is not as easy as it is for encryption.

In this work, we identify the problem at its source, namely, the lattice version of the Diffie-Hellman key exchange protocol: the key agreement is only approximate. We make explicit a simple folklore trick to obtain true, errorless, one-round key exchange from LWE. We then show that this trick can be adapted to various lattice encryption schemes, leading, with some technicalities, to errorless SPHFs. From there, we derive three new results, namely the first lattice-based versions of the following protocols: a one-round PAKE secure in the BPR model, a 3-round PAKE secure in the UC model, and a UC commitment scheme, all based on the SIS and LWE assumptions.
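The approximate-agreement problem the paper starts from can be seen in a toy LWE-style "Diffie-Hellman" (parameters are for illustration only and far too small for security): both parties derive keys that coincide only up to a small cross-error term, which is exactly what an errorless trick must eliminate.

```python
import random

# Toy LWE-style key agreement showing that the two keys are only approximately
# equal: they differ by the small term e_B.s_A - e_A.s_B.
q, n = 2**15, 8
rng = random.Random(1)

def small():
    return rng.randint(-2, 2)   # small secret/error coefficients

A = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]  # public matrix

s_A = [small() for _ in range(n)]; e_A = [small() for _ in range(n)]
s_B = [small() for _ in range(n)]; e_B = [small() for _ in range(n)]

# Alice publishes u = A s_A + e_A; Bob publishes v = A^T s_B + e_B (mod q).
u = [(sum(A[i][j] * s_A[j] for j in range(n)) + e_A[i]) % q for i in range(n)]
v = [(sum(A[i][j] * s_B[i] for i in range(n)) + e_B[j]) % q for j in range(n)]

k_A = sum(v[j] * s_A[j] for j in range(n)) % q   # Alice: <v, s_A>
k_B = sum(u[i] * s_B[i] for i in range(n)) % q   # Bob:   <u, s_B>

# Both equal s_B^T A s_A plus a different small error, so they nearly agree.
diff = min((k_A - k_B) % q, (k_B - k_A) % q)
print("key difference:", diff, "out of modulus", q)
```

With coefficients bounded by 2, the difference is at most $2 \cdot n \cdot 4 = 64$, tiny compared to $q = 32768$; naive rounding of such keys can still disagree near a rounding boundary, which is the obstacle the folklore trick addresses.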

22:17 [Pub][ePrint]

We construct the first leakage resilient variants of fully homomorphic encryption (FHE) schemes. Our leakage model is bounded adaptive leakage resilience. We first construct a leakage-resilient leveled FHE scheme, meaning the scheme is both leakage resilient and homomorphic for all circuits of depth less than some pre-established maximum set at the time of key generation. We do so by applying ideas from recent works analyzing the leakage resilience of public key encryption schemes based on the decision learning with errors (DLWE) assumption to the Gentry, Sahai and Waters ([17]) leveled FHE scheme. We then move beyond simply leveled FHE, removing the need for an a priori maximum circuit depth, by presenting a novel way to combine schemes. We show that by combining leakage resilient leveled FHE with multi-key FHE, it is possible to create a leakage resilient scheme capable of homomorphically evaluating circuits of arbitrary depth, with a bounded number of distinct input ciphertexts.

22:17 [Pub][ePrint]

XCB is a tweakable enciphering scheme (TES) which was first proposed in 2004 and modified in 2007. We call these two versions XCBv1 and XCBv2, respectively. XCBv2 was later proposed as a standard for encryption of sector-oriented storage media in IEEE Std 1619.2-2010. There is no known proof of security for XCBv1, but the authors provided a concrete security bound for XCBv2 and a "proof" justifying the bound. In this paper we show that XCBv2 is not secure as a TES by exhibiting an easy distinguishing attack on it. For XCBv2 to be secure, the message space should contain only messages whose lengths are multiples of the block length of the block cipher. Even for such restricted message spaces, the bound that the authors claim is not justified; we show this by pointing out some errors in the proof. We provide a new security bound for XCBv2, and this bound is much worse than the one claimed by the authors. We also, for the first time, provide a concrete security bound for XCBv1. The new bounds show that both XCBv1 and XCBv2 are worse, in terms of security, than all TES for which a concrete security bound is known.

22:17 [Pub][ePrint]

A reputation system for a set of entities is essentially a list of scores that provides a measure of the reliability of each entity in the set. The score given to an entity can be interpreted (and in the reputation system literature it often is [FRS]) as the probability that an entity will behave honestly. In this paper, we ask whether or not it is possible to utilize reputation systems for carrying out secure multiparty computation. We provide formal definitions of secure computation in this setting, and carry out a theoretical study of feasibility. We present almost tight results showing when it is and is not possible to achieve *fair* secure computation in our model. We suggest applications for our model in settings where some information about the honesty of other parties is given. This can be preferable to the current situation where either an honest majority is arbitrarily assumed, or a protocol that is secure for a dishonest majority is used and the efficiency and security guarantees (including fairness) of an honest majority are not obtained.

22:17 [Pub][ePrint]

This paper analyzes the cryptographic security of J3Gen, a promising pseudorandom number generator for low-cost passive RFID tags. Although J3Gen has been shown to fulfill the randomness criteria set by the EPCglobal Gen2 standard and is intended for security applications, we describe here two cryptanalytic attacks which call its security claims into question: (i) a probabilistic attack based on solving systems of linear equations, and (ii) a deterministic attack based on decimation of the output sequence. Numerical results, supported by simulations, show that for the specific recommended values of the configurable parameters, a small number of intercepted output bits is enough to cryptanalyze J3Gen. We then make some recommendations which address these issues.

22:17 [Pub][ePrint]

Research papers on new secure multi-party computation protocols rarely confirm the need for the developed protocol with its end users. One challenge in the way of such validation is that it is hard to explain the benefits of secure multi-party computation to non-experts. We present a method that we used to explain the application models of secure multi-party computation to a diverse group of end users in several professional areas. In these interviews, we learned that the potential users were curious about the possibility of using secure multi-party computation to share and statistically analyse private data. However, they also had concerns about how the new technology would change the data analysis processes. Inspired by this, we implemented a secure multi-party computation prototype that calculates statistical functions in the same way as popular data analysis packages like R, SAS, SPSS and Stata. Finally, we validated the practical feasibility of this application by conducting an experimental study that combined tax records with education records.
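The core idea behind such statistics-over-private-data systems can be sketched with additive secret sharing (a standard MPC building block; this toy is illustrative only and far simpler than the paper's prototype): each data owner splits its value into random shares, the computing servers add shares locally, and only the aggregate is ever reconstructed.

```python
import random

# Toy additive secret sharing over Z_p: servers learn only random-looking
# shares, yet their local sums reconstruct the aggregate (here, a mean).
p = 2**31 - 1
rng = random.Random(42)

def share(x, n=3):
    """Split x into n additive shares mod p; any n-1 shares look uniform."""
    parts = [rng.randrange(p) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % p)
    return parts

incomes = [41000, 52000, 38500, 61000]   # hypothetical private inputs
server_sums = [0, 0, 0]
for x in incomes:
    for i, sh in enumerate(share(x)):
        server_sums[i] = (server_sums[i] + sh) % p   # each server adds locally

total = sum(server_sums) % p             # only the aggregate is reconstructed
print("mean:", total / len(incomes))     # → 48125.0
```

Means, variances and similar statistics reduce to such sums, which is why statistical analysis is a natural fit for MPC; a deployable system additionally needs malicious-security machinery and the record-linking the paper's tax/education study required.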

22:17 [Pub][ePrint]

We study the complexity of secure computation in the tamper-proof hardware token model. Our main focus is on non-interactive unconditional two-party computation using bit-OT tokens, but we also study computational security with stateless tokens that have more complex functionality. Our results can be summarized as follows:

- There exists a class of functions such that the number of bit-OT tokens required to securely implement them is at least the size of the sender's input. The same holds for the receiver's input size (with a different class of functionalities).

- Non-adaptive protocols in the hardware token model imply efficient (decomposable) randomized encodings. This can be interpreted as evidence of the impossibility of non-adaptive protocols for a large class of functions.

- There exists a functionality for which there is no protocol in the stateless hardware token model accessing the tokens at most a constant number of times, even when the adversary is computationally bounded.

En route to proving our results, we make interesting connections between the hardware token model and well-studied notions such as the OT-hybrid model, randomized encodings, and obfuscation.