International Association for Cryptologic Research

# IACR News Central

You can also access the full news archive.

Further sources for keeping up with changes are CryptoDB, ePrint RSS, ePrint Web, and the Event calendar (iCal).

2012-08-05
18:17 [Pub][ePrint]

Adaptively secure multiparty computation is an essential and fundamental notion in cryptography. In this work we focus on the basic question of constructing a multiparty computation protocol secure against a \emph{malicious}, \emph{adaptive} adversary in the \emph{stand-alone} setting, without assuming an honest majority, in the plain model. It has been believed that this question can be resolved by composing known protocols from the literature. We show that, in fact, this belief is fundamentally mistaken. In particular, we show:

\begin{itemize}

\item[-]\textbf{Round inefficiency is unavoidable when using black-box simulation:} There does not exist any $o(\frac{n}{\log n})$-round protocol that adaptively securely realizes a (natural) $n$-party functionality with a black-box simulator. Note that most previously known protocols in the adaptive security setting relied on black-box simulators.

\item[-]\textbf{A constant-round protocol using non-black-box simulation:} We construct a \emph{constant-round} adaptively secure multiparty computation protocol in a setting without an \emph{honest majority} that makes crucial use of non-black-box techniques.

\end{itemize}

Taken together, these results give the first resolution to the question of adaptively secure multiparty computation against a malicious adversary and a dishonest majority in the plain model, open since the first formal treatment of adaptive security for multiparty computation in 1996.

18:17 [Pub][ePrint]

Group signatures are a central cryptographic primitive whereby users can anonymously and accountably sign messages in the name of a group they belong to. Several efficient constructions with security proofs in the standard model ({\it i.e.}, without the random oracle idealization) have appeared in recent years. However, like standard PKIs, group signatures need an efficient revocation system to be practical. Despite years of research, membership revocation remains a non-trivial problem: many existing solutions do not scale well due to either high overhead or constraining operational requirements (such as the need for all users to update their keys after each revocation). Only recently, Libert, Peters and Yung (Eurocrypt'12) suggested a new scalable revocation method, based on the Naor-Naor-Lotspiech (NNL) broadcast encryption framework, that interacts nicely with techniques for building group signatures in the standard model. While promising, their mechanism imposes significant storage requirements on group members. Namely, membership certificates, which used to have constant size in existing standard-model constructions, now have polylog size in the maximal cardinality of the group (NNL, after all, is a tree-based technique, so such a dependency is naturally expected).

In this paper we show how to obtain private keys of {\it constant} size. To this end, we introduce a new technique to leverage the NNL subset cover framework in the context of group signatures but, perhaps surprisingly, without the logarithmic relationship between the size of private keys and the group cardinality. Namely, we provide a way for users to efficiently prove their membership in one of the generic subsets of the NNL subset cover framework. This technique makes our revocable group signatures competitive with ordinary group signatures ({\it i.e.}, without revocation) in the standard model. Moreover, as in PKIs, unrevoked members still do not need to update their keys at each revocation.

15:17 [Pub][ePrint]

TWINE is a lightweight block cipher proposed at the ECRYPT Workshop on Lightweight Cryptography 2011 in Belgium. The cipher consists of 36 rounds and has two versions, TWINE-80 and TWINE-128, supporting key lengths of 80 and 128 bits, respectively. The block length of both versions is 64 bits. In this paper, we present the first single-key attacks on both versions of the cipher. In these attacks, we use the recently developed biclique technique. The complexities of the attacks on TWINE-80 and TWINE-128 are $2^{79.10}$ and $2^{126.82}$, respectively, and the data requirement for both attacks is $2^{60}$.

15:17 [Pub][ePrint]

We present the notion of PROG-KDM security for public-key encryption schemes. This security notion captures both KDM security and the revealing of secret keys (key corruptions) in a single definition. This is achieved by requiring the existence of a simulator that can program ciphertexts when a secret key is revealed, i.e., the simulator can defer the decision of which plaintext is contained in which ciphertext until the moment the ciphertext is opened. The definition is formulated in the random oracle model.

We show that PROG-KDM security can be achieved by proving that a natural and practical construction in the ideal cipher model, hybrid encryption using authenticated CBC encryption, is PROG-KDM secure.
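The hybrid pattern the construction builds on can be sketched in Python. This is not the paper's ideal-cipher/CBC construction: a hash-based toy keystream stands in for the block cipher layer, the key-wrapping step is a caller-supplied callback, and all function names are our own illustrations.

```python
import hashlib, hmac, os

def keystream(key, n):
    """Toy keystream: SHA-256 in counter mode.  Stands in for the
    block-cipher/CBC layer of the actual construction (illustrative only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def hybrid_encrypt(pk_encrypt, msg):
    """Hybrid encryption skeleton: a fresh symmetric key pair is wrapped
    with the caller-supplied public-key operation, then the payload is
    encrypted and authenticated (encrypt-then-MAC)."""
    k_enc, k_mac = os.urandom(32), os.urandom(32)
    body = bytes(a ^ b for a, b in zip(msg, keystream(k_enc, len(msg))))
    tag = hmac.new(k_mac, body, hashlib.sha256).digest()
    return pk_encrypt(k_enc + k_mac), body, tag

def hybrid_decrypt(pk_decrypt, ct):
    """Unwrap the symmetric keys, verify the tag, then decrypt."""
    wrapped, body, tag = ct
    keys = pk_decrypt(wrapped)
    k_enc, k_mac = keys[:32], keys[32:]
    if not hmac.compare_digest(tag, hmac.new(k_mac, body, hashlib.sha256).digest()):
        raise ValueError("authentication failed")
    return bytes(a ^ b for a, b in zip(body, keystream(k_enc, len(body))))
```

The point relevant to PROG-KDM is structural: because the payload is only bound to a key at decryption time, a simulator in an idealized model can in principle reprogram what a ciphertext decrypts to when the wrapped key is later revealed.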

15:17 [Pub][ePrint]

Group signatures are a central cryptographic primitive, simultaneously supporting accountability and anonymity. They allow users to anonymously sign messages on behalf of a group they are members of. Recent years saw the appearance of several constructions with security proofs in the standard model ({\it i.e.}, without appealing to the random oracle heuristic). For a group signature scheme to be adopted, an efficient revocation scheme (as in regular PKI) is absolutely necessary.

Despite over a decade of extensive research, membership revocation remains a non-trivial problem in group signatures: all existing solutions fail to be truly scalable due to either high overhead (e.g., a large group public key size) or a limiting operational requirement (the need for all users to follow the system's entire history). In the standard model, the situation is even worse, as many existing solutions are not readily adaptable. To fill this gap and tackle this challenge, we describe a new revocation approach based, perhaps somewhat unexpectedly, on the Naor-Naor-Lotspiech framework, which was introduced for a different problem (namely, that of broadcast encryption). Our mechanism yields efficient and scalable revocable group signatures in the standard model. In particular, the size of signatures and the verification cost are independent of the number of revocations and of the maximal cardinality $N$ of the group, while other complexities are at most polylogarithmic in $N$. Moreover, the schemes are history-independent: unrevoked group members do not have to update their keys when a revocation occurs.

15:17 [Pub][ePrint]

The core of the 3rd Generation Partnership Project (3GPP) encryption standard 128-EEA3 is a stream cipher called ZUC. It was designed by the Chinese Academy of Sciences and proposed for inclusion in the cellular wireless standard called "Long Term Evolution" or "4G". The LFSR-based cipher uses a 128-bit key. In this paper, we first show timing attacks on ZUC that can recover, with about a 71.43% success rate, (i) one bit of the secret key immediately, and (ii) information involving 6 other key bits. The time, memory and data requirements of the attacks are negligible. While we see potential improvements to the attacks, we also suggest countermeasures.

15:17 [Pub][ePrint]

A commonly used metric for comparing the resilience of key predistribution schemes is $\mathrm{fail}_s$, which measures the proportion of network connections that are `broken' by an adversary that has compromised $s$ nodes. In `Random key predistribution schemes for sensor networks', Chan, Perrig and Song present a formula for measuring the resilience of a class of random key predistribution schemes called $q$-composite schemes. We present a correction to this formula for schemes where more than one key may be used to secure a link between a pair of nodes. Our corrected formula features an additional parameter which makes it applicable to a wider variety of random key predistribution schemes, including the original Eschenauer-Gligor scheme. We also present a simplification of the formula for calculating connectivity.

We refer to the recent paper by Yum and Lee, which also claims to correct the original formula for the $q$-composite scheme. However, the resulting formula is complicated, computationally demanding, and hard to understand. The formula which we propose and prove is easily computable and can be applied to a wider range of schemes.
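The quantity being corrected can also be estimated empirically. The sketch below (parameter choices are ours, purely for illustration) Monte Carlo-estimates the fraction of broken links for a random key predistribution scheme of the multi-key kind the corrected formula addresses, where a link is considered broken only when every key its endpoints share has been exposed:

```python
import itertools, random

def estimate_fail_s(pool=1000, ring=50, nodes=60, s=5, trials=5, seed=1):
    """Monte Carlo estimate of fail_s: the fraction of secure links
    between uncompromised nodes that an adversary capturing s nodes can
    break.  A link is broken only when *all* keys shared by the pair are
    exposed (the multi-key case the corrected formula covers)."""
    rng = random.Random(seed)
    broken = secure = 0
    for _ in range(trials):
        # Each node's key ring: a uniform ring-sized subset of the pool.
        rings = [frozenset(rng.sample(range(pool), ring)) for _ in range(nodes)]
        # Adversary captures the first s nodes and learns all their keys.
        exposed = frozenset().union(*rings[:s])
        for a, b in itertools.combinations(range(s, nodes), 2):
            shared = rings[a] & rings[b]
            if shared:                    # the pair has a secure link
                secure += 1
                if shared <= exposed:     # every shared key is exposed
                    broken += 1
    return broken / secure if secure else 0.0
```

Such a simulation is a useful sanity check against any closed-form expression for $\mathrm{fail}_s$, since it makes no independence assumptions about the shared-key sets.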

15:17 [Pub][ePrint]

Genus 2 curves with simple but not absolutely simple jacobians can be used to construct pairing-based cryptosystems that are more efficient than those based on a generic genus 2 curve. We show that there is a full analogy between methods for constructing ordinary pairing-friendly elliptic curves and simple abelian varieties which are isogenous, over some extension, to a product of elliptic curves. We extend the notions of complete, complete with variable discriminant, and sparse families introduced by Freeman, Scott and Teske for elliptic curves, and we generalize the Cocks-Pinch method and the Brezing-Weng method to construct families of each type. To realize abelian surfaces as jacobians we use genus 2 curves of the form $y^2=x^5+ax^3+bx$ or $y^2=x^6+ax^3+b$, and apply the method of Freeman and Satoh. As applications we find some families of abelian surfaces with record $\rho$-value $\rho=2$ for embedding degrees $k=3,4,6,12$, or $\rho=2.1$ for $k=27,54$. We also give variable-discriminant families with the best $\rho$-values.

15:17 [Pub][ePrint]

We use ideas from game theory to improve two families of authentication protocols, namely password-based and manual authentication schemes. The protocols are transformed so that even if an intruder attacks different protocol runs between honest nodes, its expected payoff remains lower than when it does not attack. A rational intruder, who always tries to maximise its payoff, therefore has no incentive to attack any protocol run among trustworthy parties. To illustrate the use of our method, we present a case study relating to the password-based authentication stage of online banking, where passwords are chosen either randomly or in a biased manner (e.g., by humans). For the latter we use the publicly available 32 million passwords of the social gaming network RockYou as the source of human-selected passwords.

15:17 [Pub][ePrint]

Epsilon-biased distributions have many applications in practice, including the computation of universal hashing. In this paper we improve an existing epsilon-biased distribution construction due to Alon et al. that requires uniformly and efficiently sampling irreducible polynomials of large degree, e.g., between 80 and 160. To remove the need for such sampling, which can be computationally expensive, we replace the irreducible polynomials with random monic polynomials of higher degree, i.e., every degree-$r$ monic polynomial, whether irreducible or reducible, is selected with the same probability $2^{-r}$. To analyse the security of the scheme, we need to find the maximum number of degree-$r$ polynomials that divide a degree-$n$ polynomial, where $n > r$.
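The replacement step is easy to illustrate: sampling a uniform monic polynomial over GF(2) needs only $r$ random bits, whereas sampling an irreducible one requires repeated irreducibility testing. The sketch below (function names are ours) represents polynomials as integer bitmasks and also shows the GF(2) divisibility test underlying the counting question above:

```python
import random

def random_monic_poly(r, rng=random):
    """Uniform monic degree-r polynomial over GF(2), as a bitmask where
    bit i is the coefficient of x^i.  Every such polynomial, irreducible
    or reducible, is selected with probability 2^{-r}."""
    return (1 << r) | rng.getrandbits(r)

def poly_mod(a, b):
    """Remainder of a divided by b over GF(2) (carry-less long division)."""
    db = b.bit_length() - 1
    while a and a.bit_length() - 1 >= db:
        a ^= b << (a.bit_length() - 1 - db)  # cancel the leading term
    return a

def divides(b, a):
    """True when polynomial b divides polynomial a over GF(2)."""
    return poly_mod(a, b) == 0
```

For example, `divides(0b11, 0b101)` holds because $x^2+1 = (x+1)^2$ over GF(2), while `divides(0b11, 0b111)` fails because $x^2+x+1$ is irreducible. Counting how many degree-$r$ bitmasks satisfy `divides` against a fixed degree-$n$ polynomial is exactly the quantity the security analysis bounds.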

15:17 [Pub][ePrint]

Secure electronic circuits must implement countermeasures against a wide range of attacks. Often, protection against side-channel attacks must be tightly integrated with the functionality to be protected, and it is now part of the designer's job to implement it. But this task is known to be error-prone, and with current development processes, countermeasures are often evaluated very late (at circuit fabrication). In order to improve the designer's confidence in the efficacy of the countermeasure, this article suggests resorting to formal methods early in the design flow, for two reasons. First, we intend to check that the process of transforming the design from the vulnerable description to the protected one does not alter the functionality. Second, we wish to prove that the security properties (which can derive from a formal security functional specification) are indeed met after the transformation. Our first contribution is to show how such a framework can be set up (in COQ) for netlist-level protections. The second contribution is to illustrate that this framework indeed allows the detection of vulnerabilities in dual-rail logics.
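As a miniature of the functionality-preservation check described above, the following Python sketch exhaustively verifies a dual-rail AND gate (the gate mapping is a standard textbook dual-rail encoding, not taken from the paper; a real framework would do this as a machine-checked proof over netlists):

```python
from itertools import product

def dual_and(a, b):
    """Dual-rail AND: each signal is a (true-rail, false-rail) pair.
    Illustrative gate-level mapping: y_t = a_t & b_t, y_f = a_f | b_f."""
    (at, af), (bt, bf) = a, b
    return (at & bt, af | bf)

def check_dual_and():
    """Exhaustively verify, over all valid dual-rail inputs, that the
    output (i) stays complementary (exactly one rail high) and
    (ii) computes the same function as the plain AND gate it replaces."""
    for x, y in product((0, 1), repeat=2):
        yt, yf = dual_and((x, 1 - x), (y, 1 - y))
        assert yt ^ yf == 1      # rails remain complementary
        assert yt == (x & y)     # functionality preserved by the transform
    return True
```

Exhaustive enumeration is feasible here only because the example is tiny; the article's point is that a proof assistant lets the same two properties be established for full netlist-level transformations.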