International Association for Cryptologic Research

IACR News Central


Further sources for keeping track of changes are CryptoDB, ePrint RSS, ePrint Web, and the event calendar (iCal).

15:17 [Pub][ePrint] Updating attribute in CP-ABE: A New Approach, by Nishant Doshi and Devesh Jinwala

  In Ciphertext-Policy Attribute-Based Encryption (CP-ABE), attributes are attached to the user's secret key and an access policy is attached to the ciphertext. Only a user whose secret-key attributes satisfy the policy can decrypt the ciphertext. However, such a scenario also necessitates periodic updating of the secret key as attributes change. According to our observations, the existing attempts at doing so are not efficient. In this paper, we propose a new approach to add, update, or delete the value of a particular attribute efficiently, without knowledge of the other attributes.

15:17 [Pub][ePrint] The low-call diet: Authenticated Encryption for call counting HSM users, by Mike Bond and George French and Nigel P. Smart and Gaven J. Watson

  We present a new mode of operation for obtaining authenticated encryption suited for use in banking and government environments where cryptographic services are only available via a Hardware Security Module (HSM) which protects the keys but offers a limited API. The practical problem is that despite the existence of better modes of operation, modern HSMs still provide nothing but a basic (unauthenticated) CBC mode of encryption, and since they mediate all access to the key, solutions must work around this. Our mode of operation makes only a single call to the HSM, yet provides a secure authenticated encryption scheme; authentication is obtained by manipulation of the plaintext being passed to the HSM via a call to an unkeyed hash function. The scheme offers a considerable performance improvement compared to more traditional authenticated encryption techniques which must be implemented using multiple calls to the HSM. Our new mode of operation is provided with a proof of security, on the assumption that the underlying block cipher used in the CBC mode is a strong pseudorandom permutation, and that the hash function is modelled as a random oracle.
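The abstract only sketches the mode, so the following is a toy reading of the encode-then-encrypt structure it describes: hash the message with an unkeyed hash, append the digest, and submit the whole encoding to the HSM in a single CBC-encryption call. The Feistel "cipher", block size, and padding below are stand-ins of our own so the sketch runs without an HSM or a crypto library; this is not the authors' exact scheme.

```python
import hashlib
import os

BLOCK = 16  # bytes; stand-in for the HSM cipher's block size


def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def _f(key: bytes, rnd: int, half: bytes) -> bytes:
    # Round function of a toy Feistel network -- NOT a secure cipher,
    # used only so the sketch is self-contained.
    return hashlib.sha256(key + bytes([rnd]) + half).digest()[:8]


def _enc_block(key: bytes, block: bytes) -> bytes:
    l, r = block[:8], block[8:]
    for rnd in range(4):
        l, r = r, _xor(l, _f(key, rnd, r))
    return l + r


def _dec_block(key: bytes, block: bytes) -> bytes:
    l, r = block[:8], block[8:]
    for rnd in reversed(range(4)):
        l, r = _xor(r, _f(key, rnd, l)), l
    return l + r


def _pad(m: bytes) -> bytes:
    n = BLOCK - len(m) % BLOCK
    return m + bytes([n]) * n


def _unpad(m: bytes) -> bytes:
    n = m[-1]
    if not 1 <= n <= BLOCK or m[-n:] != bytes([n]) * n:
        raise ValueError("bad padding")
    return m[:-n]


def hsm_cbc_encrypt(key: bytes, iv: bytes, pt: bytes) -> bytes:
    # Models the HSM's only primitive: plain (unauthenticated) CBC encryption.
    out, prev = b"", iv
    for i in range(0, len(pt), BLOCK):
        prev = _enc_block(key, _xor(pt[i:i + BLOCK], prev))
        out += prev
    return out


def hsm_cbc_decrypt(key: bytes, iv: bytes, ct: bytes) -> bytes:
    out, prev = b"", iv
    for i in range(0, len(ct), BLOCK):
        out += _xor(_dec_block(key, ct[i:i + BLOCK]), prev)
        prev = ct[i:i + BLOCK]
    return out


def auth_encrypt(key: bytes, msg: bytes) -> bytes:
    # Encode-then-encrypt: append an *unkeyed* hash of the message, then
    # make a single CBC call -- the one HSM round trip the mode is built on.
    iv = os.urandom(BLOCK)
    encoding = _pad(msg + hashlib.sha256(msg).digest())
    return iv + hsm_cbc_encrypt(key, iv, encoding)


def auth_decrypt(key: bytes, blob: bytes) -> bytes:
    iv, ct = blob[:BLOCK], blob[BLOCK:]
    enc = _unpad(hsm_cbc_decrypt(key, iv, ct))
    msg, tag = enc[:-32], enc[-32:]
    if hashlib.sha256(msg).digest() != tag:
        raise ValueError("authentication failure")
    return msg
```

Any ciphertext tampering garbles the recovered plaintext, so the recomputed hash no longer matches the embedded digest; the paper's proof makes this rigorous by modelling the hash as a random oracle.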

15:17 [Pub][ePrint] On the immunity of Boolean functions against fast algebraic attacks using bivariate polynomial representation, by Meicheng Liu and Yin Zhang and Dongdai Lin

  In the last decade, algebraic attacks and fast algebraic attacks have been regarded as the most successful attacks on LFSR-based stream ciphers. Since the notion of algebraic immunity was introduced, the properties and constructions of Boolean functions with maximum algebraic immunity have been studied in a large number of papers. However, it is unclear whether these functions behave well against fast algebraic attacks. In this paper, we study the immunity of Boolean functions against fast algebraic attacks using the bivariate polynomial representation. Based on this representation, we present a necessary and sufficient condition for a Boolean function to achieve good immunity against fast algebraic attacks, propose an efficient method for estimating the immunity of a large class of Boolean functions, including the functions of Q. Jin et al., and prove that the functions of D. Tang et al. achieve (almost) optimal immunity against fast algebraic attacks.

15:17 [Pub][ePrint] Authenticity, Integrity and Proof-of-Existence for Long-Term Archiving: a Survey, by Martín A. G. Vigil and Daniel Cabarcas and Alexander Wiesmaier and Johannes Buchmann

  Digital archives that store electronic data for long periods are increasingly necessary for applications such as libraries, land registers, and medical records. Archived data is useful only if protected from tampering, so as to ensure authenticity, integrity, and a date on which the existence of the archived data was witnessed. We survey solutions that protect archived data using cryptography. We describe them and examine whether they successfully provide indefinite protection despite the limitations of current cryptographic schemes. The solutions are compared with regard to their goals, trust assumptions, and efficiency. Based on our analysis, we elucidate open problems and promising research directions.

15:17 [Pub][ePrint] Constant Ciphertext Length in CP-ABE, by Nishant Doshi and Devesh Jinwala

  Ciphertext-policy attribute-based encryption (CP-ABE) is a technique in which a user holding a secret key that contains attributes is able to decrypt a message only if the attributes in the ciphertext's policy match the attributes in the secret key. The existing methods that use reasonably computable decryption policies produce ciphertexts whose size grows at least linearly with the number of attributes, with additional pairing operations during encryption and decryption. In this paper, we propose a scheme in which the ciphertext remains constant in length, irrespective of the number of attributes. Our scheme works for the threshold case: the attributes in a policy must be a subset of the attributes in a secret key. The security of the proposed scheme is based on the Decisional Bilinear Diffie-Hellman (DBDH) problem.
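The decryption condition in the threshold case above reduces to a subset test over attribute sets; a minimal illustration (the attribute names are hypothetical, and the real scheme of course enforces this check cryptographically via pairings rather than by comparing sets in the clear):

```python
def can_decrypt(key_attrs: set, policy_attrs: set) -> bool:
    # Decryption succeeds exactly when every attribute demanded by the
    # ciphertext's policy is present in the user's secret key.
    return policy_attrs <= key_attrs


# Hypothetical attribute universe, for illustration only.
key = {"doctor", "cardiology", "senior"}
assert can_decrypt(key, {"doctor", "cardiology"})     # policy satisfied
assert not can_decrypt(key, {"doctor", "radiology"})  # missing attribute
```

The point of the construction is that the ciphertext length stays constant no matter how large `policy_attrs` grows.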

15:17 [Pub][ePrint] Privacy Amplification with Asymptotically Optimal Entropy Loss, by Nishanth Chandran and Bhavana Kanukurthi and Rafail Ostrovsky and Leonid Reyzin

  We study the problem of "privacy amplification": key agreement between two parties who both know a weak secret w, such as a password. (Such a setting is ubiquitous on the internet, where passwords are the most commonly used security device.) We assume that the key agreement protocol takes place in the presence of an active, computationally unbounded adversary Eve. The adversary may have partial knowledge about w, so we assume only that w has some entropy from Eve's point of view. Thus, the goal of the protocol is to convert this non-uniform secret w into a uniformly distributed string $R$ that is fully secret from Eve. R may then be used as a key for running symmetric cryptographic protocols (such as encryption, authentication, etc.).

Because we make no computational assumptions, the entropy in R can come only from w. Thus such a protocol must minimize the entropy loss during its execution, so that R is as long as possible. The best previous results have entropy loss of $\Theta(\kappa^2)$, where $\kappa$ is the security parameter, thus requiring the password to be very long even for small values of $\kappa$. In this work, we present the first protocol for information-theoretic key agreement whose entropy loss is linear in the security parameter. The result is optimal up to constant factors. We achieve our improvement through a somewhat surprising application of error-correcting codes for the edit distance.

The protocol can also be extended to provide "information reconciliation," that is, to work even when the two parties have slightly different versions of w (for example, when biometrics are involved).
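In terms of extractable key length, the improvement can be written as follows (here $m$, the min-entropy of $w$ from Eve's view, is our notation; the abstract states only the asymptotic shape, not the constants):

```latex
\[
\underbrace{|R| \;=\; m - \Theta(\kappa^2)}_{\text{previous protocols}}
\qquad\text{vs.}\qquad
\underbrace{|R| \;=\; m - \Theta(\kappa)}_{\text{this work}}
\]
```

So for a fixed security parameter $\kappa$, the password's entropy requirement drops from quadratic to linear in $\kappa$.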

15:17 [Pub][ePrint] Are We Compromised? Modelling Security Assessment Games, by Viet Pham and Carlos Cid

  Security assessments are an integral part of organisations' strategies for protecting their digital assets and critical IT infrastructure. In this paper we propose a game-theoretic modelling of a particular form of security assessment -- one which addresses the question "are we compromised?". We do so by extending the recently proposed game "FlipIt", which itself can be used to model the interaction between defenders and attackers under the Advanced Persistent Threat (APT) scenario.

Our extension gives players the option to "test" the state of the game before making a move. This allows one to study the scenario in which organisations have the option to perform periodic security assessments of this nature, and the benefits such assessments may bring.
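The effect of adding a "test" move can be illustrated with a toy simulation of a FlipIt-style game. All parameters (attack rate, defence period) are hypothetical and the model is far simpler than the paper's game, but it shows why a cheap state test can spare the defender wasted moves.

```python
import random


def play(test_before_move, steps=10_000, attack_prob=0.01,
         defend_every=25, seed=42):
    # Toy FlipIt-style game: one resource, the attacker seizes it at
    # random times, the defender acts every `defend_every` steps. With
    # test_before_move=True the defender first checks who controls the
    # resource and only "flips" (reclaims it) when actually compromised.
    rng = random.Random(seed)
    attacker_controls = False
    flips = compromised_time = 0
    for t in range(steps):
        if rng.random() < attack_prob:
            attacker_controls = True
        if t % defend_every == 0:
            if not test_before_move or attacker_controls:
                attacker_controls = False  # defender flips the resource
                flips += 1
        if attacker_controls:
            compromised_time += 1
    return flips, compromised_time


blind_flips, blind_bad = play(test_before_move=False)
test_flips, test_bad = play(test_before_move=True)
```

With the same seed the attack times are identical in both runs, so the testing defender suffers exactly the same exposure time while making strictly fewer flip moves. Whether testing wins overall depends on the relative cost of a test versus a flip, which is precisely what the game-theoretic analysis is designed to quantify.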

15:17 [Pub][ePrint] Hierarchical Identity-Based (Lossy) Trapdoor Functions, by Alex Escala and Javier Herranz and Benoit Libert and Carla Rafols

  Lossy trapdoor functions, introduced by Peikert and Waters (STOC'08), have received a lot of attention in recent years because of their wide range of applications in theoretical cryptography. The notion was recently extended to the identity-based setting by Bellare et al. (Eurocrypt'12). We take one more step in this direction by considering the notion of hierarchical identity-based (lossy) trapdoor functions (HIB-TDFs). Hierarchical identity-based cryptography has proved very useful both for practical applications and for establishing theoretical relations with other cryptographic primitives.

The notion of security for IB-TDFs put forward by Bellare et al. easily extends to the hierarchical scenario, but an (H)IB-TDF secure in this sense is not known to generically imply other related primitives with security against adaptive-id adversaries, not even IND-ID-CPA secure encryption. Our first contribution is to define a new security property for (H)IB-TDFs. We show that functions satisfying this property imply secure cryptographic primitives in the adaptive identity-based setting: these include encryption schemes with semantic security under chosen-plaintext attacks, deterministic encryption schemes, and (non-adaptive) hedged encryption schemes that maintain some security when messages are encrypted using randomness of poor quality. We emphasize that some of these primitives were unrealized in the (H)IB setting prior to this work.

As a second contribution, we describe the first pairing-based HIB-TDF realization. This is also the first example of a hierarchical trapdoor function based on traditional number-theoretic assumptions: so far, constructions were known only under lattice assumptions. Our HIB-TDF construction is based on techniques that differ from those of Bellare et al. in that it uses a hierarchical predicate encryption scheme as a key ingredient. The resulting HIB-TDF is proved to satisfy the new security definition against either selective adversaries or, for hierarchies of constant depth, adaptive adversaries. In either case, we only need the underlying predicate encryption system to be selectively secure.

15:17 [Pub][ePrint] Scalable Deniable Group Key Establishment, by Kashi Neupane and Rainer Steinwandt and Adriana Suarez Corona

  The popular Katz-Yung compiler from CRYPTO 2003 can be used to transform unauthenticated group key establishment protocols into authenticated ones. In this paper we present a modification of Katz and Yung's construction which maintains the round complexity of their compiler, but for "typical" unauthenticated group key establishments adds authentication in such a way that deniability is achieved as well. As an application, a deniable authenticated group key establishment with three rounds of communication can be constructed.

15:17 [Pub][ePrint] On pseudorandomization of information-theoretically secure schemes without hardness assumptions, by Koji Nuida

  A recent work by Nuida and Hanaoka (ICITS 2009) provided a proof technique for the security of information-theoretically secure cryptographic schemes in which the random input tape is implemented by a pseudorandom generator (PRG). In this paper, we revisit their proof technique and generalize it by introducing a trade-off factor, which includes the original proof technique as a special case and leaves room for improving the preceding result. Secondly, we consider two issues with the preceding result: one is the requirement of a hardness assumption in their proof; the other is the gap between non-uniform and uniform computational models that appears when transferring from the exact security formulation adopted in the preceding result to the usual asymptotic security. We point out that both issues can be resolved by using a PRG proposed by Impagliazzo, Nisan and Wigderson (STOC 1994) against memory-bounded distinguishers, instead of the usual PRGs against time-bounded distinguishers. We also give a precise formulation of a computational model described by Impagliazzo et al., and using it, perform a numerical comparison showing that, despite the significant advantage of removing hardness assumptions, our result is still better than, or at least competitive with, the preceding result from a quantitative viewpoint. The results of this paper suggest a new motivation to use PRGs against distinguishers with computational constraints other than time complexity in practical situations, rather than just in theoretical works.

15:17 [Pub][ePrint] Succinct Malleable NIZKs and an Application to Compact Shuffles, by Melissa Chase and Markulf Kohlweiss and Anna Lysyanskaya and Sarah Meiklejohn

  Depending on the application, malleability in cryptography can be viewed as either a flaw or -- especially if sufficiently understood and restricted -- a feature. In this vein, Chase, Kohlweiss, Lysyanskaya, and Meiklejohn recently defined malleable zero-knowledge proofs, and showed how to control the set of allowable transformations on proofs. As an application, they construct the first compact verifiable shuffle, in which one such controlled-malleable proof suffices to prove the correctness of an entire multi-step shuffle.

Despite these initial steps, a number of natural open problems remain: (1) their construction of controlled-malleable proofs relies on the inherent malleability of Groth-Sahai proofs and is thus not based on generic primitives; (2) the classes of allowable transformations they can support are somewhat restrictive; and (3) their construction of a compactly verifiable shuffle has proof size O(N^2 + L) (where N is the number of votes and L is the number of mix authorities), whereas in theory such a proof could be of size O(N + L).

In this paper, we address these open problems by providing a generic construction of controlled-malleable proofs using succinct non-interactive arguments of knowledge, or SNARGs for short. Our construction has the advantage that we can support a very general class of transformations (as we no longer rely on the transformations that Groth-Sahai proofs can support), and that we can use it to obtain a proof of size O(N + L) for the compactly verifiable shuffle.