International Association for Cryptologic Research

# IACR News Central

You can also access the full news archive.

Further sources to find out about changes are CryptoDB, ePrint RSS, ePrint Web, and the event calendar (iCal).

2013-01-30
01:17 [Pub][ePrint]

A protocol for NFC mobile authentication and transactions was recently proposed by W. Chen et al. The protocol is used for micropayments, where the Mobile Network Operator (MNO) pays for its customers, and its main advantage is compatibility with the existing GSM network. This paper suggests improvements to the protocol from a security point of view: since it is used for monetary transactions, it should be as secure as possible. We present an improved version of the protocol with a detailed analysis at the end. The user's interaction with the system is improved, making it more user-friendly. An additional layer of security is added by introducing PIN authentication by the user. Mutual authentication is strengthened by having the mobile device contribute freshness in order to resist replay attacks. We also attach digital signatures to the transaction messages for data integrity and non-repudiation.
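The freshness and PIN additions described above can be sketched as a generic challenge-response exchange. This is a minimal illustration, not the paper's actual protocol: the shared key, message formats, and the use of HMAC-SHA256 are all assumptions made here for readability.

```python
import hashlib
import hmac
import secrets

def mac(key: bytes, *parts: bytes) -> bytes:
    """HMAC-SHA256 over the concatenated parts."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

# Hypothetical shared long-term key between the mobile device and the MNO.
k = secrets.token_bytes(32)

# 1. The device contributes a fresh nonce, so replayed server messages
#    from an earlier session are rejected.
n_dev = secrets.token_bytes(16)

# 2. The server answers with its own nonce and a MAC binding both nonces.
n_srv = secrets.token_bytes(16)
tag_srv = mac(k, b"server", n_dev, n_srv)

# 3. The device verifies the server tag, then authenticates in the other
#    direction; the user-entered PIN is folded into the MAC input and is
#    never sent in the clear.
assert hmac.compare_digest(tag_srv, mac(k, b"server", n_dev, n_srv))
pin = b"1234"
tag_dev = mac(k, b"device", n_srv, n_dev, hashlib.sha256(pin).digest())

# 4. The server recomputes and checks the device tag.
assert hmac.compare_digest(
    tag_dev, mac(k, b"device", n_srv, n_dev, hashlib.sha256(pin).digest()))
print("mutual authentication succeeded")
```

Because each side's tag covers the other side's fresh nonce, a recorded transcript cannot be replayed in a later session.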

01:17 [Pub][ePrint]

We extend the fully homomorphic encryption scheme over the integers of van Dijk et al. (DGHV) to batch fully homomorphic encryption, i.e. to a scheme that supports encrypting and homomorphically processing a vector of plaintext bits as a single ciphertext. Our variant remains semantically secure under the (error-free) approximate GCD problem. We also show how to perform arbitrary permutations on the underlying plaintext vector given the ciphertext and the public key. Our scheme offers competitive performance: we describe an implementation of the fully homomorphic evaluation of AES encryption, with an amortized cost of about 12 minutes per AES ciphertext on a standard desktop computer; this is comparable to the timings presented by Gentry et al. at Crypto 2012 for their implementation of a Ring-LWE based fully homomorphic encryption scheme.
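The batching idea, packing one plaintext bit per secret prime and combining them into one integer via the CRT, can be illustrated with a deliberately insecure toy. This sketch omits the public key and the large random multiples of the secret moduli that the real DGHV scheme uses; it shows only why addition and multiplication of the packed integer act slot-wise as XOR and AND.

```python
import secrets
from math import prod

# Toy, insecure parameters purely to illustrate the batching idea:
# each plaintext bit m_i lives modulo its own secret prime p_i, and one
# integer ciphertext carries the whole vector via the CRT.
primes = [10007, 10009, 10037]   # secret "keys", one slot per prime
P = prod(primes)

def crt(residues):
    """Solve x ≡ residues[i] (mod primes[i]) with the CRT."""
    x = 0
    for r, p in zip(residues, primes):
        n = P // p
        x += r * n * pow(n, -1, p)
    return x % P

def encrypt(bits):
    # Slot i holds m_i + 2*r_i mod p_i, with small random noise r_i;
    # the noise is even, so parity mod p_i recovers the bit.
    return crt([(m + 2 * secrets.randbelow(10)) % p
                for m, p in zip(bits, primes)])

def decrypt(c):
    return [(c % p) % 2 for p in primes]

a, b = encrypt([1, 0, 1]), encrypt([1, 1, 0])
print(decrypt(a + b))  # slot-wise XOR -> [0, 1, 1]
print(decrypt(a * b))  # slot-wise AND -> [1, 0, 0]
```

As in the somewhat-homomorphic DGHV scheme, the noise grows with each operation, so only limited-depth circuits work before bootstrapping would be needed.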

01:17 [Pub][ePrint]

This article proposes a provably secure aggregate signcryption scheme in the random oracle model. Security of the scheme is based on the computational infeasibility of solving the Decisional Bilinear Diffie-Hellman Problem and the Discrete Logarithm Problem. Confidentiality and authenticity are two fundamental security requirements of public-key cryptography, achieved by encryption schemes and digital signatures respectively. Signcryption is a cryptographic primitive that performs signature and encryption simultaneously in a single logical step. An aggregate signcryption scheme is constructed by aggregating individual signcryptions: the aggregation takes n distinct signcryptions on n messages signed by n distinct users.

01:17 [Pub][ePrint]

In a {verifiable data streaming} protocol, the client streams a long string to the server, who stores it in its database. The stream is verifiable in the sense that the server can neither change the order of the elements nor manipulate them. The client may also retrieve data from the database and update them. The content of the database is publicly verifiable, such that any party in possession of some value $s$ and a proof $\pi$ can check that $s$ is indeed in the database.

We introduce the notion of verifiable data streaming and present an efficient instantiation that supports an exponential number of values based on general assumptions. Our main technique is an authentication tree in which the leaves are not fixed in advance, such that the user, knowing some trapdoor, can authenticate a new element on demand \emph{without} pre- or re-computing all other leaves. We call this data structure a \emph{chameleon authentication tree} (CAT). We instantiate our scheme with primitives that are secure under the discrete logarithm assumption. The algebraic properties of this assumption allow us to obtain a very efficient verification algorithm.

As a second application of CATs, we present a new transformation from any one-time signature scheme to a many-time signature scheme that is more efficient than previously known solutions.
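A chameleon hash under the discrete logarithm assumption is the kind of trapdoor primitive such a tree can build on: the trapdoor holder can open a hash to a new message, which is what allows leaves that were not fixed in advance. The following toy sketch uses deliberately tiny, insecure parameters chosen only for readability; it is an illustration of the primitive, not the paper's construction.

```python
# Toy chameleon hash in a prime-order subgroup (insecure toy parameters).
# H(m, r) = g^m * h^r mod p with h = g^x; knowing the trapdoor x lets its
# holder find a collision opening the hash to any new message.
p, q, g = 23, 11, 2          # g generates the subgroup of order q=11 in Z_23^*
x = 7                        # trapdoor (kept secret)
h = pow(g, x, p)

def chash(m, r):
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def collide(m, r, m_new):
    """Given the trapdoor, find r_new with chash(m_new, r_new) == chash(m, r)."""
    # Solve m + x*r ≡ m_new + x*r_new (mod q) for r_new.
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 3, 5
m_new = 9
r_new = collide(m, r, m_new)
print(chash(m, r) == chash(m_new, r_new))  # True
```

Without the trapdoor, finding such a collision would require computing a discrete logarithm, so to everyone else the hash is binding.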

01:17 [Pub][ePrint]

Lattice-based cryptography is one of the candidates in the area of post-quantum cryptography. Cryptographic schemes with security reductions to hard lattice problems (like the Shortest Vector Problem, SVP) offer an alternative to recent number-theory-based schemes. In order to guarantee asymptotic efficiency, most lattice-based schemes are instantiated using polynomial rings over the integers; these lattices are called 'ideal lattices'. It is assumed that the hardness of lattice problems in lattices over integer rings remains the same as in regular lattices. In order to prove or disprove this assumption, we instantiate random ideal lattices that allow testing of algorithms that solve SVP and its approximate version. The Ideal Lattice Challenge allows online submission of short vectors to enter a hall of fame for full comparison. We include a first set of experiments and a first comparison of ideal and regular lattices.
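For intuition on why ideal lattices are so compact: in the simplest ring $\mathbb{Z}[x]/(x^n - 1)$, an ideal lattice is described by a single generator whose cyclic rotations span the whole lattice. The helper below illustrates this structure only; it is not the challenge's actual instance generator, which uses other quotient polynomials as well.

```python
# Minimal sketch: in R = Z[x]/(x^n - 1), the ideal (g) corresponds to the
# lattice spanned by the n cyclic rotations of g's coefficient vector, so a
# single vector determines an entire n x n basis.
def rotation_basis(g):
    n = len(g)
    # Row i is x^i * g(x) mod (x^n - 1): a cyclic right-shift by i.
    return [[g[(j - i) % n] for j in range(n)] for i in range(n)]

for row in rotation_basis([3, 1, 4, 1]):
    print(row)
# [3, 1, 4, 1]
# [1, 3, 1, 4]
# [4, 1, 3, 1]
# [1, 4, 1, 3]
```

This extra rotational symmetry is exactly what the challenge probes: does it make SVP easier than in a random lattice of the same dimension?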

01:17 [Pub][ePrint]

Recently, a few CCA2-secure (IND-CCA2) variants of the McEliece cryptosystem in the standard model were introduced. All these schemes are based on the Rosen-Segev approach and lossy trapdoor functions, and utilize the k-repetition paradigm. The main drawback of these schemes is that they need additional encryptions and have large key sizes compared to the original scheme, which exacerbates the public-key size problem of code-based cryptosystems. Furthermore, full CCA2-security of these schemes is achieved by using a strongly unforgeable one-time signature scheme, so the resulting schemes need a separate encryption. Therefore, the proposed schemes are not efficient.

In this manuscript, we propose a new and efficient IND-CCA2 variant of the McEliece cryptosystem in the standard model. The main novelty is that, unlike previous approaches, our approach is a generic transformation and can be applied to any code-based one-way cryptosystem (both the McEliece and the Niederreiter cryptosystems). Our approach also eliminates the repetition of encryption and the use of a strongly unforgeable one-time signature scheme. This novel approach is more efficient: the public/secret keys are as in the original scheme, and the encryption/decryption complexity is comparable to the original scheme. CCA2-security of the proposed scheme can be reduced in the standard model to the McEliece assumptions. To the best of our knowledge, this is the first variant of a code-based cryptosystem that is IND-CCA2 in the standard model without using the k-repetition paradigm or a strongly unforgeable one-time signature scheme.

01:17 [Pub][ePrint]

Efficient computation of $r$-th roots in $\mathbb{F}_q$ has many applications in computational number theory and many other related areas. We present a new $r$-th root formula which generalizes M\"{u}ller's result on square roots, and which provides a possible improvement of the Cipolla-Lehmer algorithm for the general case. More precisely, for a given $r$-th power $c \in \mathbb{F}_q$, we show that there exists $\alpha \in \mathbb{F}_{q^r}$ such that $Tr\left(\alpha^{\frac{(\sum_{i=0}^{r-1}q^i)-r}{r^2}}\right)^r=c$, where $Tr(\alpha)=\alpha+\alpha^q+\alpha^{q^2}+\cdots+\alpha^{q^{r-1}}$ and $\alpha$ is a root of a certain irreducible polynomial of degree $r$ over $\mathbb{F}_q$.
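For the quadratic case ($r=2$) that this formula generalizes, the classical Cipolla algorithm can be implemented directly: it works in $\mathbb{F}_{p^2} = \mathbb{F}_p[w]/(w^2 - d)$ for a non-residue $d$, and the $(p+1)/2$-th power of $a + w$ lands back in $\mathbb{F}_p$. This is a sketch of the well-known classical algorithm, not the paper's generalization.

```python
# Cipolla's square-root algorithm modulo an odd prime p.
def cipolla_sqrt(c, p):
    """Square root of a quadratic residue c modulo an odd prime p."""
    if pow(c, (p - 1) // 2, p) != 1:
        raise ValueError("c is not a quadratic residue mod p")
    # Find a such that a^2 - c is a non-residue, making x^2 - (a^2 - c)
    # irreducible over F_p.
    a = 0
    while pow(a * a - c, (p - 1) // 2, p) != p - 1:
        a += 1
    d = (a * a - c) % p

    def mul(u, v):
        # (u0 + u1*w)(v0 + v1*w) in F_p[w]/(w^2 - d)
        return ((u[0] * v[0] + u[1] * v[1] * d) % p,
                (u[0] * v[1] + u[1] * v[0]) % p)

    # Square-and-multiply: (a + w)^((p+1)/2) lies in F_p and squares to c.
    res, base, e = (1, 0), (a, 1), (p + 1) // 2
    while e:
        if e & 1:
            res = mul(res, base)
        base = mul(base, base)
        e >>= 1
    return res[0]

r = cipolla_sqrt(10, 13)
print(r, (r * r) % 13)  # r^2 ≡ 10 (mod 13)
```

The cost is dominated by one exponentiation in $\mathbb{F}_{p^2}$; the article's trace formula plays the analogous role for general $r$.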

01:17 [Pub][ePrint]

The central objects of secure multiparty computation are the "multiparty functions" (or functionalities) that it seeks to securely realize. In this article we survey a set of results that constitute a Cryptographic Complexity Theory. This theory classifies and compares multiparty functions according to their secure computability and reducibility to each other. The basic questions studied, under various notions of security and reducibility, include:

o Which functionalities are securely realizable (or are "trivial", i.e., can be reduced to any functionality)?

o Which functionalities are "complete", i.e., those to which any functionality can be reduced?

o More generally, which functionalities are reducible to which? Outside of triviality and completeness, this question is relatively unexplored.

Reductions yield relative measures of complexity among various functionalities. In the information-theoretic setting, absolute complexity measures have also been considered. In particular, we discuss results regarding which functions have t-private protocols (in which security is required against a passive adversary corrupting t out of n players) and how this set changes as t increases from 1 to n.

We treat separately the results on two-party functionalities, for which the cryptographic complexity is much better understood. In particular, we present unified combinatorial characterizations of completeness and triviality for secure function evaluation using notions of isomorphism and the common information functionality (called the kernel) of a given functionality. Beyond completeness and triviality, we also discuss results on general reducibility, and, in the computationally bounded setting, the connection between these reductions and computational hardness assumptions.

We briefly discuss results on reactive functionalities, which are much less studied than non-reactive (secure function evaluation) functionalities. Finally, we conclude with a selection of open problems.

01:17 [Pub][ePrint]

PRINCE is a new lightweight block cipher proposed at the ASIACRYPT 2012 conference. In this paper, two observations on the linear layer of the cipher are presented. Based on these observations, a differential fault attack is applied to the cipher under a random nibble-level fault model. The attack uniquely determines the 128-bit key of the cipher using fewer than 7 fault injections on average. With 4 fault injections, the attack statistically limits the key to a space of size less than $2^{18}$.

2013-01-24
22:17 [Pub][ePrint]

In this paper, we present a new cube root algorithm in the finite field $\mathbb{F}_{q}$ with $q$ a power of a prime, which extends the Cipolla-Lehmer type algorithms \cite{Cip,Leh}. Our cube root method is inspired by the work of M\"{u}ller \cite{Muller} on the quadratic case. For a given cubic residue $c \in \mathbb{F}_{q}$ with $q \equiv 1 \pmod{9}$, we show that there is an irreducible polynomial $f(x)=x^{3}-ax^{2}+bx-1$ with root $\alpha \in \mathbb{F}_{q^{3}}$ such that $Tr(\alpha^{\frac{q^{2}+q-2}{9}})$ is a cube root of $c$. Consequently, we obtain an efficient cube root algorithm based on the third-order linear recurrence sequence arising from $f(x)$. Our complexity estimates show that the algorithm is better than previously proposed Cipolla-Lehmer type algorithms.
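The third-order recurrence underlying such an algorithm follows from Newton's identities for $f$: the power sums $s_k = Tr(\alpha^k)$ satisfy $s_k = a\,s_{k-1} - b\,s_{k-2} + s_{k-3}$ with $s_0 = 3$, $s_1 = a$, $s_2 = a^2 - 2b$. The sketch below checks this against explicit power sums for a hypothetical small example, not the paper's construction of $f$ from $c$.

```python
def trace_power(a, b, n, p):
    """Tr(alpha^n) mod p for a root alpha of f(x) = x^3 - a*x^2 + b*x - 1.
    Uses the recurrence s_k = a*s_{k-1} - b*s_{k-2} + s_{k-3} (Newton's
    identities for f), with s_0 = 3, s_1 = a, s_2 = a^2 - 2b."""
    s = [3 % p, a % p, (a * a - 2 * b) % p]
    if n < 3:
        return s[n]
    for _ in range(3, n + 1):
        s = [s[1], s[2], (a * s[2] - b * s[1] + s[0]) % p]
    return s[2]

# Hypothetical example over F_19: f(x) = x^3 - 2x^2 + 10x - 1 splits with
# roots {2, 3, 16}, so Tr(alpha^n) equals the sum of the n-th powers of roots.
for n in range(12):
    assert trace_power(2, 10, n, 19) == sum(pow(r, n, 19) for r in (2, 3, 16)) % 19
print("recurrence matches explicit power sums")
```

In an actual implementation the sequence would be evaluated at a large index via a doubling rule in $O(\log n)$ steps rather than this linear loop; that is what makes the trace-based cube root competitive.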

22:17 [Pub][ePrint]

The universal composability paradigm allows for the modular design and analysis of cryptographic protocols. It has been widely and successfully used in cryptography. However, devising a coherent yet simple and expressive model for universal composability is, as the history of such models shows, highly non-trivial. For example, several problems, some of them severe, have been pointed out in the literature for the UC model.

In this work, we propose a coherent model for universal composability, called the IITM model ("Inexhaustible Interactive Turing Machine"). A main feature of the model is that it is stated without a priori fixing irrelevant details, such as a specific way of addressing machines by session and party identifiers, a specific modeling of corruption, or a specific protocol hierarchy. In addition, we employ a very general notion of runtime. All reasonable protocols and ideal functionalities should be expressible based on this notion in a direct and natural way, and without tweaks, such as (artificial) padding of messages or (artificially) adding extra messages.

Not least because of these features, the model is simple and expressive. Moreover, the general results that we prove, such as composition theorems, hold independently of how such details are fixed for concrete applications.

Because the IITM model is inspired by other models for universal composability, in particular the UC model, and because of its flexibility and expressivity, results formulated in those models conceptually carry over directly to the IITM model.