International Association for Cryptologic Research

IACR News Central

Get updates on changes to the IACR web page here. For questions, contact newsletter (at) iacr.org.

You can also access the full news archive.

Further sources for keeping track of changes are CryptoDB, ePrint RSS, ePrint Web, and the Event calendar (iCal).

2013-11-03
22:17 [Pub][ePrint]

This paper introduces a new notion of obfuscation: obfuscation of encrypted blind signatures. Informally, an encrypted blind signature keeps the message blind to the signer, and she cannot distinguish two encrypted signatures by herself. An obfuscation of encrypted blind signatures makes the encrypted-blind-signature process unintelligible to any third party, while preserving the original encrypted-blind-signature functionality. We use Schnorr's blind signature scheme and the linear encryption scheme as building blocks to construct a new obfuscator. Moreover, we propose two new security definitions, blindness w.r.t. an encrypted blind signature (EBS) obfuscator and one-more unforgeability (OMU) w.r.t. an EBS obfuscator, and prove them under the Decision Linear Diffie-Hellman (DL) assumption and the hardness of the discrete logarithm problem, respectively. We also demonstrate that our obfuscator satisfies the Average-Case Virtual Black-Box Property (ACVBP) w.r.t. dependent oracles, i.e., it is indistinguishability secure. Our paper opens a new direction for the application of obfuscation.
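
As background for the construction, the commit-challenge-response flow of Schnorr's blind signature scheme (one of the two building blocks named above) can be sketched as follows. The tiny group parameters and hash truncation here are illustrative assumptions only, not secure choices, and the encryption and obfuscation layers of the paper are omitted entirely:

```python
import hashlib
import secrets

p, q, g = 23, 11, 4   # toy Schnorr group (g has order 11 mod 23); NOT secure sizes

def H(R, m):
    """Hash-to-challenge, truncated into Z_q (illustrative only)."""
    return int.from_bytes(hashlib.sha256(str(R).encode() + m).digest(), "big") % q

x = secrets.randbelow(q)
y = pow(g, x, p)                    # signer's key pair (x, y)

k = secrets.randbelow(q)
R = pow(g, k, p)                    # signer's commitment, sent to the user

msg = b"message to be signed"
alpha, beta = secrets.randbelow(q), secrets.randbelow(q)
R_blind = R * pow(g, alpha, p) * pow(y, beta, p) % p   # user blinds the commitment
c_blind = H(R_blind, msg)
c = (c_blind + beta) % q            # blinded challenge, sent to the signer

s = (k + c * x) % q                 # signer answers without ever seeing msg

s_final = (s + alpha) % q           # user unblinds; signature is (c_blind, s_final)

# verification: recompute the commitment from the signature and rehash
R_check = pow(g, s_final, p) * pow(y, -c_blind, p) % p
ok = H(R_check, msg) == c_blind
```

The blinding factors alpha and beta ensure the signer's view (R, c, s) is statistically independent of the final signature (c_blind, s_final), which is the blindness property the paper's obfuscated variant must preserve under encryption.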

22:17 [Pub][ePrint]

Side-Channel Attacks (SCA) are considered a serious threat against embedded cryptography. Therefore, security-critical chips must be tested for SCA resistance before deployment or certification. SCA are powerful, but can require a lot of computational power, especially in the presence of countermeasures. The complexity of these attacks can be reduced by selecting a small subset of points where leakage prevails. In this paper, we propose a method to detect relevant leakage points in side-channel traces. The method is based on the Normalized Inter-Class Variance (NICV). A key advantage of NICV over the state of the art is that it needs neither a clone device nor knowledge of secret parameters of the crypto-system. NICV has low computational requirements and detects leakage using only public information such as input plaintexts or output ciphertexts. It can also be used to test the efficiency of leakage models, the quality of traces, and the robustness of countermeasures. A theoretical rationale for NICV together with practical applications on real crypto-systems is provided to support our claims.
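
At each trace sample, NICV is the ratio Var(E[L|X]) / Var(L), computed from a partition of the traces by a public value X (e.g. a plaintext byte). A minimal sketch with NumPy, assuming traces arrive as a (number of traces × samples per trace) array:

```python
import numpy as np

def nicv(traces, labels):
    """Normalized Inter-Class Variance per trace sample:
    Var_X(E[L|X]) / Var(L), over the classes of a public value X (labels)."""
    traces = np.asarray(traces, dtype=float)
    labels = np.asarray(labels)
    classes = np.unique(labels)
    grand_mean = traces.mean(axis=0)
    weights = np.array([(labels == c).mean() for c in classes])
    class_means = np.array([traces[labels == c].mean(axis=0) for c in classes])
    inter = (weights[:, None] * (class_means - grand_mean) ** 2).sum(axis=0)
    return inter / traces.var(axis=0)

# synthetic demo: 2000 noise traces of 10 samples; only sample 5 leaks the label
rng = np.random.default_rng(0)
labels = rng.integers(0, 4, size=2000)
traces = rng.normal(size=(2000, 10))
traces[:, 5] += labels
scores = nicv(traces, labels)   # peaks at the leaky sample
```

By the law of total variance, NICV lies in [0, 1], so scores at different samples (and even different devices) are directly comparable, which is what makes it usable as a fast leakage-point detector.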

22:17 [Pub][ePrint]

Public key exchange protocols are an important application in the field of public-key cryptography. Most existing public key exchange schemes are Diffie-Hellman (DH)-type, whose security is based on DH problems over different groups. Since Shor's polynomial-time algorithm solves these DH problems once a quantum computer is available, we are motivated to seek a non-DH-type, quantum-resistant key exchange protocol. To this end, we turn our attention to lattice-based cryptography. The high-level methodology behind our roadmap is that, in analogy to the link between ElGamal, DSA, and DH, one should expect an NTRU lattice-based key exchange primitive related to NTRU-ENCRYPT and NTRU-SIGN. However, this expected key exchange protocol has not yet been presented. In this paper, we present this missing key exchange protocol, hereafter referred to as NTRU-KE, and study it with respect to security and key-mismatch failure. In comparison with ECDH (Elliptic Curve-based Diffie-Hellman), NTRU-KE features faster computation, resistance to quantum attacks, and higher communication overhead. Accordingly, we conclude that NTRU-KE is currently comparable with ECDH, but that NTRU-KE will gain a decisive advantage when quantum computers become a reality.
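
For contrast with NTRU-KE, the "DH-type" shape referred to above is the classic Diffie-Hellman exchange. A toy sketch over the multiplicative group modulo a prime (the prime below is chosen only for illustration and is not a vetted DH group; real deployments use standardized groups or elliptic curves):

```python
import secrets

# toy parameters: 2**255 - 19 is prime, but this is NOT a standardized DH group
p = 2**255 - 19
g = 2

def keypair():
    sk = secrets.randbelow(p - 2) + 1   # private exponent
    return sk, pow(g, sk, p)            # (secret, public) pair

a, A = keypair()            # Alice's key pair
b, B = keypair()            # Bob's key pair
shared_alice = pow(B, a, p) # Alice computes B^a = g^(ab)
shared_bob = pow(A, b, p)   # Bob computes A^b = g^(ab)
```

Both parties arrive at the same secret g^(ab) while only g^a and g^b cross the wire; it is exactly this discrete-log structure that Shor's algorithm breaks, and that NTRU-KE avoids.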

22:17 [Pub][ePrint]

The security of public-key encryption (PKE), a widely-used cryptographic primitive, has received much attention in the cryptographic literature. Many security notions for PKE have been proposed, including several versions of CPA-security, CCA-security, and non-malleability. These security notions are usually defined in terms of a certain game that an efficient adversary cannot win with non-negligible probability or advantage.

If a PKE scheme is used in a larger protocol, then the security of this protocol is proved by showing a reduction of breaking a certain security property of the PKE scheme to breaking the security of the protocol. A major problem is that each protocol requires in principle its own tailor-made security reduction. Moreover, which security notion of the PKE should be used in a given context is a priori not evident; the employed games model the use of the scheme abstractly through oracle access to its algorithms, and the sufficiency for specific applications is neither explicitly stated nor proven.

In this paper we propose a new approach to investigating the application of PKE, following the constructive cryptography paradigm of Maurer and Renner (ICS~2011). The basic use of PKE is to enable confidential communication from a sender A to a receiver B, assuming A is in possession of B's public key. One can distinguish two relevant cases: The (non-confidential) communication channel from A to B can be authenticated (e.g., because messages are signed) or non-authenticated. The application of PKE is shown to provide the construction of a secure channel from A to B from two (assumed) authenticated channels, one in each direction, or, alternatively, if the channel from A to B is completely insecure, the construction of a confidential channel without authenticity. Composition then means that the assumed channels can either be physically realized or can themselves be constructed cryptographically, and also that the resulting channels can directly be used in any applications that require such a channel. The composition theorem shows that several construction steps can be composed, which guarantees the soundness of this approach and eliminates the need for separate reduction proofs.

We also revisit several popular game-based security notions (and variants thereof) and give them a constructive semantics by demonstrating which type of construction is achieved by a PKE scheme satisfying which notion. In particular, the necessary and sufficient security notions for the above two constructions to work are CPA-security and a variant of CCA-security, respectively.

19:17 [Pub][ePrint]

Extractability, or "knowledge," assumptions (such as the "knowledge-of-exponent" assumption) have recently gained popularity in the cryptographic community, leading to the study of primitives such as extractable one-way functions, extractable hash functions, succinct non-interactive arguments of knowledge (SNARKs), and extractable obfuscation, and spurring the development of a wide spectrum of new applications relying on these primitives. For most of these applications, it is required that the extractability assumption holds even in the presence of attackers receiving some auxiliary information that is sampled from some fixed efficiently computable distribution Z.

We show that, assuming the existence of collision-resistant hash functions, there exists a pair of efficient distributions Z, Z' such that either:

o extractable one-way functions w.r.t. Z do not exist, or

o extractability obfuscations for Turing machines w.r.t. Z do not exist.

A corollary of this result shows that, assuming the existence of fully homomorphic encryption with decryption in NC1, there exist efficient distributions Z, Z' such that either

o extractability obfuscations for NC1 w.r.t. Z do not exist, or

o SNARKs for NP w.r.t. Z' do not exist.

To achieve our results, we develop a "succinct punctured program" technique, mirroring the powerful "punctured program" technique of Sahai and Waters (ePrint '13), and present several other applications of this new technique.

19:17 [Pub][ePrint]

This paper defines adaptive soundness (AS) security for witness encryption and applies it to provide the first non-invasive schemes for asymmetric password-based encryption (A-PBE). A-PBE offers significant gains over classical, symmetric password-based encryption (S-PBE) in the face of attacks that compromise servers to recover hashed passwords. We also show by counter-example that the original soundness security (SS) requirement of GGSW does not suffice for the security of their own applications, and show that AS fills the gap.

19:17 [Pub][ePrint]

We describe a method to perform scalar multiplication on two classes of ordinary elliptic curves, namely $E: y^2 = x^3 + Ax$ in prime characteristic $p \equiv 1 \pmod 4$, and $E: y^2 = x^3 + B$ in prime characteristic $p \equiv 1 \pmod 3$. On these curves, the 4th and 6th roots of unity act as (computationally efficient) endomorphisms. In order to optimise the scalar multiplication, we consider a width-$w$ NAF (non-adjacent form) digit expansion of positive integers to the complex base $\tau$, where $\tau$ is a zero of the characteristic polynomial $x^2 - tx + p$ of the Frobenius endomorphism associated to the curve. We provide a precomputation-free algorithm by means of a convenient factorisation of the unit group of residue classes modulo $\tau$ in the endomorphism ring, whereby we construct a digit set consisting of powers of subgroup generators, which are chosen as efficient endomorphisms of the curve.

19:17 [Pub][ePrint]

Evaluating side-channel attacks and countermeasures requires determining the amount of information leaked by a target device. For this purpose, information extraction procedures published so far essentially combine a "leakage model" with a "distinguisher". Fair evaluations ideally require exploiting a perfect leakage model (i.e. one exactly corresponding to the true leakage distribution) with a Bayesian distinguisher. But since such perfect models are generally unknown, density estimation techniques have to be used to approximate the leakage distribution. This raises the fundamental problem that all security evaluations are potentially biased by both estimation and assumption errors. Hence, the best we can hope for is to be aware of these errors. In this paper, we provide and implement methodological tools to solve this issue. Namely, we show how sound statistical techniques allow both quantifying the leakage of a chip and certifying that the amount of information extracted is close to the maximum value that would be obtained with a perfect model.

19:17 [Pub][ePrint]

We present a polynomial-time reduction of the discrete logarithm problem in any periodic (a.k.a. torsion) semigroup (SGDLP) to the same problem in a subgroup of the same semigroup. It follows that SGDLP can be solved in polynomial time by quantum computers, and that SGDLP has subexponential algorithms whenever the classic DLP in the corresponding groups has subexponential algorithms.
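
The group DLP that SGDLP reduces to can itself be solved generically in O(sqrt(n)) time, e.g. by baby-step giant-step, sketched here for the multiplicative group modulo a prime (toy parameters for illustration):

```python
from math import isqrt

def bsgs(g, h, p, n):
    """Baby-step giant-step: return x in [0, n) with g^x = h (mod p),
    where n bounds the order of g; O(sqrt(n)) time and memory."""
    m = isqrt(n) + 1
    baby = {}
    e = 1
    for j in range(m):               # baby steps: store g^j -> j
        baby.setdefault(e, j)
        e = e * g % p
    giant = pow(g, -m, p)            # g^(-m) mod p (Python 3.8+ modular inverse)
    gamma = h
    for i in range(m):               # giant steps: check h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = gamma * giant % p
    return None                      # no solution: h is outside <g>

x = bsgs(5, pow(5, 3000, 10007), 10007, 10006)
```

Writing the unknown exponent as x = i*m + j splits the search into two sqrt-size halves, which is the generic square-root barrier the quantum and subexponential algorithms mentioned above improve upon.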

19:17 [Pub][ePrint]

We revisit the classical problem of converting an imperfect source of randomness into a usable cryptographic key. Assume that we have some cryptographic application $P$ that expects a uniformly random $m$-bit key $R$ and ensures that the best attack (in some complexity class) against $P(R)$ has success probability at most $\delta$. Our goal is to design a key-derivation function (KDF) $h$ that converts any random source $X$ of min-entropy $k$ into a "sufficiently good" key $h(X)$, guaranteeing that $P(h(X))$ has comparable security $\delta'$ which is "close" to $\delta$.

Seeded randomness extractors provide a generic way to solve this problem for \emph{all} applications $P$, with resulting security $\delta' = O(\delta)$, provided that we start with entropy $k \ge m + 2\log(1/\delta) - O(1)$. By a result of Radhakrishnan and Ta-Shma, this bound on $k$ (called the "RT-bound") is also known to be tight in general. Unfortunately, in many situations the loss of $2\log(1/\delta)$ bits of entropy is unacceptable. This motivates the study of KDFs with less entropy waste, obtained by placing some restrictions on the source $X$ or the application $P$.

In this work we obtain the following new positive and negative results in this regard:

- Efficient samplability of the source $X$ does not help beat the RT-bound for general applications. This resolves the SRT (samplable RT) conjecture of Dachman-Soled et al. \cite{DGKM12} in the affirmative, and also shows that the existence of computationally-secure extractors beating the RT-bound implies the existence of one-way functions.

- We continue the line of work initiated by Barak et al. \cite{BDK+11} and construct new information-theoretic KDFs which beat the RT-bound for large but restricted classes of applications. Specifically, we design efficient KDFs that work for all unpredictability applications $P$ (e.g., signatures, MACs, one-way functions, etc.) and can either: (1) extract \emph{all} of the entropy $k = m$ with a very modest security loss $\delta' = O(\delta \cdot \log(1/\delta))$, or, alternatively, (2) achieve essentially optimal security $\delta' = O(\delta)$ with a very modest entropy loss $k \ge m + \log\log(1/\delta)$. In comparison, the best prior results from \cite{BDK+11} for this class of applications would only guarantee $\delta' = O(\sqrt{\delta})$ when $k = m$, and would need $k \ge m + \log(1/\delta)$ to get $\delta' = O(\delta)$.

- The weaker bounds of \cite{BDK+11} hold for a larger class of so-called "square-friendly" applications (which includes all unpredictability applications, but also some important indistinguishability applications). Unfortunately, we show that these weaker bounds are tight for this larger class of applications.

- We abstract out a clean, information-theoretic notion of $(k,\delta,\delta')$-unpredictability extractors, which guarantee "induced" security $\delta'$ for any $\delta$-secure unpredictability application $P$, and characterize the parameters achievable for such unpredictability extractors. Of independent interest, we also relate this notion to the previously-known notion of (min-entropy) condensers, and improve the state-of-the-art parameters for such condensers.
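
A seeded extractor of the kind discussed above can be instantiated with a 2-universal hash family; by the leftover hash lemma, the $m$-bit output is $\epsilon$-close to uniform once the source has min-entropy $k \ge m + 2\log(1/\epsilon)$, which is exactly the RT-style entropy loss at issue. A sketch (the Mersenne-prime family below is one standard illustrative choice, not a construction from this paper):

```python
import secrets

# 2-universal family h_{a,b}(x) = ((a*x + b) mod P) mod 2**m over a Mersenne prime.
# Leftover hash lemma: for a source X of min-entropy k, the m-bit output is
# eps-close to uniform whenever k >= m + 2*log2(1/eps).
P = (1 << 61) - 1   # Mersenne prime 2^61 - 1

def sample_seed():
    """Public uniformly random seed selecting one hash from the family."""
    return secrets.randbelow(P - 1) + 1, secrets.randbelow(P)

def extract(x, seed, m):
    """Derive an m-bit key from a raw source sample x (an int < P)."""
    a, b = seed
    return ((a * x + b) % P) % (1 << m)

seed = sample_seed()
key = extract(123456789, seed, 32)   # 32-bit derived key
```

The seed is public, so the same seeded hash can serve every application $P$; the paper's unpredictability extractors trade this generality for a far smaller entropy loss on unpredictability applications.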

19:17 [Pub][ePrint]

We demonstrate how to construct statistical zero-knowledge authentication protocols for smart cards based on general assumptions. The main protocol is only secure against active attacks, but we present a modification based on trapdoor commitments which can resist concurrent attacks as well. Both protocols are instantiated using lattice-based constructs, which are secure against quantum attacks. We illustrate the practicality of our protocols on smart cards in terms of storage, computation, communication, and round complexity, and show that our protocols are superior to known lattice-based zero-knowledge protocols in these respects.
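
The commit-challenge-response shape of such identification protocols is classically illustrated by Schnorr's discrete-log-based scheme; the protocols above are lattice-based, so this is only a structural analogue, sketched with toy parameters:

```python
import secrets

# toy Schnorr group: q = 11 is prime, g = 4 has order q mod p = 23; NOT secure sizes
p, q, g = 23, 11, 4

sk = secrets.randbelow(q)       # prover's long-term secret
pk = pow(g, sk, p)              # public key registered with the verifier

# one commit-challenge-response run
r = secrets.randbelow(q)             # prover's ephemeral randomness
commitment = pow(g, r, p)            # prover -> verifier
challenge = secrets.randbelow(q)     # verifier -> prover
response = (r + challenge * sk) % q  # prover -> verifier
accepted = pow(g, response, p) == commitment * pow(pk, challenge, p) % p
```

The transcript reveals nothing about sk because (commitment, challenge, response) can be simulated without it; lattice-based analogues replace the exponentiations with short-vector operations and typically add rejection sampling.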